How I Set Up PostHog Analytics Across Frontend and Backend
I needed analytics for my projects, but I wanted something that wouldn’t break the bank or require a PhD to set up. I tried a few options, but PostHog ended up being my go-to solution because it combines everything I need in one package—product analytics, session recordings, feature flags, and error tracking.
It has a generous free tier and you can self-host if you need to. After implementing it across multiple projects, here’s how I set up a production analytics system that works well.
My Architecture: Frontend and Backend Working Together
I capture events on both frontend and backend:
- Frontend — user interactions, page views, button clicks, form submissions, and client-side errors
- Backend — payment events, email campaigns, user lifecycle events, and server-side operations
I maintain consistent user identification across both, which gives me a complete picture from initial visit to long-term engagement.
I use Vue for my frontend and Fastify for my backend, but the principles apply to any frontend and backend stack.
Environment Setup
First, I set up my environment variables like this:
# Frontend (.env)
VITE_POSTHOG_TOKEN=phc_your_project_api_key
# Backend (.env)
POSTHOG_API_KEY=phc_your_project_api_key
POSTHOG_PERSONAL_API_KEY=phx_your_personal_api_key
The personal API key (the phx_ one, POSTHOG_PERSONAL_API_KEY) is needed for backend data queries and user management operations; the project key (phc_) only lets you capture events.
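Since the rest of the analytics code degrades gracefully when keys are missing, I find it helpful to check them once at startup. A minimal sketch; the helper name and the warning convention are mine, not part of PostHog:

```typescript
// Report which analytics keys are missing so the app can log one clear
// warning at boot instead of failing silently deep inside a capture call.
export function missingAnalyticsKeys(
  env: Record<string, string | undefined>,
): string[] {
  const required = ["POSTHOG_API_KEY", "POSTHOG_PERSONAL_API_KEY"]
  return required.filter((key) => {
    const value = env[key]
    return value === undefined || value.trim() === ""
  })
}

// At boot:
// const missing = missingAnalyticsKeys(process.env)
// if (missing.length > 0) console.warn(`Analytics disabled: missing ${missing.join(", ")}`)
```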
Frontend Implementation
Smart Initialization
I don’t slap PostHog everywhere—it pays to initialize it thoughtfully:
import type { App } from "vue"
import posthog from "posthog-js"

// `env` and `isStorageAvailable` are local helpers: typed access to
// import.meta.env and a localStorage availability probe.
export default {
  install(app: App) {
    if (!env.VITE_POSTHOG_TOKEN) {
      return
    }
    const storageAvailable = isStorageAvailable()
    app.config.globalProperties.$posthog = posthog.init(env.VITE_POSTHOG_TOKEN, {
      api_host: "https://us.i.posthog.com",
      person_profiles: "identified_only", // or "always" to create profiles for anonymous users as well
      disable_surveys: !storageAvailable,
      session_recording: {
        maskAllInputs: false,
        maskInputOptions: {
          password: true,
        },
      },
    })
    if (env.VITE_RENDER_GIT_COMMIT && env.VITE_RENDER_GIT_COMMIT.trim() !== "") {
      // Tag the whole session with the deployed commit so recordings and
      // errors can be correlated with a specific deploy
      posthog.register_for_session({
        git: env.VITE_RENDER_GIT_COMMIT,
      })
      posthog.setPersonProperties({ lastSeen: new Date().toISOString() })
    }
  },
}
A few decisions worth explaining:
I set person_profiles: 'identified_only' so PostHog only creates person profiles for users I can identify. This protects privacy while still giving me meaningful data.
I tag each session with the git commit hash, which helps correlate issues with specific deployments. I also set lastSeen here, and firstSeen (as a set-once property) as early in the page lifecycle as I can; since my site supports light and dark themes, the theme-initialization code runs early and is one of the first places PostHog gets a chance to run.
I disable surveys when localStorage isn't available; I added this after suspecting the surveys feature was causing errors in production for users without storage access.
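The isStorageAvailable() call above isn't part of posthog-js; it's a small helper of my own. A sketch of what such a check can look like, with the storage object injectable so it's easy to test (the write-then-remove probe catches browsers that expose the API but throw on write):

```typescript
// Minimal structural type so this compiles without the DOM lib.
type StorageLike = {
  setItem(key: string, value: string): void
  removeItem(key: string): void
}

// Probe storage with a throwaway write: some browsers expose localStorage
// but throw on write (e.g. Safari private browsing with a zero quota).
export function isStorageAvailable(
  storage: StorageLike | undefined = (globalThis as any).localStorage,
): boolean {
  if (!storage) return false
  try {
    const probe = "__storage_probe__"
    storage.setItem(probe, probe)
    storage.removeItem(probe)
    return true
  } catch {
    return false
  }
}
```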
Session Recording with Privacy
Session recordings are invaluable for understanding user behavior, but I’m careful about privacy:
// Only start recording when storage is available; password fields are
// already masked via maskInputOptions in the init config above
if (isStorageAvailable()) {
  posthog.startSessionRecording()
}
I always check localStorage availability first; nothing is worse than breaking your app because analytics tried to start in an unsupported environment.
Backend Implementation
Singleton Pattern
I use a singleton pattern for my backend PostHog client to avoid multiple instances:
import { PostHog } from 'posthog-node'

// `env` and `logger` are the app's own config and logging helpers
let posthogClient: PostHog | null = null

export function getPostHogClient(): PostHog | null {
  if (posthogClient) return posthogClient
  if (!env.POSTHOG_API_KEY) {
    logger.info("PostHog API key not configured")
    return null
  }
  posthogClient = new PostHog(env.POSTHOG_API_KEY, {
    host: 'https://us.i.posthog.com',
    flushAt: 20,          // send after 20 buffered events
    flushInterval: 10000, // or every 10 seconds, whichever comes first
  })
  return posthogClient
}
This approach prevents connection issues and ensures consistent configuration across my backend services.
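Because posthog-node buffers events (the flushAt/flushInterval settings above), an abrupt process exit can drop the tail of the queue. A hedged sketch of how a shutdown hook can be wired; the interface is narrowed to just the shutdown() method posthog-node exposes, and the signal-handling convention is my own:

```typescript
// Anything with posthog-node's shutdown() method, which flushes pending
// events and closes the client.
interface FlushableClient {
  shutdown(): Promise<void>
}

// Returns true if a client existed and was shut down, false otherwise.
export async function shutdownAnalytics(
  client: FlushableClient | null,
): Promise<boolean> {
  if (!client) return false
  await client.shutdown()
  return true
}

// e.g. in a Fastify onClose hook or a SIGTERM handler:
// process.once("SIGTERM", () => { void shutdownAnalytics(getPostHogClient()) })
```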
Event Strategy: Quality Over Quantity
Centralized Event Management
// frontend/src/lib/posthog-events.ts
export const LOGIN_BLUESKY_EVENT = "login_bluesky"
export const SCHEDULED_POST_EVENT = "scheduled_post"
export const SUBSCRIBED_EVENT = "subscribed"
// ... organize all events by category
This prevents typos, makes refactoring easier, and gives me TypeScript autocomplete.
Rich Event Properties
I always include context with events:
posthog.capture(SCHEDULED_POST_EVENT, {
  count: promises.length,
  source: "follow back cta",
  socialNetwork: "bluesky",
  scheduledAt: timestamp,
})
This makes the data actually useful when I look at it later.
User Identification Strategy
Email-Based Consistency
The most important thing about analytics is maintaining user identity across devices and platforms. I use a normalized email-based approach:
export function trackingIdentifierForEmail(email: string): string {
  return email.toLowerCase().trim()
}

// Frontend identification
function identifyUserInPosthog(email: string, subscriptionData: any) {
  const distinctID = trackingIdentifierForEmail(email)
  posthog.identify(distinctID, {
    email,
    hasSubscription: subscriptionData.hasSubscription,
    subscriptionType: subscriptionData.type,
    // Add other relevant properties
  })
}
Handling User ID Migrations
Users change identifiers over time. I handle transitions gracefully (because I used to use Bluesky handles as identity):
// Migrate from .bluesky suffix to email-based IDs
if (oldDistinctID.endsWith('.bluesky')) {
  posthog.alias(newDistinctID, oldDistinctID)
}
posthog.identify(newDistinctID, personProperties)
It is critical to call .alias() before .identify(). I made the mistake of doing it the other way around previously.
This ensures historical data stays linked to users as my identification strategy evolves.
Error Tracking Integration
Error tracking is one of PostHog’s most powerful features. But I don’t capture everything—I’m strategic:
try {
  await riskyOperation()
} catch (error) {
  logger.error("Operation failed", { error, stack: error.stack })
  if (!isExpectedError(error)) {
    posthog.captureException(error)
  }
}
I track unexpected errors, API failures, and validation issues. I skip expected failures like expired links, Safari compatibility issues, and intentional rejections. This keeps the error dashboard useful instead of noisy.
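The isExpectedError predicate above is my own, not a PostHog helper, and the rules depend entirely on your app. One sketch matching the categories I skip; the patterns are illustrative, not exhaustive:

```typescript
// Errors I deliberately don't report: expected business failures and
// known-noisy environment quirks. Everything else goes to PostHog.
const EXPECTED_PATTERNS: RegExp[] = [
  /link (has )?expired/i,        // expired magic links
  /user (cancelled|rejected)/i,  // intentional rejections
  /ResizeObserver loop/i,        // benign browser noise
]

export function isExpectedError(error: unknown): boolean {
  const message = error instanceof Error ? error.message : String(error)
  return EXPECTED_PATTERNS.some((pattern) => pattern.test(message))
}
```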
Advanced Backend Events
Immediate Flushing for Critical Events
Some events need immediate attention:
export function capturePaymentEvent(email: string, event: string, properties: object) {
  const posthog = getPostHogClient()
  if (!posthog) return
  posthog.capture({
    distinctId: trackingIdentifierForEmail(email),
    event,
    properties,
  })
  // Flush immediately so critical events aren't lost if the process exits
  void posthog.flush()
}
Email Campaign Analytics
I track my email effectiveness systematically:
posthog.capture({
  distinctId: trackingIdentifierForEmail(email),
  event: SEND_ACTIVATION_EMAIL_EVENT,
  properties: {
    reason: "inactive_user",
    daysSinceLastLogin: 30,
  },
})
This helps me understand which campaigns drive real user engagement. I create PostHog funnels to observe the impact of (re)activation emails.
Data Query and Analysis
Backend Query Library
Sometimes I need to pull analytics data programmatically:
export async function fetchPersonByDistinctID(distinctID: string) {
  const response = await fetch(
    `https://us.i.posthog.com/api/persons/?distinct_id=${encodeURIComponent(distinctID)}`,
    {
      headers: { 'Authorization': `Bearer ${env.POSTHOG_PERSONAL_API_KEY}` },
    },
  )
  if (!response.ok) {
    throw new Error(`PostHog persons API returned ${response.status}`)
  }
  return response.json()
}
This enables custom dashboards, user lookup tools, and automated reporting.
Operational Best Practices
Environment-Specific Behavior
I keep analytics off in development by simply not setting a PostHog token there; the init code logs and bails instead of failing silently:
if (!import.meta.env.PROD && !posthogToken) {
  console.log("PostHog disabled in development")
  return
}
Debugging and Maintenance
I include context in all events for easier troubleshooting:
// Structured error logging
logger.error({
  error: errorObject,
  stack: errorObject.stack,
  distinctId: userIdentifier,
}, "PostHog event capture failed")
Conversion Tracking
I prevent duplicate conversions with careful state management:
if (!conversionShown.value) {
  posthog.capture(SUBSCRIBED_EVENT, {
    plan: priceID,
    value: amount,
    currency: "USD",
  })
  conversionShown.value = true
}
With this setup, I can trace a user from first visit through to conversion, see where they drop off, and catch errors before they report them. PostHog’s free tier covers all of this, which is great when you don’t want to pay hundreds of dollars a month for analytics.
What’s your experience with analytics implementation? Send me your tips and challenges via email at hboon@motionobj.com. Thanks!