Building Production Analytics with PostHog: A Complete Implementation Guide
I needed analytics for my projects, but I wanted something that wouldn’t break the bank or require a PhD to set up. I tried a few options, but PostHog ended up being my go-to solution because it combines everything I need in one package—product analytics, session recordings, feature flags, and error tracking.
The best part? It has a generous free tier, and you can self-host if you need to. But getting the most out of PostHog requires more than just dropping in a snippet. After implementing it across multiple projects, I want to share how I built a production-ready analytics system that scales with my applications.
My Architecture: Frontend and Backend Working Together
The key to effective analytics is capturing user interactions consistently across your entire stack. Here’s how I approach it:
- Frontend: I track user interactions, page views, button clicks, form submissions, and client-side errors
- Backend: I capture payment events, email campaigns, user lifecycle events, and server-side operations
- Shared: I maintain consistent user identification across both platforms
This dual approach gives me a complete picture of the user journey, from initial discovery to long-term engagement.
I use Vue for my frontend and Fastify for my backend, but the principles apply to any frontend and backend stack.
Getting Started: Environment Setup
First, I set up my environment variables like this:
```shell
# Frontend (.env)
VITE_POSTHOG_TOKEN=phc_your_project_api_key

# Backend (.env)
POSTHOG_API_KEY=phc_your_project_api_key
POSTHOG_PERSONAL_API_KEY=phx_your_personal_api_key
```
The PERSONAL_API_KEY is needed for backend data queries and user management operations.
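A swapped or truncated key is easy to miss until events silently stop arriving, so I like to sanity-check the prefixes at startup. A minimal sketch; these helper names are mine, not a PostHog API:

```typescript
// Hypothetical startup checks: PostHog project keys start with "phc_",
// personal API keys with "phx_". Fail fast if they look swapped or missing.
export function looksLikeProjectKey(key: string | undefined): boolean {
  return typeof key === "string" && key.startsWith("phc_")
}

export function looksLikePersonalKey(key: string | undefined): boolean {
  return typeof key === "string" && key.startsWith("phx_")
}
```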
Frontend Implementation
Smart Initialization
I don’t slap PostHog everywhere—it pays to initialize it thoughtfully:
```typescript
import type { App } from "vue"
import posthog from "posthog-js"

import { env } from "@/lib/env"                    // my typed env helper
import { isStorageAvailable } from "@/lib/storage" // my storage probe

export default {
  install(app: App) {
    // No token configured (e.g. local dev): skip analytics entirely.
    if (!env.VITE_POSTHOG_TOKEN) {
      return
    }

    const storageAvailable = isStorageAvailable()

    app.config.globalProperties.$posthog = posthog.init(env.VITE_POSTHOG_TOKEN, {
      api_host: "https://us.i.posthog.com",
      person_profiles: "identified_only", // or "always" to create profiles for anonymous users too
      disable_surveys: !storageAvailable,
      session_recording: {
        maskAllInputs: false,
        maskInputOptions: {
          password: true,
        },
      },
    })

    // Tag the whole session with the deployed commit, if available.
    if (env.VITE_RENDER_GIT_COMMIT && env.VITE_RENDER_GIT_COMMIT.trim() !== "") {
      posthog.register_for_session({
        git: env.VITE_RENDER_GIT_COMMIT,
      })
      posthog.setPersonProperties({ lastSeen: new Date().toISOString() })
    }
  },
}
```
The key decisions I made here:
- person_profiles: 'identified_only': I only create profiles for users I can identify, protecting privacy while still getting meaningful data
- Custom page view handling: this integrates with Vue Router for accurate SPA tracking
- Git commit tagging: this helps correlate issues with specific deployments
- lastSeen: this tells me when a user last visited the site
- firstSeen: elsewhere, I set { firstSeen: new Date().toISOString() } as a set-once person property as early as I can. My site supports light and dark themes, so the theme setup is one of the first places PostHog has a chance to run
- Survey control: I disable surveys in environments where localStorage isn't available, because I suspected unavailable storage was causing errors for me in production
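The init code above calls isStorageAvailable(), which is my own helper, not a PostHog API. Here's a sketch of how such a probe can work, with the storage object injectable so it's testable outside a browser:

```typescript
// Sketch of an isStorageAvailable() helper. Safari in private mode and some
// embedded webviews throw on localStorage writes, so we probe with a real
// setItem/removeItem round trip instead of just checking for existence.
export function isStorageAvailable(
  storage: Pick<Storage, "setItem" | "removeItem"> | undefined = (globalThis as any).localStorage
): boolean {
  if (!storage) return false
  try {
    const probe = "__storage_probe__"
    storage.setItem(probe, probe)
    storage.removeItem(probe)
    return true
  } catch {
    // Quota errors or security errors mean storage is effectively unusable.
    return false
  }
}
```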
Session Recording with Privacy
Session recordings are invaluable for understanding user behavior, but I’m careful about privacy:
```typescript
posthog.startSessionRecording()
// Password fields are automatically masked (see maskInputOptions above)
// Check storage availability before enabling
```
I always check localStorage availability first: nothing is worse than breaking your app because analytics tried to start in an unsupported environment.
Backend Implementation
Singleton Pattern
I use a singleton pattern for my backend PostHog client to avoid multiple instances:
```typescript
import { PostHog } from "posthog-node"

import { env } from "./env"       // my typed env helper
import { logger } from "./logger" // my logger instance

let posthogClient: PostHog | null = null

export function getPostHogClient(): PostHog | null {
  if (posthogClient) return posthogClient

  if (!env.POSTHOG_API_KEY) {
    logger.info("PostHog API key not configured")
    return null
  }

  posthogClient = new PostHog(env.POSTHOG_API_KEY, {
    host: "https://us.i.posthog.com",
    flushAt: 20,         // send after 20 queued events...
    flushInterval: 10000 // ...or after 10 seconds, whichever comes first
  })

  return posthogClient
}
```
This approach prevents connection issues and ensures consistent configuration across my backend services.
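Stripped of the PostHog specifics, the getter above is a lazy-singleton pattern. A generic sketch of the same idea (the names here are mine):

```typescript
// Generic lazy singleton: run the factory at most once per process and
// reuse the result. If the factory returns null (e.g. missing config),
// later calls retry, mirroring the getPostHogClient behavior above.
export function lazySingleton<T>(factory: () => T | null): () => T | null {
  let instance: T | null = null
  return () => {
    if (instance) return instance
    instance = factory()
    return instance
  }
}
```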
Event Strategy: Quality Over Quantity
Centralized Event Management
```typescript
// frontend/src/lib/posthog-events.ts
export const LOGIN_BLUESKY_EVENT = "login_bluesky"
export const SCHEDULED_POST_EVENT = "scheduled_post"
export const SUBSCRIBED_EVENT = "subscribed"
// ... organize all events by category
```
This prevents typos, enables easy refactoring, provides TypeScript autocomplete, and ensures consistency across my entire stack.
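Taking this a step further, deriving a union type from the constants lets the compiler reject unknown event names entirely. A sketch of that idea; captureTyped is a hypothetical wrapper, not part of posthog-js:

```typescript
// Hypothetical typed event registry: the union type is derived from the
// constants, so a capture() call with a misspelled name fails to compile.
export const EVENTS = {
  loginBluesky: "login_bluesky",
  scheduledPost: "scheduled_post",
  subscribed: "subscribed",
} as const

export type EventName = (typeof EVENTS)[keyof typeof EVENTS]

// Thin wrapper over any capture function (frontend or backend) that only
// accepts registered event names.
export function captureTyped(
  capture: (event: string, props?: Record<string, unknown>) => void,
  event: EventName,
  props?: Record<string, unknown>
): void {
  capture(event, props)
}
```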
Rich Event Properties
I don’t just capture events—I capture context:
```typescript
posthog.capture(SCHEDULED_POST_EVENT, {
  count: promises.length,
  source: "follow back cta",
  socialNetwork: "bluesky",
  scheduledAt: timestamp,
})
```
Rich properties turn raw data into actionable insights.
User Identification Strategy
Email-Based Consistency
The most important thing about analytics is maintaining user identity across devices and platforms. I use a normalized email-based approach:
```typescript
export function trackingIdentifierForEmail(email: string): string {
  return email.toLowerCase().trim()
}

// Frontend identification
function identifyUserInPosthog(email: string, subscriptionData: any) {
  const distinctID = trackingIdentifierForEmail(email)
  posthog.identify(distinctID, {
    email,
    hasSubscription: subscriptionData.hasSubscription,
    subscriptionType: subscriptionData.type,
    // ...other relevant properties
  })
}
```
Handling User ID Migrations
Users change identifiers over time. I handle transitions gracefully (because I used to use Bluesky handles as identity):
```typescript
// Migrate from .bluesky-suffixed IDs to email-based IDs
if (oldDistinctID.endsWith(".bluesky")) {
  posthog.alias(newDistinctID, oldDistinctID)
}
posthog.identify(newDistinctID, personProperties)
```
It is critical to call .alias() before .identify(). I previously made the mistake of calling them in the other order.
This ensures historical data stays linked to users as my identification strategy evolves.
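One way to make the alias-before-identify ordering impossible to get wrong is to compute the calls as data first, then execute them in sequence. A sketch; planIdentityOps is a hypothetical helper, not a PostHog API:

```typescript
type IdentityOp =
  | { op: "alias"; newId: string; oldId: string }
  | { op: "identify"; id: string }

// Returns the identity calls to make, in the order they must run:
// alias() first (only when migrating a legacy ID), then identify().
export function planIdentityOps(newId: string, oldId?: string): IdentityOp[] {
  const ops: IdentityOp[] = []
  if (oldId && oldId !== newId && oldId.endsWith(".bluesky")) {
    ops.push({ op: "alias", newId, oldId })
  }
  ops.push({ op: "identify", id: newId })
  return ops
}
```

The caller then maps each op to the corresponding posthog call, so the ordering lives in one tested function instead of being repeated at every call site.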
Error Tracking Integration
Error tracking is one of PostHog’s most powerful features. But I don’t capture everything—I’m strategic:
```typescript
try {
  await riskyOperation()
} catch (error) {
  logger.error("Operation failed", {
    error,
    stack: error instanceof Error ? error.stack : undefined,
  })
  if (!isExpectedError(error)) {
    posthog.captureException(error)
  }
}
```
I track: unexpected errors, API failures, validation issues.
I don't track: expected failures (expired links, Safari compatibility issues, intentional rejections).
This approach keeps my error dashboard clean and focused on actual problems that need attention.
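The isExpectedError() gate above is my own helper, not a PostHog API. Here's one way to sketch it, matching error names and messages against a known-benign list; the patterns below are illustrative, not my actual list:

```typescript
// Illustrative patterns for failures I consider expected and don't want
// cluttering the error dashboard. Each project's list will differ.
const EXPECTED_PATTERNS: RegExp[] = [
  /link.*expired/i,      // e.g. a user clicking a stale magic link
  /AbortError/,          // user navigated away mid-request
  /ResizeObserver loop/, // well-known benign browser noise
]

export function isExpectedError(error: unknown): boolean {
  const text =
    error instanceof Error ? `${error.name}: ${error.message}` : String(error)
  return EXPECTED_PATTERNS.some((pattern) => pattern.test(text))
}
```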
Advanced Backend Events
Immediate Flushing for Critical Events
Some events need immediate attention:
```typescript
export function capturePaymentEvent(email: string, event: string, properties: Record<string, unknown>) {
  const posthog = getPostHogClient()
  if (!posthog) return

  posthog.capture({
    distinctId: trackingIdentifierForEmail(email),
    event,
    properties,
  })

  // Payment events are too important to sit in the batch queue:
  // flush immediately instead of waiting for flushAt/flushInterval
  void posthog.flush()
}
```
Email Campaign Analytics
I track my email effectiveness systematically:
```typescript
// Backend capture (posthog-node uses the object form)
posthog.capture({
  distinctId: trackingIdentifierForEmail(email),
  event: SEND_ACTIVATION_EMAIL_EVENT,
  properties: {
    reason: "inactive_user",
    daysSinceLastLogin: 30,
  },
})
```
This helps me understand which campaigns drive real user engagement. I create PostHog funnels to observe the impact of (re)activation emails.
Data Query and Analysis
Backend Query Library
Sometimes I need to pull analytics data programmatically:
```typescript
export async function fetchPersonByDistinctID(distinctID: string) {
  const response = await fetch(
    `https://us.i.posthog.com/api/persons/?distinct_id=${encodeURIComponent(distinctID)}`,
    {
      headers: { Authorization: `Bearer ${env.POSTHOG_PERSONAL_API_KEY}` },
    }
  )
  if (!response.ok) {
    throw new Error(`PostHog persons API returned ${response.status}`)
  }
  return response.json()
}
```
This enables custom dashboards, user lookup tools, and automated reporting.
Operational Best Practices
Environment-Specific Behavior
I disable analytics in development to prevent noise:
```typescript
if (!import.meta.env.PROD && !posthogToken) {
  console.log("PostHog disabled in development")
  return
}
```
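The same intent can be factored into a pure, testable function. Note this sketch is deliberately stricter than my inline check: it never enables tracking outside production builds, even if a token happens to be set (an assumption on my part):

```typescript
// Pure guard for whether analytics should initialize (name is mine).
export function shouldEnableAnalytics(isProd: boolean, token: string | undefined): boolean {
  if (!token) return false // never initialize without a project key
  return isProd            // skip dev/preview builds to avoid noisy data
}
```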
Debugging and Maintenance
I include context in all events for easier troubleshooting:
```typescript
// Structured error logging
logger.error({
  error: errorObject,
  stack: errorObject.stack,
  distinctId: userIdentifier,
}, "PostHog event capture failed")
```
Conversion Tracking
I prevent duplicate conversions with careful state management:
```typescript
if (!conversionShown.value) {
  posthog.capture(SUBSCRIBED_EVENT, {
    plan: priceID,
    value: amount,
    currency: "USD",
  })
  conversionShown.value = true
}
```
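The conversionShown ref is really a once-per-session guard, and the same idea generalizes to any event key. A sketch; makeOnce is a name I made up for illustration:

```typescript
// Generic once-per-key guard: the callback fires at most once per key
// for the lifetime of this guard (e.g. one page session).
export function makeOnce(): (key: string, fn: () => void) => void {
  const fired = new Set<string>()
  return (key, fn) => {
    if (fired.has(key)) return
    fired.add(key)
    fn()
  }
}
```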
What I Get: Actionable Insights
With this setup, I’m able to:
- Understand user journeys: Track from initial visit to conversion
- Identify friction points: See where users drop off or encounter errors
- Measure feature adoption: Know which features drive engagement
- Optimize conversions: Understand what drives paying customers
- Debug issues proactively: Catch problems before users report them
Getting Started
- Start small: Begin with basic page views and key actions
- Iterate: Add more events as you understand your users better
- Review regularly: Clean up unused events and refine tracking
- Share insights: Make analytics data useful across your team
Analytics should drive action, not just collect data. I focus on tracking what helps me build better products for my users.
Good analytics has made a tangible difference in my projects. Being able to understand user behavior without paying hundreds of dollars per month has been a game changer.
What’s your experience with analytics implementation? Send me your tips and challenges via email at hboon@motionobj.com. Thanks!