denysdotme
May 2025
MongoDB · React · Node.js · TensorFlow

Why Did Trendflow Come To Be?

I have found, time and again, that not knowing hurts more than being wrong. There is a lot to unpack here, but in the simplest terms: many real-world projects solve real developer problems, and staying in the know about new projects is the healthiest way to become a better developer. Whether it is new practices, new technologies, or new projects, the best thing is to be aware of them so that in your next project you are not caught off guard. It is also true that the best way to learn is hands-on, and TrendFlow encourages you to open and start your own simple or complex project in whatever tech you discover. Finally, here is a brief technical blog on the key features and technologies TrendFlow uses:

Introduction

What is TrendFlow's real goal? It is a full-featured MERN (MongoDB, Express, React, Node.js) application designed to help developers and product teams discover emerging technology trends. By combining historical metrics with real-time data feeds and machine learning predictions, TrendFlow delivers accurate rankings and forecasts that power dashboards, newsletters, and alert systems.

Architecture Overview

  • Frontend: React SPA with server-side rendering (SSR) for SEO benefits and fast first-paint.
  • API Layer: Express.js server exposing REST endpoints for trends, user profiles, and forecasts.
  • Database: MongoDB for flexible document storage, with strategic indexing and caching.
  • Machine Learning: TensorFlow model served through a Flask micro-service for generating TrendFlow's trend data and forecasts.
  • Real-Time Pipeline: Change Streams and WebSockets to push updates when view counts or sentiment spikes.
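
As a rough sketch of that real-time pipeline (a minimal example assuming Mongoose and Socket.IO; the model path, event name, and httpServer wiring are illustrative, and Change Streams require a replica set):

import { Server } from 'socket.io';
import TrendModel from '../models/trendModel.js'; // illustrative path

const io = new Server(httpServer, { cors: { origin: '*' } }); // httpServer created elsewhere

// watch the trends collection and push every update to connected dashboards
const changeStream = TrendModel.watch([], { fullDocument: 'updateLookup' });
changeStream.on('change', (change) => {
  if (change.operationType === 'update') {
    io.emit('trend:updated', change.fullDocument); // event name is illustrative
  }
});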

Scoring Methodology

Besides the generated information about each trend (a mini get-started blog with highlighted strengths) that specifically details its use and edge, each trend in TrendFlow receives metrics that showcase its current polarity on the web. This is deliberately designed not to discourage use of a particular trend, but to showcase the wider adoption that signals more support. A small example is the set of weights used to compute the combined score derived from multiple metrics (a sketch of how they roll up follows the list below):

export const weights = {
  t_score: 0.4,
  views: 0.3,
  trendStatus: 0.1,
  f_score: 0.2,
};
  • t_score: Historical growth metric computed from past data.
  • views: Recent user views, capturing popularity.
  • trendStatus: Categorical boost (e.g., “breakout,” “cool-off”).
  • f_score: Forecasted future score from the ML micro-service.
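
A minimal sketch of how these weights might roll up into a single combined score; the computeCombinedScore helper and the assumption that each metric is already normalized to a 0-1 range are illustrative, not TrendFlow's exact implementation:

import { weights } from './weights.js'; // the weights object shown above (path illustrative)

// hypothetical helper: each metric is assumed pre-normalized to a 0-1 range
export const computeCombinedScore = ({ t_score, views, trendStatus, f_score }) =>
  t_score * weights.t_score +
  views * weights.views +
  trendStatus * weights.trendStatus +
  f_score * weights.f_score;

// e.g. computeCombinedScore({ t_score: 0.8, views: 0.6, trendStatus: 1, f_score: 0.7 }) ≈ 0.74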

Real-Time Adjustments

The app is designed to let users filter quickly and precisely by the categories that matter to them. To maintain performance, the app fetches only a limited number of trends per request and uses cursor-based pagination. Instead of skipping N × page rows, which gets slower the deeper you go, paginateAndSortCursor turns the last record from the previous batch into a bookmark and asks MongoDB for “everything after this document.” When a cursor query-string is present it is split into updatedAt and _id, then folded into an $or that keeps the sort order stable but guarantees no duplicates:

const [updatedAtCursor, idCursor] = cursor.split('|');
// resume strictly after the bookmarked document, with _id as the tie-breaker
queryObject.$or = [
  { updatedAt: { $lt: new Date(updatedAtCursor) } },
  {
    updatedAt: new Date(updatedAtCursor),
    _id: { $gt: idCursor },
  },
];

Because we ask for limit + 1 items, the function can tell if there’s a next page without an extra round-trip. After trimming the extra doc, it forges the next bookmark so the client can keep scrolling forever:

const lastTrend = trends[trends.length - 1];
nextCursor = `${lastTrend.updatedAt.toISOString()}|${lastTrend._id}`;
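
A condensed sketch of that limit + 1 trick; variable names follow the snippets in this section, but the query setup and the TrendModel name are assumptions:

// fetch one extra document so a next page can be detected without a count query
const trends = await TrendModel.find(queryObject)
  .sort({ updatedAt: -1, _id: 1 })
  .limit(limit + 1);

const hasNextPage = trends.length > limit;
if (hasNextPage) trends.pop(); // trim the extra doc before responding

let nextCursor = null;
if (hasNextPage) {
  const lastTrend = trends[trends.length - 1];
  nextCursor = `${lastTrend.updatedAt.toISOString()}|${lastTrend._id}`;
}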

Score Management

  • Scores Updated Weekly: scores are recomputed weekly by a Cron job based on each trend's view count. In updateScores, bulkWrite is used to keep memory use low and apply every update in a single round-trip to Mongo (a sketch follows below).
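
A hedged sketch of what that weekly job could look like, using node-cron and Mongoose bulkWrite; the schedule, the recalculateScore helper, and the model path are assumptions rather than TrendFlow's exact code:

import cron from 'node-cron';
import TrendModel from '../models/trendModel.js'; // illustrative path

// every Sunday at 03:00: recompute scores and apply them in one bulk round-trip
cron.schedule('0 3 * * 0', async () => {
  const trends = await TrendModel.find({}).lean();
  const ops = trends.map((trend) => ({
    updateOne: {
      filter: { _id: trend._id },
      update: { $set: { combinedScore: recalculateScore(trend) } }, // stand-in for the real scoring logic
    },
  }));
  if (ops.length) await TrendModel.bulkWrite(ops); // single round-trip, modest memory overhead
});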

A Little on Security

Like any modern application, TrendFlow uses JWTs. With JWTs and validation middleware layers, TrendFlow puts a triple lock on every request: validate it, stamp it, then check whether the stamp has any rights. However, one goal of this project was to keep access easy even when a user has no desire to waste time on account creation. To do this securely, authenticateUser first tries to verify a JWT; if none exists, it gracefully downgrades to a time-boxed guestUser pulled from a cookie, handy for read-only browsing without forcing sign-up. Any tampered or expired token blows up with UnauthenticatedError, keeping session spoofing at bay.

export const authenticateUser = async (req, res, next) => {
  const { token, guestUserID } = req.cookies;
  if (!token && guestUserID) {
    const guest = await UserModel.findById(guestUserID);
    if (guest && guest.role === 'guestUser' && new Date() < guest.expiresAt) {
      req.user = { userID: guest._id, role: guest.role };
      return next();
    }
  }
  try {
    // verifyJWT stands in for the project's JWT helper (name illustrative)
    const { userID, role } = verifyJWT(token);
    req.user = { userID, role };
    next();
  } catch (error) {
    throw new UnauthenticatedError('authentication invalid');
  }
};

In this small example the middleware falls back to a time-boxed guest account when no JWT is present, keeping friction low while still tracking abuse. Downstream, authorizedPermissions('write') gates routes by comparing the caller’s role to a tiny in-memory ACL—no DB hop needed.
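
A minimal sketch of what that role gate might look like; the ACL contents and the UnauthorizedError name are illustrative:

// tiny in-memory ACL mapping roles to allowed actions (contents illustrative)
const acl = {
  admin: ['read', 'write', 'delete'],
  user: ['read', 'write'],
  guestUser: ['read'],
};

export const authorizedPermissions = (...actions) => (req, res, next) => {
  const allowed = acl[req.user?.role] || [];
  if (!actions.every((action) => allowed.includes(action))) {
    throw new UnauthorizedError('not authorized to access this route');
  }
  next();
};

A route then opts in with something like router.post('/trends', authorizedPermissions('write'), createTrend), where the route and handler names are again illustrative.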

Security highlights: secrets live only in process.env, validation strips XSS and role-escalation attempts, JWTs are signed with a server-side key, and every custom error bubbles to one centralized handler—no ugly stack traces in production. Net result: clean, fast, abuse-resistant endpoints without over-engineering.
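
For completeness, a centralized error handler along those lines might look like the following minimal sketch; the status-code default and fallback message are assumptions:

// last middleware in the chain: every custom error bubbles up to here
const errorHandlerMiddleware = (err, req, res, next) => {
  const statusCode = err.statusCode || 500;
  const msg = err.message || 'something went wrong, try again later';
  res.status(statusCode).json({ msg }); // no raw stack traces reach the client
};

export default errorHandlerMiddleware;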

MongoDB Indexing

Efficient queries depend on strategic indexes:

import mongoose from 'mongoose';

const trendSchema = new mongoose.Schema({ /* fields omitted */ }, { timestamps: true });

trendSchema.index({ combinedScore: -1 });
trendSchema.index({ views: -1 });
trendSchema.index({ createdAt: -1 });
trendSchema.index({ statusBoost: -1 });
  • Compound Indexes can be added for multi-field sorts (e.g., {statusBoost: -1, views: -1}).
  • TTL Indexes manage archival of stale trends after a set duration.
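
For illustration, those two bullet points might translate into something like this; the archivedAt field and the 90-day window are assumptions:

// compound index supporting a multi-field sort such as { statusBoost: -1, views: -1 }
trendSchema.index({ statusBoost: -1, views: -1 });

// TTL index: documents expire 90 days after archivedAt is set (field name assumed)
trendSchema.index({ archivedAt: 1 }, { expireAfterSeconds: 60 * 60 * 24 * 90 });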

TensorFlow Data Processing Micro-service (forecast)

One interesting feature implemented in TrendFlow is forecasting. For forecasting, a Python micro-service spins up NeuralProphet (essentially Facebook Prophet with a PyTorch heart), configured for weekly seasonality only, to keep the network tiny:

from neuralprophet import NeuralProphet

model = NeuralProphet(yearly_seasonality=False,
                      weekly_seasonality=True,
                      daily_seasonality=False)
metrics  = model.fit(df, freq='W', progress='none')
forecast = model.predict(model.make_future_dataframe(df, periods=12, n_historic_predictions=True))

Because the model is small and GPU-less (torch==2.2.2+cpu), training on a few dozen points takes well under a second.

The forecast engine turns a raw list of weekly {date, count} objects into three things: a categorical status (trending, breakout, cool-off …), a 12-week forecast of counts, and an f-score that normalizes how strong that forecast looks.

model.fit(df, freq='W', progress='none')
future = model.make_future_dataframe(
    df,
    periods=12,
    n_historic_predictions=True
)
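
On the Node side, the API presumably calls this micro-service over HTTP; a hedged sketch, where the /forecast route, payload shape, and response field names are assumptions (the port matches the compose file in the deployment section):

// POST the weekly counts to the Flask service and read back status, forecast, and f-score
const response = await fetch('http://forecast:5001/forecast', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ data: weeklyCounts }), // [{ date, count }, ...]
});
const { trend_status, forecast, f_score } = await response.json();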

Deployment & Scalability (in progress)

TrendFlow uses Docker Compose for local dev and Kubernetes for production:

version: '3.8'
services:
  api:
    build: ./server
    ports: ['4000:4000']
    depends_on: ['mongo', 'redis', 'forecast']
  client:
    build: ./client
    ports: ['3000:3000']
  mongo:
    image: mongo:6
    volumes: ['./data/db:/data/db']
  redis:
    image: redis:7
  forecast:
    build: ./forecast_service
    ports: ['5001:5001']
  • Auto-Scaling: Kubernetes HorizontalPodAutoscaler on API and forecast pods.
  • Load Balancing: NGINX ingress routing traffic.
  • CI/CD: GitHub Actions pipelines build and push Docker images, then update Kubernetes deployments with zero-downtime rolling updates.

Future Enhancements

Caching Strategy

To minimize database load and reduce latency on high-traffic endpoints, Redis will be implemented for caching:

import Redis from 'ioredis';

const redis = new Redis();
export async function cache(key, fetchFn) {
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);
  const result = await fetchFn();
  await redis.set(key, JSON.stringify(result), 'EX', 60);
  return result;
}
  • Invalidation: Whenever underlying data changes (e.g., a trend score update), the cache key is deleted.
  • Granularity: Cache is scoped per endpoint and query parameters to avoid stale mixes.
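
A possible call site for the helper above; the key scheme, the TrendModel query, and the 25-item limit are illustrative:

// repeated hits within 60 seconds are served straight from Redis
const topTrends = await cache('trends:top:combinedScore', () =>
  TrendModel.find().sort({ combinedScore: -1 }).limit(25).lean()
);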

Highlighting a Few TrendFlow Benefits

  1. SEO & Performance: SSR + caching accelerates page loads and improves search rankings.
  2. Real-Time Insights: Change Streams and WebSockets keep dashboards live.
  3. Scalable Data Access: Indexing + cursor pagination ensures consistent response times.
  4. Modular ML: Flask micro-service isolates TensorFlow concerns.
  5. Customizable Scoring: Simple weight tweaks adapt to evolving business priorities.
  6. Hardened Auth: Express-validator blocks malformed payloads; JWT + role-based guards keep routes locked down while still allowing a friction-free guest-user flow.

MERN TL;DR

Consequently, TrendFlow showcases how MERN is a powerful stack for realizing a project's potential quickly. TrendFlow's layers form a tidy playground for anyone who wants to see a full-stack production flow end to end: React Router demonstrates component-scoped data loaders; Express shows how thin routers delegate to testable controllers; and middleware enforces validation, auth, and error hygiene without cluttering business code, all of which makes the project well worth exploring. The project is available on GitHub under the onflowai organization.

Conclusion

My hope is simple: lower the friction between “I heard about X” and “I actually tried X.” TrendFlow helps developers explore tools by linking them straight to simple starter docs and information, and even nudges them in directions they potentially didn't know existed. In the process of trying to be helpful, I have myself grown to explore and learn about an enormous number of developer tools, which keeps feeding my curiosity. By surfacing credible metrics on top of this body of data, developers who browse TrendFlow can pick the right experiments or full-fledged projects knowing the tech is relevant, and walk into the next meeting already speaking the language of tomorrow’s stack.