Building a Full-Stack Trading Simulator: Architecture Lessons from Wall Street Rats

When I started Wall Street Rats, I wanted to create something that felt like a real trading platform. The challenge wasn't just building a frontend; it was orchestrating real-time market data, persisting user state, and delivering a snappy user experience. Here's what I learned.

The Architecture Challenge

I chose a classic stack: Nuxt 3 + Flask backend + PostgreSQL + Redis. The core challenge was handling real-time stock quotes without overwhelming the database or API rate limits.

Solving the Real-Time Data Problem

The Alpaca API provides live market data, but polling it for every user for every stock would be wasteful. I implemented a Redis caching layer that:

  • Fetches quotes once and serves them to all users
  • Implements TTL (time-to-live) on cache entries for freshness
  • Falls back to database for historical data
  • Reduces API calls by ~90%

# Flask backend: cache-aside lookup for quotes
# (the redis and alpaca clients are initialized elsewhere)
import json
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/api/quote/<symbol>')
def get_quote(symbol):
    # Serve from Redis if a fresh entry exists
    cached = redis.get(f'quote:{symbol}')
    if cached:
        return jsonify(json.loads(cached))

    # Cache miss: fetch from Alpaca, then cache for 60 seconds
    quote = alpaca.get_latest_quote(symbol)
    redis.setex(f'quote:{symbol}', 60, json.dumps(quote))
    return jsonify(quote)

Frontend Decisions

On the Vue side, I built a real-time dashboard that updates without page reloads. WebSocket connections would have been ideal, but I wanted to keep the backend simple. Instead, I used adaptive polling that adjusts its interval to market hours:

  • Poll every 1 second during market hours
  • Back off to 30 seconds after hours
  • Deduplicate requests client-side to avoid redundant API calls
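The interval selection above can be sketched in a few lines of Python (the real dashboard does this in the Vue client; `poll_interval` is a hypothetical helper, and it ignores holidays and timezones for brevity):

```python
from datetime import datetime, time

def poll_interval(now: datetime) -> float:
    """Return the polling interval in seconds for a given timestamp."""
    market_open, market_close = time(9, 30), time(16, 0)
    is_weekday = now.weekday() < 5  # Monday=0 .. Friday=4
    if is_weekday and market_open <= now.time() <= market_close:
        return 1.0   # market is open: poll every second
    return 30.0      # nights and weekends: back off to 30 seconds
```

The client-side deduplication is separate: before firing a request, check whether one for the same symbol is already in flight and reuse its result instead.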

Algolia handled full-text search across 5,000+ stocks, turning what would have been a slow, naive SQL text match into sub-millisecond lookups.
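Feeding Algolia is mostly a matter of shaping records with a stable `objectID`. A minimal sketch of that step, assuming stock rows carry a ticker symbol and company name (the `to_algolia_records` helper and the index name are illustrative, not the project's actual code):

```python
def to_algolia_records(stocks):
    """Shape stock rows into Algolia records keyed by ticker symbol."""
    return [
        {'objectID': s['symbol'], 'symbol': s['symbol'], 'name': s['name']}
        for s in stocks
    ]

records = to_algolia_records([
    {'symbol': 'AAPL', 'name': 'Apple Inc.'},
    {'symbol': 'MSFT', 'name': 'Microsoft Corporation'},
])

# Pushing to the index would then look roughly like
# (client setup omitted):
#   index = client.init_index('stocks')
#   index.save_objects(records)
```

Using the ticker as `objectID` makes re-indexing idempotent: pushing the same symbol twice updates the record instead of duplicating it.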

What Shipped vs. What I'd Change

What worked:

  • Separating frontend and backend made scaling easier
  • Docker containerization meant consistent deployments
  • Redis caching was a huge win for performance
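The four services wire together with a short Compose file. A hedged sketch of the shape (service names, ports, and image tags here are assumptions, not the project's actual config):

```yaml
services:
  frontend:            # Nuxt 3 app
    build: ./frontend
    ports: ["3000:3000"]
  backend:             # Flask API
    build: ./backend
    ports: ["5000:5000"]
    depends_on: [db, cache]
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
  cache:
    image: redis:7
```

Keeping the frontend and backend as separate services is what made them independently scalable and deployable.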

What I'd revisit:

  • WebSockets instead of polling for truly real-time updates
  • Message queues (like Celery) for background jobs
  • More comprehensive API tests

Key Takeaway

Building "real" features teaches you real constraints. You can't fake market data; it forces you to think about latency, caching strategies, and graceful degradation. That's the kind of thinking every SWE role values.