A premium, local-first market research and problem discovery platform. Find real user problems, analyze opportunities with AI, validate with multi-source market signals, and discover your next product idea.
| Feature | Description |
|---|---|
| Problem Discovery | Search Reddit, HackerNews, Twitter, ProductHunt, G2, and Capterra for user pain points |
| AI Analysis | Score opportunities 1-10, extract JTBD (jobs-to-be-done), identify target audiences |
| Multi-Source Validation | Validate ideas with Jobs, News, Social, and E-commerce signals via Apify |
| Analytics Dashboard | Visualize trends, platform breakdowns, score distributions |
| Collections | Save & organize promising problems into collections |
| Weekly Digests | Email summaries of top opportunities (via Resend) |
| Export | CSV, JSON, PDF-ready HTML reports |
| Real-time Updates | WebSocket notifications for job completions |
```
┌────────────────────────────────────────────────────────────────┐
│ Frontend (Next.js 16 + React 19 + Shadcn/UI)                   │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐  ...      │
│ │Dashboard │ │ Discover │ │Analytics │ │Opportun- │           │
│ │          │ │          │ │          │ │  ities   │           │
│ └──────────┘ └──────────┘ └──────────┘ └──────────┘           │
│                    │ HTTP/WebSocket                            │
├────────────────────────────────────────────────────────────────┤
│ Backend (FastAPI + SQLAlchemy)                                 │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐           │
│ │Discovery │ │  Data    │ │Analytics │ │ Exports  │           │
│ │  Routes  │ │Providers │ │  Routes  │ │  Routes  │           │
│ └──────────┘ └──────────┘ └──────────┘ └──────────┘           │
│                    │                                           │
├────────────────────────────────────────────────────────────────┤
│ ┌──────────────────┐  ┌──────────────────┐                    │
│ │  PostgreSQL DB   │  │  External APIs   │                    │
│ │  (or SQLite)     │  │ Apify, OpenRouter│                    │
│ └──────────────────┘  └──────────────────┘                    │
└────────────────────────────────────────────────────────────────┘
```
| Requirement | Version | Notes |
|---|---|---|
| Python | 3.11+ | Tested with 3.13 |
| Node.js | 18+ | Tested with 20.x |
| PostgreSQL | 14+ | Recommended for production |
| npm or pnpm | Latest | Package manager |
- Apify API Token - For problem discovery and multi-source validation
- OpenRouter API Key - For AI-powered opportunity analysis
- Resend API Key - For sending weekly digest emails
```bash
git clone https://github.com/your-repo/reddit-ops-console.git
cd reddit-ops-console
```

Using Docker (easiest):

```bash
docker run -d \
  --name reddit-ops-db \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres123 \
  -e POSTGRES_DB=reddit_ops \
  -p 5432:5432 \
  postgres:14
```

Or use an existing PostgreSQL installation and create the database:
```sql
CREATE DATABASE reddit_ops;
```

```bash
# Navigate to backend
cd apps/backend

# Create virtual environment
python3 -m venv .venv

# Activate virtual environment
source .venv/bin/activate   # On macOS/Linux
# .venv\Scripts\activate    # On Windows

# Install dependencies
pip install -r requirements.txt

# Copy environment file
cp ../../.env.example .env

# Edit .env with your configuration (see Environment Variables section)
```
```bash
# Navigate to frontend (from project root)
cd apps/frontend

# Install dependencies
npm install
```

Edit `apps/backend/.env`:
```bash
# Database (PostgreSQL)
DATABASE_URL=postgresql+asyncpg://postgres:postgres123@localhost:5432/reddit_ops

# Required: Apify for problem discovery & multi-source validation
APIFY_API_TOKEN=apify_api_xxxxxxxxxxxxx

# Required: OpenRouter for AI analysis
OPENROUTER_API_KEY=sk-or-v1-xxxxxxxxxxxxx

# Optional: Email digests
RESEND_API_KEY=re_xxxxxxxxxxxxx
RESEND_FROM_EMAIL=insights@yourdomain.com
```

Terminal 1 - Backend:
```bash
cd apps/backend
./dev.sh
```

Terminal 2 - Frontend:

```bash
cd apps/frontend
./dev.sh
```

⚠️ Important: Always use `Ctrl+C` to stop servers. Never use `Ctrl+Z` (which suspends processes and causes issues).
Navigate to http://localhost:3000 in your browser.
| Variable | Required | Default | Description |
|---|---|---|---|
| `DATABASE_URL` | Yes | - | PostgreSQL connection string |
| `APIFY_API_TOKEN` | Yes | - | Apify token for discovery & multi-source |
| `OPENROUTER_API_KEY` | Yes | - | OpenRouter key for AI analysis |
| `RESEND_API_KEY` | No | - | Resend key for email digests |
| `RESEND_FROM_EMAIL` | No | - | Sender email address |
| `DEBUG` | No | `false` | Enable debug mode |
PostgreSQL (Recommended):

```bash
DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/reddit_ops
```

SQLite (Development only):

```bash
DATABASE_URL=sqlite+aiosqlite:///./data/reddit_ops.db
```
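Note the `+asyncpg` / `+aiosqlite` driver suffixes: the async backend needs an async driver, and a plain `postgresql://` URL will not work. If you want to sanity-check a URL before starting the server, a stdlib-only sketch (not part of the project) could look like this:

```python
from urllib.parse import urlsplit

# Async drivers used in the example URLs above.
ASYNC_DRIVERS = {"asyncpg", "aiosqlite"}

def uses_async_driver(database_url: str) -> bool:
    """Return True if the URL's scheme names an async driver (e.g. postgresql+asyncpg)."""
    scheme = urlsplit(database_url).scheme   # e.g. "postgresql+asyncpg"
    driver = scheme.partition("+")[2]        # "" when no "+driver" suffix is present
    return driver in ASYNC_DRIVERS

print(uses_async_driver("postgresql+asyncpg://user:password@localhost:5432/reddit_ops"))  # True
print(uses_async_driver("postgresql://user:password@localhost:5432/reddit_ops"))          # False
```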
| Endpoint | Method | Description |
|---|---|---|
| `/discovery/platforms` | GET | List available platforms |
| `/discovery/keywords` | GET | Get default search keywords |
| `/discovery/search` | POST | Run a new problem search |
| `/discovery/problems` | GET | List discovered problems |
| `/discovery/problems/{id}/analyze` | POST | AI-analyze a problem |
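For example, a search can be scripted against the API. This is a hedged sketch only: the payload field names (`keywords`, `platforms`, `limit`) are assumptions, so check the FastAPI interactive docs at `/docs` for the real request schema.

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"

def build_search_request(keywords: list[str], platforms: list[str], limit: int = 50):
    """Assemble a POST /discovery/search request (field names are illustrative)."""
    payload = {"keywords": keywords, "platforms": platforms, "limit": limit}
    return urllib.request.Request(
        f"{API_BASE}/discovery/search",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request(["time tracking frustration"], ["reddit", "hackernews"])
# With the backend running, send it with: urllib.request.urlopen(req)
print(req.full_url, req.get_method())
```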
| Endpoint | Method | Description |
|---|---|---|
| `/discovery/providers/status` | GET | Check data provider configuration |
| `/discovery/problems/{id}/enrich-multi` | POST | Fetch signals from Jobs, News, Social |
| `/discovery/signals/{id}` | GET | Get market signals for a problem |
| Endpoint | Method | Description |
|---|---|---|
| `/analytics/dashboard` | GET | Full dashboard data |
| `/analytics/platform-comparison` | GET | Platform trends |
| `/analytics/score-trends` | GET | Score history |
| Endpoint | Method | Description |
|---|---|---|
| `/collections` | GET/POST | List/create collections |
| `/exports/discovery/export` | POST | Export to CSV/JSON |
| `/exports/discovery/pdf` | GET | PDF-ready download |
```
reddit-ops-console/
├── apps/
│   ├── backend/                     # FastAPI Backend
│   │   ├── .venv/                   # Python virtual environment (create locally)
│   │   ├── main.py                  # App entry point
│   │   ├── requirements.txt         # Python dependencies
│   │   │
│   │   ├── backend_api/             # API Routes
│   │   │   └── routes/
│   │   │       ├── discovery.py     # Problem discovery + multi-source
│   │   │       ├── analytics.py     # Dashboard analytics
│   │   │       ├── collections.py
│   │   │       ├── digest.py
│   │   │       └── exports.py
│   │   │
│   │   ├── backend_core/            # Business logic
│   │   │   ├── apify_service.py     # Google Search scraping
│   │   │   ├── ai_enrichment.py     # OpenRouter AI integration
│   │   │   └── email_service.py     # Resend integration
│   │   │
│   │   ├── data_providers/          # Multi-source validation (NEW)
│   │   │   ├── base.py              # DataProviderManager
│   │   │   ├── apify_client.py      # Apify wrapper with caching
│   │   │   ├── jobs.py              # LinkedIn, Indeed, Adzuna
│   │   │   ├── news.py              # Google News, RSS, Product Hunt
│   │   │   └── social.py            # Twitter, YouTube, TikTok, Instagram
│   │   │
│   │   └── backend_db/              # Database models
│   │       ├── database.py          # Connection & pooling
│   │       ├── models.py            # Base models
│   │       └── discovery_models.py  # Problems, Insights, Signals
│   │
│   └── frontend/                    # Next.js Frontend
│       ├── app/                     # App Router pages
│       │   ├── page.tsx             # Dashboard
│       │   ├── discover/            # Problem discovery
│       │   ├── opportunities/       # Opportunity details + signals
│       │   ├── analytics/           # Analytics page
│       │   └── settings/            # Settings page
│       │
│       ├── components/              # React components
│       │   ├── charts/              # Recharts visualizations
│       │   ├── collections/         # Collection management
│       │   ├── signals/             # Market Signals UI (NEW)
│       │   └── layout/              # Navigation, sidebar
│       │
│       └── hooks/                   # Custom React hooks
│           └── useOpportunities.ts
│
├── .env.example                     # Environment template
├── docker-compose.yml               # Docker setup
└── README.md                        # This file
```
Tables are created automatically on first startup. The backend checks for existing tables and creates any missing ones.
```bash
# PostgreSQL
psql -U postgres -c "DROP DATABASE reddit_ops; CREATE DATABASE reddit_ops;"

# SQLite
rm apps/backend/data/reddit_ops.db

# Restart backend - tables recreate automatically
```

| Table | Description |
|---|---|
| `discovered_problems` | Found user problems |
| `problem_insights` | AI analysis results |
| `market_signals` | Multi-source validation signals (NEW) |
| `multi_source_validations` | Aggregated confidence scores (NEW) |
| `problem_collections` | User collections |
| `digest_settings` | Email digest preferences |
Cause: Virtual environment not activated or dependencies not installed.

Fix:

```bash
cd apps/backend
source .venv/bin/activate
pip install -r requirements.txt
.venv/bin/python3 -m uvicorn main:app --host 0.0.0.0 --port 8000
```

Cause: Using system Python instead of the virtual environment.
Fix:

```bash
cd apps/backend
# Use full path to venv Python
.venv/bin/python3 -m uvicorn main:app --host 0.0.0.0 --port 8000
```

Cause: PostgreSQL not running.
Fix:

```bash
# Using Docker
docker start reddit-ops-db

# Or check if it's running
docker ps | grep postgres
```

Fix:
```bash
lsof -ti :8000 | xargs kill -9
# Wait 3 seconds, then start again
```

Cause: Backend not running or on the wrong port.
Fix:

- Verify the backend is running: `curl http://localhost:8000/health`
- Check that the frontend `.env.local` has the correct API URL: `NEXT_PUBLIC_API_URL=http://localhost:8000`
Cause: Some Apify actors require paid rental.
Info: Actors that work on the free tier: Indeed Jobs, TikTok Trends, Instagram Posts. Google News and LinkedIn require a paid rental.
Cause: A process was suspended with `Ctrl+Z` instead of stopped with `Ctrl+C`.

Fix:

```bash
# From the project root, run the reset script
./project-reset.sh

# Then restart services
cd apps/backend && ./dev.sh
cd apps/frontend && ./dev.sh
```

Prevention: Always use `Ctrl+C` to stop servers. Never use `Ctrl+Z`.
1. Find a problem opportunity in the Discover page
2. Click to open the details sidebar
3. Scroll to the "Multi-Source Market Signals" section
4. Click "Fetch Market Signals"
5. The system queries 10 data providers via Apify
6. Results show a confidence score + signals by type
| Type | Providers | Signal |
|---|---|---|
| Jobs | LinkedIn, Indeed, Adzuna | Hiring demand = market need |
| News | Google News, RSS, Product Hunt | Media coverage = trend |
| Social | Twitter, YouTube, TikTok, Instagram | User discussions = pain |
Weighted algorithm:
- Jobs signals: 30%
- News signals: 25%
- Social signals: 25%
- Developer signals: 10%
- E-commerce signals: 10%
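The weighting above can be sketched as a small function. This is illustrative only: the real aggregation lives in the backend's `data_providers` package, and the re-normalization over missing sources shown here is an assumption about how unqueried providers are handled.

```python
# Weights mirror the percentages listed above; per-source signal
# strengths are assumed to be normalized to the 0-1 range.
WEIGHTS = {
    "jobs": 0.30,
    "news": 0.25,
    "social": 0.25,
    "developer": 0.10,
    "ecommerce": 0.10,
}

def confidence_score(signals: dict[str, float]) -> float:
    """Combine per-source signal strengths (0-1) into one weighted score.

    Sources with no data are skipped and the remaining weights are
    re-normalized, so a problem isn't penalized for unqueried providers.
    """
    available = {k: v for k, v in signals.items() if k in WEIGHTS}
    if not available:
        return 0.0
    total_weight = sum(WEIGHTS[k] for k in available)
    return sum(WEIGHTS[k] * v for k, v in available.items()) / total_weight

print(confidence_score({"jobs": 0.8, "news": 0.6, "social": 0.4}))  # 0.6125
```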
- Create account at https://apify.com
- Get API token from Settings β Integrations
- Cost: ~$0.50 / 1000 search results
- Some actors require monthly rental ($5-50/mo)
- Create account at https://openrouter.ai
- Add credits and get API key
- Uses Claude by default (~$0.003 per analysis)
- Create account at https://resend.com
- Get API key from dashboard
- Free tier: 100 emails/day
MIT License - Feel free to use, modify, and distribute.
Built with ❤️ for indie hackers and product builders.