Python Flask API for Scraping Temu Product Data
Description
Budget: $30 - $250
I’m looking for an experienced developer to build a lightweight Python Flask API that can scrape detailed product data from Temu.com.
Core Functionality:
- Build a REST API endpoint that accepts one or more Temu product URLs (see the sketch after this list)
- Fetch and parse the product page to extract all visible data, including: product name, full description, price, availability, ratings, images, variations, and any other relevant fields
- Return the scraped data as clean, human-readable (pretty-printed) JSON in the API response
- Persist the same JSON data into a local SQLite database with a timestamp for future querying
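To illustrate the expected shape, here is a minimal sketch of such an endpoint, not a required implementation. The database filename, table layout, and stub fields are assumptions; the actual parsing logic depends on Temu's markup and is only a placeholder here:

```python
# Minimal sketch: one route, pretty-printed JSON response, SQLite persistence.
# Temu-specific parsing is a placeholder and would need real selectors.
import json
import sqlite3
from datetime import datetime, timezone

import requests
from flask import Flask, Response, request

app = Flask(__name__)
DB_PATH = "products.db"  # assumed filename


def init_db() -> None:
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS scrapes ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "url TEXT NOT NULL, "
            "data TEXT NOT NULL, "       # full product JSON as text
            "scraped_at TEXT NOT NULL)"  # ISO-8601 timestamp
        )


def scrape_product(url: str) -> dict:
    """Placeholder: fetch the page and extract the visible fields.

    Real extraction depends on Temu's markup and is not shown here.
    """
    resp = requests.get(url, timeout=15)
    resp.raise_for_status()
    return {"url": url, "name": None, "price": None}  # stub fields only


@app.route("/scrape")
def scrape():
    url = request.args.get("url")
    if not url:
        return Response('{"error": "missing url parameter"}',
                        status=400, mimetype="application/json")
    data = scrape_product(url)
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "INSERT INTO scrapes (url, data, scraped_at) VALUES (?, ?, ?)",
            (url, json.dumps(data), datetime.now(timezone.utc).isoformat()),
        )
    # indent=2 produces the human-readable (pretty-printed) JSON the brief asks for
    return Response(json.dumps(data, indent=2), mimetype="application/json")


if __name__ == "__main__":
    init_db()
    app.run(debug=True)  # satisfies "runs with python app.py"
```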
Midway Requirement: To ensure you have read the full description, include the word "TEMUS" in your proposal. Bids without this will be rejected.
Technical Requirements:
- Python 3.x with Flask (FastAPI is acceptable if strongly preferred)
- Clean architecture with separation between scraping logic, data models, and API routes
- Respectful scraping practices: rate limiting, retries, and graceful handling of blocks (a sketch follows this list)
- SQLAlchemy (or raw SQL) with a simple schema, including migration setup
- Clean, modular, and well-documented code
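As a sketch of the "respectful scraping" requirement, the helper below combines a minimum delay between requests with retries and exponential backoff. The delay value, retry budget, and choice of status codes to back off on are illustrative assumptions, not Temu-specific guidance:

```python
# Illustrative rate-limited fetcher with retries; all tunables are assumptions.
import random
import time
from typing import Optional

import requests

MIN_DELAY_SECONDS = 2.0  # assumed polite delay between requests
MAX_RETRIES = 3          # assumed retry budget per URL

_last_request_at = 0.0


def polite_get(url: str, session: Optional[requests.Session] = None) -> requests.Response:
    """Fetch a URL, enforcing a minimum delay between calls and retrying on failure."""
    global _last_request_at
    session = session or requests.Session()
    for attempt in range(1, MAX_RETRIES + 1):
        # Rate limiting: wait until at least MIN_DELAY_SECONDS since the last call.
        wait = MIN_DELAY_SECONDS - (time.monotonic() - _last_request_at)
        if wait > 0:
            time.sleep(wait)
        _last_request_at = time.monotonic()
        try:
            resp = session.get(url, timeout=15)
            if resp.status_code in (403, 429):
                # Likely blocked or throttled: back off exponentially with jitter.
                time.sleep(2 ** attempt + random.random())
                continue
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            if attempt == MAX_RETRIES:
                raise
            time.sleep(2 ** attempt)
    raise RuntimeError(f"giving up on {url} after {MAX_RETRIES} attempts")
```

For the SQLAlchemy requirement, a minimal model mirroring the earlier raw-SQL schema might look like this (table and column names are assumptions; a real project would swap `create_all` for Alembic migrations):

```python
# Minimal SQLAlchemy model sketch; names mirror the earlier raw-SQL schema.
from datetime import datetime, timezone

from sqlalchemy import Column, DateTime, Integer, Text, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Scrape(Base):
    __tablename__ = "scrapes"

    id = Column(Integer, primary_key=True)
    url = Column(Text, nullable=False)
    data = Column(Text, nullable=False)  # product JSON serialized as text
    scraped_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))


engine = create_engine("sqlite:///products.db")
Base.metadata.create_all(engine)  # a migration tool (e.g. Alembic) would replace this
```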
Acceptance Criteria:
- Calling /scrape?url=... returns complete product data in formatted JSON (a smoke-test sketch follows this list)
- The same data is successfully stored as a new row in SQLite
- The project runs with python app.py without extra configuration
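One way to check these criteria end to end is a small smoke test like the one below. It is hypothetical: the local port, database filename, and table name all assume the sketches above, and the product URL is a placeholder:

```python
# Hypothetical smoke test for the acceptance criteria; port, DB filename, and
# table name assume the sketches above. Equivalent one-line cURL check:
#   curl "http://127.0.0.1:5000/scrape?url=https://www.temu.com/<product-page>"
import sqlite3

import requests

resp = requests.get(
    "http://127.0.0.1:5000/scrape",
    params={"url": "https://www.temu.com/example-product.html"},  # placeholder URL
    timeout=30,
)
assert resp.status_code == 200, resp.text
print(resp.text)  # should be pretty-printed JSON

with sqlite3.connect("products.db") as conn:
    count = conn.execute("SELECT COUNT(*) FROM scrapes").fetchone()[0]
print(f"rows stored: {count}")  # should have grown by one after the call
```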
Additional Notes:
- Preference will be given to developers who have prior experience scraping Temu and can share a working demo via a live URL
- Include a brief README with setup steps, environment variables, and a simple one-line cURL test
- No dashboard, analytics, or external storage is required at this stage
- The focus is on accurate data extraction and clean, maintainable code
Important: Automated bids will be rejected. Only serious and relevant proposals will be considered.
Looking forward to working with skilled developers!