Python Automation Mastery: Build Income While You Sleep
Complete 6-module course on building automated income streams with Python
Course Contents
Module 1: Python Fundamentals for Automation
Module 2: Web Scraping for Market Intelligence
Module 3: Automated Content Creation & Publishing
Module 4: Shopify Automation & E-Commerce
Module 5: Revenue Tracking & Optimization
Module 6: Scaling to $1,000/Month
Module 1: Python Fundamentals for Automation
Setting Up Your Environment
Install Python 3.11+ and create your first virtual environment:
python -m venv autohustle-env
source autohustle-env/bin/activate # Mac/Linux
autohustle-env\Scripts\activate # Windows
pip install requests httpx beautifulsoup4 schedule python-dotenv
Core Automation Concepts
1. Working with HTTP APIs
Most money-making opportunities involve APIs. Master these patterns:
- GET requests to fetch data (prices, trends, competitor info)
- POST requests to create content (publish blog posts, list products)
- Authentication (API keys, OAuth tokens, Bearer headers)
2. File Automation
- Reading/writing CSV files for bulk product imports
- JSON config files for API credentials
- Automating file organization and processing
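A minimal sketch of the file side using only the standard library. The file names and fields here are made up for illustration:

```python
import csv
import json
from pathlib import Path

# JSON config for credentials (placeholder key, never commit real ones)
Path("config.json").write_text(json.dumps({"api_key": "secret"}))
config = json.loads(Path("config.json").read_text())

# Bulk product export: write dicts to CSV, then read them back
products = [{"title": "Widget", "price": "9.99"}, {"title": "Gadget", "price": "19.99"}]
with open("products.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "price"])
    writer.writeheader()
    writer.writerows(products)

with open("products.csv", newline="") as f:
    rows = list(csv.DictReader(f))
```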
3. Scheduling Tasks
Run your automation 24/7 without manual intervention:
- schedule library for simple cron-like jobs
- Windows Task Scheduler for system-level scheduling
- GitHub Actions for cloud-based automation (free tier)
Your First Automation: Price Monitor
import httpx
import schedule
import time

def check_crypto_prices():
    r = httpx.get("https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd")
    btc_price = r.json()["bitcoin"]["usd"]
    print(f"BTC: ${btc_price:,.2f}")
    if btc_price < 40000:
        print("BUY SIGNAL!")

schedule.every(5).minutes.do(check_crypto_prices)

while True:
    schedule.run_pending()
    time.sleep(1)
Exercise: Extend this to monitor 10 coins and save alerts to a CSV file.
Module 2: Web Scraping for Market Intelligence
Ethical Scraping Principles
- Always check robots.txt before scraping
- Add delays between requests (2-5 seconds minimum)
- Use a descriptive User-Agent header
- Never scrape personal data or private information
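You can check robots.txt programmatically with the standard library before sending a single request. The rules below are an inline example; in practice, point set_url at the site's real robots.txt and call read():

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (inline for illustration)
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 5
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

allowed = rp.can_fetch("AutoHustle Research Bot 1.0", "https://example.com/products")
blocked = rp.can_fetch("AutoHustle Research Bot 1.0", "https://example.com/private/data")
```

can_fetch tells you whether a given user agent may request a given URL; crawl_delay("*") would return the site's requested delay (5 seconds here), which you should honor with time.sleep.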
BeautifulSoup Fundamentals
import httpx
from bs4 import BeautifulSoup
import time

def scrape_trending_products(url):
    headers = {"User-Agent": "AutoHustle Research Bot 1.0"}
    r = httpx.get(url, headers=headers, timeout=10)
    soup = BeautifulSoup(r.text, "html.parser")
    products = []
    for item in soup.select(".product-card"):
        title = item.select_one("h3")
        price = item.select_one(".price")
        if title and price:
            products.append({
                "title": title.text.strip(),
                "price": price.text.strip()
            })
    return products
Real-World Application: Amazon Best Sellers Monitor
Track trending products in any category to identify dropshipping opportunities:
- Scrape Amazon Best Sellers in your niche
- Calculate price spreads (retail - wholesale)
- Flag products with >40% margin potential
- Auto-import to Shopify via API
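The margin math from the steps above can be sketched like this. Here margin is computed as a share of the retail price (one common convention), and the product data is invented for illustration:

```python
def margin_pct(retail: float, wholesale: float) -> float:
    """Gross margin as a percentage of the retail price."""
    return (retail - wholesale) / retail * 100

def flag_opportunities(products: list[dict], min_margin: float = 40.0) -> list[dict]:
    """Keep only products whose margin clears the threshold."""
    return [p for p in products
            if margin_pct(p["retail"], p["wholesale"]) > min_margin]

items = [
    {"title": "LED strip", "retail": 24.99, "wholesale": 8.50},   # ~66% margin
    {"title": "Phone case", "retail": 12.99, "wholesale": 9.00},  # ~31% margin
]
winners = flag_opportunities(items)  # only the LED strip clears 40%
```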
CoinGecko Market Data (No Scraping Needed — Free API)
async def get_trending_cryptos():
    async with httpx.AsyncClient() as client:
        r = await client.get("https://api.coingecko.com/api/v3/search/trending")
        coins = r.json()["coins"][:7]
        return [{"name": c["item"]["name"], "rank": c["item"]["market_cap_rank"]}
                for c in coins]
Project: Build a daily market intelligence report that scrapes 3 trend sources and emails you a summary.
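One possible shape for the email step of this project, using only the standard library. The addresses and SMTP settings are placeholders:

```python
from email.message import EmailMessage

def build_report_email(summaries: dict[str, str], to_addr: str) -> EmailMessage:
    """Assemble the daily digest from {source_name: summary_text}."""
    msg = EmailMessage()
    msg["Subject"] = "Daily Market Intelligence"
    msg["From"] = "bot@example.com"      # placeholder sender
    msg["To"] = to_addr
    body = "\n\n".join(f"## {source}\n{text}" for source, text in summaries.items())
    msg.set_content(body)
    return msg

msg = build_report_email({"CoinGecko": "BTC trending again"}, "you@example.com")

# To actually send (assumes an SMTP account and an app password):
# import smtplib
# with smtplib.SMTP_SSL("smtp.gmail.com", 465) as s:
#     s.login("bot@example.com", "app-password")
#     s.send_message(msg)
```

Building the message separately from sending it lets you preview and test the report without an SMTP server.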
Module 3: Automated Content Creation & Publishing
GitHub Pages as a Free Publishing Platform
GitHub Pages gives you free web hosting. Combined with Python automation:
- Publish 100+ SEO articles per day for free
- Never pay for hosting
- Automatic HTTPS and CDN
Setting Up Automated Blog Publishing
from git import Repo
import os

def publish_article(title, content, repo_path="./my-blog"):
    # Convert title to URL slug
    slug = title.lower().replace(" ", "-")
    slug = "".join(c for c in slug if c.isalnum() or c == "-")
    # Write a minimal HTML page
    html = f'''<!DOCTYPE html>
<html>
<head><title>{title}</title></head>
<body>
<h1>{title}</h1>
{content}
</body>
</html>'''
    filepath = f"{repo_path}/blog/{slug}/index.html"
    os.makedirs(os.path.dirname(filepath), exist_ok=True)
    with open(filepath, "w") as f:
        f.write(html)
    # Commit and push
    repo = Repo(repo_path)
    repo.git.add(A=True)
    repo.index.commit(f"Add: {title}")
    repo.remotes.origin.push()
    print(f"Published: https://yourusername.github.io/my-blog/blog/{slug}/")
AI-Generated Content at Scale
Use free LLM APIs to generate articles:
- Groq API: 500,000 tokens/day free (ultra-fast Llama 3.3 70B)
- Google Gemini: 1,000 requests/day free
- Ollama: Run Llama locally, unlimited, free
import httpx

async def generate_article(topic, api_key):
    async with httpx.AsyncClient(timeout=60) as client:
        r = await client.post(
            "https://api.groq.com/openai/v1/chat/completions",
            headers={"Authorization": f"Bearer {api_key}"},
            json={
                "model": "llama-3.3-70b-versatile",
                "messages": [{"role": "user", "content": f"Write a 1000-word SEO article about: {topic}"}],
                "max_tokens": 2000
            }
        )
        return r.json()["choices"][0]["message"]["content"]
Project: Build a pipeline that generates 10 articles per day on trending topics and auto-publishes them to GitHub Pages with affiliate links.
Module 4: Shopify Automation & E-Commerce
Shopify Admin API Basics
The Shopify Admin REST API v2025-01 gives you full control over your store:
import httpx

class ShopifyAPI:
    def __init__(self, shop_url, access_token):
        self.base = f"https://{shop_url}/admin/api/2025-01"
        self.headers = {
            "X-Shopify-Access-Token": access_token,
            "Content-Type": "application/json"
        }

    async def create_product(self, title, description, price, images=None):
        async with httpx.AsyncClient() as client:
            r = await client.post(
                f"{self.base}/products.json",
                headers=self.headers,
                json={"product": {
                    "title": title,
                    "body_html": description,
                    "variants": [{"price": str(price), "inventory_quantity": 999}],
                    "images": [{"src": img} for img in images or []],
                    "status": "active"
                }}
            )
            return r.json()["product"]
CJ Dropshipping Integration
CJ Dropshipping provides free access to 500,000+ products:
- Register at cjdropshipping.com (free)
- Get API key from dashboard
- Search products by keyword
- Auto-import to Shopify with markup
async def import_cj_product(cj_product_id, markup_multiplier=2.5):
    # Fetch product from CJ
    cj_data = await get_cj_product(cj_product_id)
    cost = float(cj_data["salePrice"])
    # Create in Shopify at the configured markup (2.5x by default)
    shopify_product = await shopify.create_product(
        title=cj_data["productName"],
        description=cj_data["description"],
        price=round(cost * markup_multiplier, 2),
        images=cj_data["imageList"][:5]
    )
    return shopify_product
Automated Order Fulfillment
When a customer pays → auto-place order with supplier:
- Webhook triggers when order is paid
- Your code places order with CJ/Printful
- Supplier ships directly to customer
- You pocket the difference
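Before acting on any webhook, verify it really came from Shopify. Shopify signs each webhook body with HMAC-SHA256 using your app's shared secret, base64-encoded in the X-Shopify-Hmac-Sha256 header. A minimal check:

```python
import base64
import hashlib
import hmac

def verify_shopify_webhook(raw_body: bytes, header_hmac: str, secret: str) -> bool:
    """Recompute the signature over the raw request body and compare
    in constant time; reject anything that doesn't match."""
    digest = hmac.new(secret.encode(), raw_body, hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, header_hmac)
```

Only after this returns True should your handler parse the order JSON and place the supplier order. Verify against the raw bytes, not re-serialized JSON, or the signatures won't match.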
Project: Set up a Shopify store that auto-imports 5 new products daily and auto-fulfills orders.
Module 5: Revenue Tracking & Optimization
Setting Up a Revenue Dashboard
Track all income streams in one place with SQLite:
import sqlite3

def setup_revenue_db():
    conn = sqlite3.connect("revenue.db")
    conn.execute('''
        CREATE TABLE IF NOT EXISTS revenue (
            id INTEGER PRIMARY KEY,
            source TEXT,
            amount REAL,
            description TEXT,
            recorded_at TEXT DEFAULT CURRENT_TIMESTAMP
        )
    ''')
    conn.commit()
    return conn

def record_revenue(source, amount, description=""):
    conn = sqlite3.connect("revenue.db")
    conn.execute("INSERT INTO revenue (source, amount, description) VALUES (?, ?, ?)",
                 (source, amount, description))
    conn.commit()
    conn.close()
    print(f"Revenue: +${amount:.2f} from {source}")
A/B Testing for Content
Test which article titles, product descriptions, and CTAs perform best:
- Track click-through rates from blog to store
- Test 2 versions of product descriptions
- Auto-promote the winner after 48 hours
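The "auto-promote the winner" step reduces to comparing click-through rates. A minimal sketch with invented numbers (in production you would also want enough impressions for the difference to be meaningful, not just 48 hours on the clock):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate; zero impressions means zero CTR."""
    return clicks / impressions if impressions else 0.0

def pick_winner(variants: dict[str, dict]) -> str:
    """Return the variant name with the highest click-through rate."""
    return max(variants, key=lambda name: ctr(variants[name]["clicks"],
                                              variants[name]["impressions"]))

results = {
    "A": {"clicks": 42, "impressions": 1000},   # 4.2% CTR
    "B": {"clicks": 61, "impressions": 1100},   # ~5.5% CTR
}
winner = pick_winner(results)
```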
Affiliate Link Optimization
Amazon Associates pays 1-10% commission. Maximize it:
- Place affiliate links in highest-traffic positions
- Track which articles generate most clicks
- Update low-performing links with better alternatives
def add_affiliate_links(content, associate_tag):
    # Auto-replace product mentions with affiliate links
    products_to_link = {
        "laptop": f"https://www.amazon.com/s?k=laptop&tag={associate_tag}",
        "headphones": f"https://www.amazon.com/s?k=headphones&tag={associate_tag}",
    }
    for product, link in products_to_link.items():
        content = content.replace(product, f'<a href="{link}">{product}</a>')
    return content
Project: Build a daily email report showing revenue by source, top articles, and week-over-week growth.
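The revenue-by-source part of this project is a single query against the revenue table defined in this module:

```python
import sqlite3

def revenue_by_source(db_path="revenue.db"):
    """Total revenue per income stream, highest first."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT source, SUM(amount) FROM revenue GROUP BY source ORDER BY 2 DESC"
    ).fetchall()
    conn.close()
    return rows  # list of (source, total) tuples
```

Week-over-week growth is the same idea with a WHERE clause on recorded_at comparing the last 7 days to the 7 days before that.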
Module 6: Scaling to $1,000/Month
The $1K/Month Formula
Based on real AutoHustle data, here's the proven path:
Month 1 ($0-$100): Foundation
- Set up GitHub Pages blog (0 cost)
- Publish 60 SEO articles (automated, free)
- Add Amazon affiliate links
- Expected: 10 sales × $10 avg = $100
Month 2 ($100-$400): Products
- Launch 2 digital products on Stripe ($29-$57 each)
- Use your blog traffic to sell products
- Add email list (ConvertKit free tier, 1000 subscribers)
- Expected: $100 affiliate + $300 products = $400
Month 3 ($400-$1,000): Scale
- Open Shopify store with dropshipping
- Automate 5 products/day with CJ Dropshipping
- Run Google Shopping ads ($50/month budget)
- Expected: $400 content + $400 store + $200 misc = $1,000+
Automation Stack (All Free)
- GitHub Pages — hosting and publishing (Module 3)
- GitHub Actions — cloud scheduling (Module 1)
- Groq / Google Gemini / Ollama — content generation (Module 3)
- httpx + BeautifulSoup — APIs and scraping (Modules 1-2)
- CJ Dropshipping API — product sourcing (Module 4)
- SQLite — revenue tracking (Module 5)
Your 30-Day Action Plan
Week 1: Set up all accounts, publish 20 articles
Week 2: Add affiliate links, set up email capture
Week 3: Launch first digital product on Stripe
Week 4: Open Shopify store, import first 10 products
The Key: Automate everything. The goal is to build once and earn forever.
Your Python scripts should be running 24/7 generating revenue while you sleep.
Congratulations on completing Python Automation Mastery!
Questions? Reply to your purchase confirmation email.
Get All AutoHustle Products
Access every course, guide, and tool in one membership.