# Hacker News Commands

## Overview

The `netapi hn` commands provide CLI access to Hacker News via two APIs. All commands support `-f json` for jq piping.

- Firebase API (`hacker-news.firebaseio.com/v0/`) — official, real-time, item-by-item access
- Algolia API (`hn.algolia.com/api/v1/`) — full-text search, faster for bulk queries

## Prerequisites

No authentication required. Both APIs are public and free with no documented rate limits. Be respectful with request frequency.

```sh
# No tokens or keys needed
netapi hn top
```
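Since no rate limits are documented, a conservative pattern when resolving many items against the raw Firebase endpoints is to pause between requests. A minimal sketch (the one-second pause is an arbitrary choice, not an API requirement):

```sh
#!/bin/sh
# Resolve the top 5 story IDs one at a time, sleeping between requests
# to keep the request frequency polite.
for id in $(curl -s https://hacker-news.firebaseio.com/v0/topstories.json | jq -r '.[0:5][]'); do
  curl -s "https://hacker-news.firebaseio.com/v0/item/${id}.json" \
    | jq -r '"\(.score)\t\(.title)"'
  sleep 1  # polite pacing; adjust to taste
done
```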
## Commands (13 total)

| Command | Description |
|---|---|
| `netapi hn top` | Top stories (default 30) |
| `netapi hn new` | Newest stories |
| `netapi hn best` | Best stories |
| `netapi hn ask` | Ask HN posts |
| `netapi hn show` | Show HN posts |
| `netapi hn jobs` | Job postings |
| `netapi hn story <id>` | Full story with metadata |
| `netapi hn comments <id>` | Comment tree (configurable depth) |
| `netapi hn user <name>` | User profile and karma |
| `netapi hn search <query>` | Full-text search (Algolia) |
| `netapi hn search --sort date` | Search sorted by date |
| `netapi hn search --tag <type>` | Filter by type: story, comment, poll, show_hn, ask_hn |
| `netapi hn api <path>` | Raw Firebase API access |
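The subcommands compose through `jq` and command substitution. For example (a sketch, assuming the output shapes shown in the sections below), pulling the comment authors for the current top story:

```sh
# Grab the ID of the current #1 story, then list who commented on it.
id=$(netapi hn top --limit 1 -f json | jq -r '.[0].id')
netapi hn comments "$id" --depth 1 -f json | jq -r '.[].by'
```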
## JSON Output

All commands support `-f json` for machine-readable output:

```sh
netapi hn top --limit 30 -f json
netapi hn story 12345678 -f json
netapi hn search "rust programming" -f json
netapi hn user dang -f json
```
## jq Patterns

### Exploring Structure

```sh
# See all keys in first story object
netapi hn top -f json | jq '.[0] | keys'

# Keys with types
netapi hn top -f json | jq '.[0] | to_entries | .[] | "\(.key): \(.value | type)"'

# Full first object
netapi hn top -f json | jq '.[0]'
```
### Top Stories

```sh
# Titles only
netapi hn top -f json | jq -r '.[].title'

# Title, score, comment count
netapi hn top -f json | jq -r '.[] | "\(.title) (\(.score) pts, \(.descendants) comments)"'

# Compact table: rank, score, title
netapi hn top -f json | jq -r 'to_entries | .[] | "\(.key + 1). [\(.value.score)] \(.value.title)"'

# Top 10 by score
netapi hn top --limit 30 -f json | jq -r 'sort_by(-.score) | .[0:10] | .[] | "\(.score) - \(.title)"'

# Stories with URLs only (skip Ask HN, polls)
netapi hn top -f json | jq '.[] | select(.url) | {title, url, score}'

# Time since posting (epoch to relative)
netapi hn top -f json | jq --arg now "$(date +%s)" \
  '.[] | {title, hours_ago: ((($now | tonumber) - .time) / 3600 | floor)}'
```
### New and Best Stories

```sh
# Newest stories with scores
netapi hn new --limit 20 -f json | jq -r '.[] | "[\(.score)] \(.title)"'

# Best stories -- high signal, sorted by comment activity
netapi hn best --limit 20 -f json | jq 'sort_by(-.descendants) | .[] | {title, score, comments: .descendants}'

# Ask HN with comment counts
netapi hn ask --limit 10 -f json | jq -r '.[] | "#\(.id): \(.title) (\(.descendants) comments)"'

# Show HN -- extract project URLs
netapi hn show --limit 10 -f json | jq -r '.[] | select(.url) | "\(.title)\n  \(.url)\n"'

# Job postings -- titles only
netapi hn jobs --limit 10 -f json | jq -r '.[] | .title'
```
### Single Story

```sh
# Full story details
netapi hn story 12345678 -f json | jq '.'

# Summary object
netapi hn story 12345678 -f json | jq '{
  title,
  author: .by,
  score,
  comments: .descendants,
  url,
  time: (.time | todate),
  type
}'

# Just the URL
netapi hn story 12345678 -f json | jq -r '.url'
```
### Comment Trees

```sh
# Flat list of comment texts (skip deleted comments, which have no text)
netapi hn comments 12345678 --depth 3 -f json | jq -r '.[] | .text // empty'

# Comments with author and nesting level
netapi hn comments 12345678 --depth 3 -f json | jq -r '.[] | "\(" " * .level // "")\(.by): \(.text // "" | .[0:120])"'

# Top-level comments only
netapi hn comments 12345678 --depth 1 -f json | jq '.[] | select(.level == 0) | {by, text}'

# Count comments per author
netapi hn comments 12345678 --depth 3 -f json | jq '[.[].by] | group_by(.) | map({author: .[0], count: length}) | sort_by(-.count)'

# Extract URLs mentioned in comments
netapi hn comments 12345678 --depth 3 -f json | jq -r '.[].text // empty' | grep -oP 'https?://[^\s<"]+' | sort -u
```
### User Profiles

```sh
# User details
netapi hn user dang -f json | jq '.'

# Summary
netapi hn user dang -f json | jq '{
  id,
  karma,
  created: (.created | todate),
  about
}'

# Karma only
netapi hn user dang -f json | jq '.karma'
```
### Search (Algolia)

```sh
# Basic search
netapi hn search "rust async" -f json | jq -r '.hits[] | "\(.title) (\(.points) pts)"'

# Search sorted by date
netapi hn search "kubernetes" --sort date -f json | jq -r '.hits[] | "\(.created_at[0:10]): \(.title)"'

# Filter to stories only
netapi hn search "go generics" --tag story -f json | jq '.hits[] | {title, points, num_comments}'

# Filter to Ask HN
netapi hn search "career advice" --tag ask_hn -f json | jq -r '.hits[] | .title'

# Search with points threshold
netapi hn search "machine learning" -f json | jq '.hits | map(select(.points > 100)) | .[] | {title, points}'

# Highlight matches -- Algolia returns _highlightResult
netapi hn search "network automation" -f json | jq -r '.hits[] | "\(.title)\n  Matched: \(._highlightResult.title.value)\n"'
```
### Stories by Domain

```sh
# Group top stories by domain
netapi hn top --limit 30 -f json | jq -r '[.[] | select(.url) | {title, domain: (.url | split("/")[2])}] | group_by(.domain) | map({domain: .[0].domain, count: length}) | sort_by(-.count) | .[] | "\(.count) \(.domain)"'

# Find stories linking to a specific domain
netapi hn top --limit 30 -f json | jq '.[] | select(.url // "" | test("github\\.com")) | {title, url, score}'

# All unique domains in current top stories
netapi hn top --limit 30 -f json | jq -r '[.[] | .url // empty | split("/")[2]] | unique | .[]'
```
### Filtering by Score

```sh
# Stories above 200 points
netapi hn top --limit 30 -f json | jq '.[] | select(.score > 200) | {title, score}'

# Stories with high engagement (comments > score * 0.5)
netapi hn top --limit 30 -f json | jq '.[] | select(.descendants > (.score * 0.5)) | {title, score, comments: .descendants}'

# Bottom quartile -- quiet stories that made front page
netapi hn top --limit 30 -f json | jq -r 'sort_by(.score) | .[0:8] | .[] | "\(.score) pts: \(.title)"'
```
### Time-Based Filtering

```sh
# Stories from last 24 hours
netapi hn new --limit 20 -f json | jq --arg cutoff "$(date -d '24 hours ago' +%s)" \
  '.[] | select(.time > ($cutoff | tonumber)) | {title, score, hours_ago: ((now - .time) / 3600 | floor)}'

# Search results from last week (created_at_i is the epoch-seconds field)
netapi hn search "python" --sort date -f json | jq --arg cutoff "$(date -d '7 days ago' +%s)" \
  '.hits | map(select(.created_at_i > ($cutoff | tonumber))) | .[] | {title, created: .created_at[0:10], points}'
```
### Count by Author

```sh
# Stories by author in current top 30
netapi hn top --limit 30 -f json | jq -r '[.[].by] | group_by(.) | map({author: .[0], count: length}) | sort_by(-.count) | .[] | "\(.count) stories: \(.author)"'

# Prolific posters in search results
netapi hn search "devops" -f json | jq '[.hits[].author] | group_by(.) | map({author: .[0], count: length}) | sort_by(-.count) | .[0:10]'
```
### Raw API Access

```sh
# Any Firebase endpoint
netapi hn api /v0/maxitem.json | jq '.'

# Specific item by ID
netapi hn api /v0/item/12345678.json | jq '.'

# Current top story IDs (raw)
netapi hn api /v0/topstories.json | jq '.[0:10]'

# User by ID
netapi hn api /v0/user/pg.json | jq '{id, karma, created: (.created | todate)}'

# Changed items and profiles (polling)
netapi hn api /v0/updates.json | jq '{items: (.items | length), profiles: (.profiles | length)}'
```
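The updates feed can drive a lightweight change watcher. A sketch that polls every 30 seconds (the interval is an arbitrary choice) and prints a one-line summary per poll:

```sh
#!/bin/sh
# Poll /v0/updates.json and report how many items and profiles changed
# since the last check. Ctrl-C to stop.
while true; do
  netapi hn api /v0/updates.json \
    | jq -r '"\(.items | length) items, \(.profiles | length) profiles changed"'
  sleep 30
done
```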
## Environment Variables

| Variable | Description |
|---|---|
| None required | Both APIs are public, no authentication needed |
|  | Override Algolia API base URL (optional, default: `hn.algolia.com/api/v1/`) |
## Useful jq Recipes

### Daily Digest — Markdown Reading List

```sh
netapi hn top -f json | jq -r '.[] | "- [\(.title)](\(.url // "https://news.ycombinator.com/item?id=\(.id)")) (\(.score) pts, \(.descendants) comments)"'
```

### Export to CSV

```sh
netapi hn top --limit 30 -f json | jq -r '.[] | [.title, .score, .descendants, .by, .url] | @csv'
```

### Create an Aligned Text Table

```sh
netapi hn top --limit 30 -f json | jq -r '["Title", "Score", "Comments"], (.[] | [.title, .score, .descendants]) | @tsv' | column -t -s$'\t'
```

### Bookmark Export (title + URL pairs)

```sh
# Markdown links
netapi hn top -f json | jq -r '.[] | select(.url) | "- [\(.title)](\(.url))"'

# Org-mode links
netapi hn top -f json | jq -r '.[] | select(.url) | "- [[\(.url)][\(.title)]]"'

# Plain text
netapi hn top -f json | jq -r '.[] | select(.url) | "\(.title)\n  \(.url)\n"'
```

### Search and Save to File

```sh
# Save high-quality results as JSON
netapi hn search "distributed systems" -f json | jq '[.hits[] | select(.points > 50) | {title, url, points, created: .created_at[0:10]}]' > hn-distrib-systems.json

# Append daily top stories to a log (-c keeps one JSON object per line)
netapi hn top --limit 10 -f json | jq -c --arg date "$(date +%Y-%m-%d)" '{date: $date, stories: [.[] | {title, score, url}]}' >> hn-daily.jsonl
```
### Cross-Reference Authors

```sh
# Get a user's recent submissions via search
netapi hn search "" --tag story -f json | jq '.hits[] | select(.author == "dang") | {title, points, created: .created_at[0:10]}'

# Compare karma between users
for user in dang pg tptacek; do
  echo -n "$user: "
  netapi hn user "$user" -f json | jq -r '.karma'
done
```
### Pipeline with Other Tools

```sh
# Open top story in browser
netapi hn top --limit 1 -f json | jq -r '.[0].url' | xargs xdg-open

# Count words in Ask HN titles
netapi hn ask --limit 20 -f json | jq -r '.[].title' | tr ' ' '\n' | sort | uniq -c | sort -rn | head -20

# Feed top stories into fzf for interactive selection
netapi hn top -f json | jq -r '.[] | "\(.id)\t\(.title)"' | fzf --delimiter='\t' --with-nth=2 | cut -f1 | xargs -I{} netapi hn story {} -f json | jq '.'
```
## curl Equivalents (Works Now)

These raw curl + jq patterns work today without the netapi CLI. Use them until `netapi hn` is implemented.

### Firebase API (Real-Time, Item-by-Item)

```sh
# Top story IDs (returns array of up to 500 IDs)
curl -s https://hacker-news.firebaseio.com/v0/topstories.json | jq '.[0:30]'

# New stories
curl -s https://hacker-news.firebaseio.com/v0/newstories.json | jq '.[0:20]'

# Best stories
curl -s https://hacker-news.firebaseio.com/v0/beststories.json | jq '.[0:20]'

# Ask HN
curl -s https://hacker-news.firebaseio.com/v0/askstories.json | jq '.[0:10]'

# Show HN
curl -s https://hacker-news.firebaseio.com/v0/showstories.json | jq '.[0:10]'

# Job postings
curl -s https://hacker-news.firebaseio.com/v0/jobstories.json | jq '.[0:10]'

# Single item (story, comment, poll, job)
curl -s https://hacker-news.firebaseio.com/v0/item/1.json | jq .

# User profile
curl -s https://hacker-news.firebaseio.com/v0/user/pg.json | jq .
```
### Fetch Top Stories with Details

The Firebase API returns IDs only — you need a second call per story. This shell loop resolves them:

```sh
# Top 10 stories with titles, scores, and URLs
curl -s https://hacker-news.firebaseio.com/v0/topstories.json \
  | jq -r '.[0:10][]' \
  | xargs -I{} curl -s "https://hacker-news.firebaseio.com/v0/item/{}.json" \
  | jq -r '"\(.score)\t\(.title)\n\t\(.url // "Ask HN")\n\t\(.descendants) comments by \(.by)\n"'

# Parallel fetch (4 concurrent requests)
curl -s https://hacker-news.firebaseio.com/v0/topstories.json \
  | jq -r '.[0:20][]' \
  | xargs -P4 -I{} curl -s "https://hacker-news.firebaseio.com/v0/item/{}.json" \
  | jq -rs 'sort_by(-.score) | .[] | "\(.score)\t\(.title)"'
```
### Algolia API (Search, Bulk, Fast)

```sh
# Full-text search (sorted by relevance)
curl -s "https://hn.algolia.com/api/v1/search?query=rust+async&tags=story" \
  | jq '.hits[:5] | .[] | {title, url, points}'

# Search sorted by date
curl -s "https://hn.algolia.com/api/v1/search_by_date?query=cisco+ise&tags=story" \
  | jq '.hits[:5] | .[] | {title, url, points, created_at}'

# Filter by type: story, comment, poll, show_hn, ask_hn
curl -s "https://hn.algolia.com/api/v1/search?query=python&tags=show_hn" \
  | jq '.hits[:5] | .[] | {title, url, points}'

# Search with minimum points threshold
curl -s "https://hn.algolia.com/api/v1/search?query=network+automation&tags=story&numericFilters=points%3E100" \
  | jq '.hits[] | {title, url, points}'

# Stories from a specific domain (search the URL field only)
curl -s "https://hn.algolia.com/api/v1/search?query=github.com&tags=story&restrictSearchableAttributes=url" \
  | jq '.hits[:10] | .[] | {title, url, points}'

# Stories from last 24 hours
curl -s "https://hn.algolia.com/api/v1/search_by_date?tags=story&numericFilters=created_at_i%3E$(date -d '24 hours ago' +%s)" \
  | jq '.hits[:10] | .[] | {title, points, url}'
```
### Comment Trees (curl)

```sh
# Get a story's kid IDs, then fetch each comment.
# .kids[]? tolerates stories with no comments; .text // "" tolerates
# deleted comments, which have no text field.
STORY_ID=1
curl -s "https://hacker-news.firebaseio.com/v0/item/${STORY_ID}.json" \
  | jq -r '.kids[]?' \
  | xargs -P4 -I{} curl -s "https://hacker-news.firebaseio.com/v0/item/{}.json" \
  | jq -s '.[] | {by, text: (.text // "" | gsub("<[^>]+>"; "") | .[0:200])}'
```
### Daily Digest (curl)

```sh
# Markdown reading list from top stories
curl -s https://hacker-news.firebaseio.com/v0/topstories.json \
  | jq -r '.[0:15][]' \
  | xargs -P4 -I{} curl -s "https://hacker-news.firebaseio.com/v0/item/{}.json" \
  | jq -rs 'sort_by(-.score) | .[] | "- [\(.title)](\(.url // "https://news.ycombinator.com/item?id=\(.id)")) (\(.score) pts, \(.descendants) comments)"'
```