Category:CypherTech: Difference between revisions
Revision as of 01:21, 26 September 2023
== Resilience ==

== Sync ==

=== Pull Discussions ===
This has to be done separately because the remote harvester host will not have the full discussion archive; the remote keeps only the current and previous month, which can be pulled with --delete.
<pre>
cd /opt/cypherpunk/data/reddit/json
find discussion -name '2023-09' -type d | awk '{print "rsync -avz --size-only --delete www.iterativechaos.com:/opt/cypherpunk/data/reddit/json/"$1"/ ./"$1"/"}'
</pre>
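The find | awk pipeline above only prints the rsync commands. An equivalent Python sketch that builds the same command lines (the helper name is illustrative; host and paths are taken from this section):

```python
from pathlib import Path

REMOTE = "www.iterativechaos.com:/opt/cypherpunk/data/reddit/json"

def rsync_commands(root=".", month="2023-09"):
    """Mirror the find | awk pipeline: locate month directories under
    discussion/ and emit one rsync command line per directory. This only
    builds the strings; pipe them to a shell (or subprocess.run) to execute."""
    return [
        f"rsync -avz --size-only --delete {REMOTE}/{rel}/ ./{rel}/"
        for rel in sorted(
            str(p.relative_to(root))
            for p in Path(root).glob(f"discussion/**/{month}")
            if p.is_dir()
        )
    ]
```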
=== Pull All Other JSON ===
<pre>
rsync -avz --size-only --delete www.iterativechaos.com:/opt/cypherpunk/data/reddit/json/link_list/ /opt/cypherpunk/data/reddit/json/link_list/
rsync -avz --size-only --delete www.iterativechaos.com:/opt/cypherpunk/data/reddit/json/archive_link_list/ /opt/cypherpunk/data/reddit/json/archive_link_list/
</pre>
=== S3 Backup ===

==== JSON, Daily ====
<pre>
time aws s3 sync --size-only --delete /opt/cypherpunk/data/reddit/json/ s3://iterative-chaos/cyphernews/harvest/reddit/json/
</pre>
==== Parquet Compacted, Weekly ====
<pre>
time aws s3 sync --size-only --delete /opt/cypherpunk/data/reddit/parquet/compacted_raw/ s3://iterative-chaos/cyphernews/harvest/reddit/parquet/compacted_raw/
</pre>
== GPT ==
<pre>
openai.error.RateLimitError: Rate limit reached for 10KTPM-200RPM in organization org-CleAC9cITP7hCx44ZiB2gw5d on tokens per min. Limit: 10000 / min. Please try again in 6ms. Contact us through our help center at help.openai.com if you continue to have issues.
</pre>
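Rate-limit errors like this are usually handled by retrying with exponential backoff. A minimal generic sketch (the helper name and delays are illustrative; in this pipeline it would wrap the OpenAI call and catch openai.error.RateLimitError):

```python
import random
import time

def with_backoff(fn, retryable, max_tries=5, base_delay=1.0):
    """Call fn(), retrying on the given exception types with exponential
    backoff plus jitter. Re-raises after the final failed attempt."""
    for attempt in range(max_tries):
        try:
            return fn()
        except retryable:
            if attempt == max_tries - 1:
                raise
            # Sleep base_delay * 2^attempt, plus jitter proportional to
            # base_delay to avoid synchronized retries.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Usage would be, for example, `with_backoff(lambda: call_gpt(prompt), (openai.error.RateLimitError,))`.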
== Data Processing Process ==
* $ python get_link_lists.py # daily
** Needs data archival / backup.
*** Doing a brute-force full backup right now; might be sufficient for the time being.
**** $ scp -i key.pem reddit-link-list.tar.bz2 admin@www.iterativechaos.com:./
*** Can do just week/day going forward, but that will quickly get slow.
*** Storing the files as .json.bz2 would be a big improvement.
* $ python parquet_link_list.py
** (done) Make this iterate and do a full refresh each time.
** Reconsider full refresh when processing time goes over a minute; currently runs in 12 seconds (not sure whether that is real or user time).
** Old version:
*** $ python parquet_link_list.py ../data/reddit/link_list/day/science/ ../data/reddit/parquet/link_list/day/science/
* $ python dedupe_link_list_parquet.py
** (done) Make this iterate and do a full refresh each time.
** Reconsider full refresh when processing time goes over a minute; currently runs in 2 seconds (real).
* $ python get_discussions.py
** Make this check the existing download: compare harvest timestamp and num_comments.
* $ python discussion_to_gpt.py
** Make this iterate and do a full refresh each time.
** Reconsider full refresh when processing time goes over a minute.
* Hit ChatGPT with output.
** Change this to an API call to ChatGPT.
** Make this check the existing generated output: compare harvest versus generate timestamps.
** Add data archival / backup.
** 2023-08-26: overnight buildup: 3,649 (bug)
** 2023-08-27: overnight buildup: 369
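The check described above for get_discussions.py (existing download, harvest timestamp, num_comments) might look like this sketch; the stored-JSON layout, field names, and staleness threshold are all assumptions:

```python
import json
import time
from pathlib import Path

def needs_harvest(path: Path, live_num_comments: int,
                  max_age_seconds: float = 6 * 3600) -> bool:
    """Re-harvest a discussion only if we have no copy yet, the copy is
    stale, or the live num_comments has grown past what we stored."""
    if not path.exists():
        return True
    stored = json.loads(path.read_text())
    harvested_at = stored.get("harvest_timestamp", 0)
    if time.time() - harvested_at > max_age_seconds:
        return True
    return live_num_comments > stored.get("num_comments", 0)
```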
== Interesting Subreddits ==
* aitah
* antiwork
* ask
* askmen
* askreddit
* askscience
* chatgpt
* conservative
* dataisbeautiful
* explainlikeimfive
* latestagecapitalism
* leopardsatemyface
* lifeprotips
* news
* nostupidquestions
* outoftheloop
* personalfinance
* politics
* programmerhumor
* science
* technology
* todayilearned
* tooafraidtoask
* twoxchromosomes
* unpopularopinion
* worldnews
* youshouldknow
== Reddit OAuth2 ==
* https://www.reddit.com/r/redditdev/wiki/oauth2/explanation/
* https://www.reddit.com/dev/api/oauth/
* https://github.com/reddit-archive/reddit/wiki/OAuth2
=== Example Curl Request ===
<pre>
curl \
  -X POST \
  -d 'grant_type=password&username=reddit_bot&password=snoo' \
  --user 'p-jcoLKBynTLew:gko_LXELoV07ZBNUXrvWZfzE3aI' \
  https://www.reddit.com/api/v1/access_token
</pre>
=== Real Curl Request ===
<pre>
curl \
  -X POST \
  -d 'grant_type=client_credentials' \
  --user 'client_id:client_secret' \
  https://www.reddit.com/api/v1/access_token
</pre>
=== One Line ===
<pre>
curl -X POST -d 'grant_type=client_credentials' --user 'client_id:client_secret' https://www.reddit.com/api/v1/access_token
</pre>
=== OAuth Data Call ===
Note the quotes around the second URL: unquoted, the & would background the command.
<pre>
$ curl -H "Authorization: bearer J1qK1c18UUGJFAzz9xnH56584l4" -A "Traxelbot/0.1 by rbb36" https://oauth.reddit.com/api/v1/me
$ curl -H "Authorization: bearer J1qK1c18UUGJFAzz9xnH56584l4" -A "Traxelbot/0.1 by rbb36" "https://oauth.reddit.com/r/news/top?t=day&limit=100"
</pre>
* https://old.reddit.com/r/worldnews/top/?sort=top&t=day
* /r/subreddit/top?t=day&limit=100
* count=100&
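In Python, the token exchange and authenticated listing call above can be sketched with the standard library alone (credentials are the same placeholders as the curl examples; the helper names are illustrative, and requests or praw would work equally well):

```python
import base64
import urllib.parse
import urllib.request

USER_AGENT = "Traxelbot/0.1 by rbb36"

def token_request(client_id, client_secret):
    """Build the POST for the client_credentials grant (Basic auth)."""
    auth = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        "https://www.reddit.com/api/v1/access_token",
        data=urllib.parse.urlencode({"grant_type": "client_credentials"}).encode(),
        headers={"Authorization": f"Basic {auth}", "User-Agent": USER_AGENT},
        method="POST",
    )

def listing_request(token, subreddit="news", t="day", limit=100):
    """Build the authenticated GET for /r/<subreddit>/top."""
    qs = urllib.parse.urlencode({"t": t, "limit": limit})
    return urllib.request.Request(
        f"https://oauth.reddit.com/r/{subreddit}/top?{qs}",
        headers={"Authorization": f"bearer {token}", "User-Agent": USER_AGENT},
    )

# To execute: json.load(urllib.request.urlopen(token_request(...)))["access_token"],
# then json.load(urllib.request.urlopen(listing_request(token))).
```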
== Reddit Python ==
* pip install aiofiles aiohttp (asyncio is in the standard library; the PyPI asyncio package is an obsolete backport and should not be installed)
* https://realpython.com/async-io-python/
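The concurrency pattern these libraries enable looks roughly like this; a plain coroutine stands in for the aiohttp request and aiofiles write so the sketch runs without third-party packages:

```python
import asyncio

async def fetch_one(sem, subreddit):
    """Fetch one subreddit listing, bounded by the semaphore."""
    async with sem:
        await asyncio.sleep(0)          # stand-in for the aiohttp request
        return f"{subreddit}: fetched"  # stand-in for writing via aiofiles

async def fetch_all(subreddits, max_concurrency=5):
    """Run all fetches concurrently; gather preserves input order."""
    sem = asyncio.Semaphore(max_concurrency)
    return await asyncio.gather(*(fetch_one(sem, s) for s in subreddits))

results = asyncio.run(fetch_all(["news", "science", "worldnews"]))
```

The semaphore caps in-flight requests, which matters once the subreddit list above is the input.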
== t3 fields of interest ==
<pre>
"url_overridden_by_dest": "https://www.nbcnews.com/politics/donald-trump/live-blog/trump-georgia-indictment-rcna98900",
"url": "https://www.nbcnews.com/politics/donald-trump/live-blog/trump-georgia-indictment-rcna98900",
"title": "What infamous movie plot hole has an explanation that you're tired of explaining?",
"downs": 0,
"upvote_ratio": 0.94,
"ups": 10891,
"score": 10891,
"created": 1692286512.0,
"num_comments": 8112,
"created_utc": 1692286512.0,
</pre>
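A small helper can project just these fields out of a listing response. The envelope shape ("data" → "children" → "data") is Reddit's standard listing format; the helper name is illustrative:

```python
# Fields of interest from t3 (link) objects, mirroring the list above.
FIELDS = ["url_overridden_by_dest", "url", "title", "downs", "upvote_ratio",
          "ups", "score", "created", "num_comments", "created_utc"]

def extract_t3(listing):
    """Return one dict per t3 child, keeping only FIELDS (None if absent)."""
    return [{f: child["data"].get(f) for f in FIELDS}
            for child in listing["data"]["children"]
            if child.get("kind") == "t3"]
```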
== Minimal Term Set ==
hands, mouth, eyes, head, ears, nose, face, legs, teeth, fingers, breasts, skin, bones, blood, be born, children, men, women, mother, father, wife, husband, long, round, flat, hard, soft, sharp, smooth, heavy, sweet, stone, wood, made of, be on something, at the top, at the bottom, in front, around, sky, ground, sun, during the day, at night, water, fire, rain, wind, day, creature, tree, grow (in ground), egg, tail, wings, feathers, bird, fish, dog, we, know (someone), be called, hold, sit, lie, stand, sleep, play, laugh, sing, make, kill, eat, drink, river, mountain, jungle/forest, desert, sea, island, rain, wind, snow, ice, air, flood, storm, drought, earthquake, east, west, north, south, bird, fish, tree, dog, cat, horse, sheep, goat, cow, pig (camel, buffalo, caribou, seal, etc.), mosquitoes, snake, flies, family, we, year, month, week, clock, hour, house, village, city, school, hospital, doctor, nurse, teacher, soldier, country, government, the law, vote, border, flag, passport, meat, rice, wheat, corn (yams, plantain, etc.), flour, salt, sugar, sweet, knife, key, gun, bomb, medicines, paper, iron, metal, glass, leather, wool, cloth, thread, gold, rubber, plastic, oil, coal, petrol, car, bicycle, plane, boat, train, road, wheel, wire, engine, pipe, telephone, television, phone, computer, read, write, book, photo, newspaper, film, money, God, war, poison, music, go/went, burn, fight, buy/pay, learn, clean