asyncio provides "cooperative concurrency" for Python.
Let's say you need to download 10 web pages in Python; someone might do something like the snippet below.
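(A minimal sketch of the sequential approach; the URL list is just a placeholder.)

```python
import requests

urls = [f"https://example.com/page/{i}" for i in range(10)]  # placeholder URLs

pages = []
for url in urls:
    # each requests.get() blocks until the full response arrives
    pages.append(requests.get(url).text)
```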
The downside is that each requests.get() call blocks until the HTTP request finishes, so if each page takes 5 seconds to load, your program needs 50 seconds to download all of them.
You could instead spawn 10 threads (roughly as in the sketch below), but threading has its own downsides.
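(For comparison, a rough thread-pool sketch; the worker count and URLs are made up.)

```python
from concurrent.futures import ThreadPoolExecutor

import requests

urls = [f"https://example.com/page/{i}" for i in range(10)]  # placeholder URLs

with ThreadPoolExecutor(max_workers=10) as pool:
    # the downloads now overlap in separate threads
    pages = list(pool.map(lambda url: requests.get(url).text, urls))
```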
What cooperative concurrency gives you are coroutines (tasks) that can tell Python to go do something else while a function is waiting on something. It's best to read some Python examples: https://docs.python.org/3/library/asyncio-task.html#coroutines
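(In the spirit of the linked docs, a tiny sketch of two coroutines waiting concurrently.)

```python
import asyncio

async def say_after(delay, message):
    # await hands control back to the event loop while we wait
    await asyncio.sleep(delay)
    print(message)

async def main():
    # both coroutines wait at the same time, so this takes ~2s, not 3s
    await asyncio.gather(say_after(1, "hello"), say_after(2, "world"))

asyncio.run(main())
```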
There are examples that solve the requests.get() problem with asyncio alone, but it's probably better to use a library built around asyncio; a sketch follows below.
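(A hedged sketch assuming aiohttp as the asyncio-native HTTP library; the URLs are placeholders again.)

```python
import asyncio

import aiohttp

urls = [f"https://example.com/page/{i}" for i in range(10)]  # placeholder URLs

async def fetch(session, url):
    # await yields to the event loop while the response is in flight
    async with session.get(url) as response:
        return await response.text()

async def main():
    async with aiohttp.ClientSession() as session:
        # all 10 downloads overlap, so total time is roughly the slowest page
        return await asyncio.gather(*(fetch(session, url) for url in urls))

pages = asyncio.run(main())
```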