Loading large amounts of data performantly using Node.js Streams
Sometimes you need to load so much data from a database, a file, or another source that your Node service crashes with the dreaded "Out of Memory" error. A common use case is generating large CSV exports - think financial transactions from a site like Mint or RocketMoney, or high-frequency sensor data. Users might need these exports to import into their own data pipeline or to transfer to another tool.
Note: Usually that export runs in a task queue, and the result is emailed to the user when the task is complete.
If your implementation is naive and the data you are trying to load is large enough, a single user might be able to crash your server.
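To make the problem concrete, here is a minimal sketch of what such a naive export endpoint often looks like. It assumes a hypothetical Express app and a hypothetical `db.getAllTransactions()` helper that returns every row at once; the exact names are illustrative, not from any particular codebase.

```ts
import express from "express";
import { db } from "./db"; // hypothetical data-access module

const app = express();

app.get("/export", async (_req, res) => {
  // Every transaction is pulled into memory before a single byte is sent.
  const rows = await db.getAllTransactions();

  // Building one giant CSV string adds even more memory pressure on top of the rows.
  const csv = [
    "id,date,amount,description",
    ...rows.map((r) => `${r.id},${r.date},${r.amount},${r.description}`),
  ].join("\n");

  res.setHeader("Content-Type", "text/csv");
  res.send(csv); // With enough rows, the process runs out of heap before this responds.
});

app.listen(3000);
```

The whole dataset lives in memory twice here: once as row objects and once as the joined CSV string. Memory usage grows linearly with the size of the export rather than staying flat.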
Now imagine several users exporting data at the same time, and things can get out of control quickly.