How do you optimize inserting 100,000 records into a database, and handle DB inserts under high concurrency? #26
Comments
Write the data to a file first, then import it into the database in one pass.
Node itself is well suited to this kind of I/O work. The core idea is to merge the DB statements in the middle layer (Node.js) over a time window (e.g. every 10 seconds): combine the many statements targeting the same collection or table, whether NoSQL or SQL, into a single statement, and hit the DB once when the window closes. If the data volume is small, buffer it in memory; if it is large, buffer it in Redis.
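The time-window merging described above can be sketched as a small in-memory batcher. Individual `add()` calls are held in a buffer; when the window elapses (or `flush()` is forced), they are combined into one multi-row INSERT and sent in a single round-trip. The `execute` callback stands in for a real DB driver call, and the naive `JSON.stringify` quoting is only for illustration (a real implementation would use driver-level parameter escaping):

```javascript
class InsertBatcher {
  constructor(table, windowMs, execute) {
    this.table = table;
    this.windowMs = windowMs;
    this.execute = execute; // stand-in for the real DB driver call
    this.rows = [];
    this.timer = null;
  }
  add(row) {
    this.rows.push(row);
    // The first row opens the window; later rows ride along until it closes.
    if (!this.timer) {
      this.timer = setTimeout(() => this.flush(), this.windowMs);
    }
  }
  flush() {
    clearTimeout(this.timer);
    this.timer = null;
    if (this.rows.length === 0) return;
    const cols = Object.keys(this.rows[0]);
    // Merge all buffered rows into one multi-row VALUES clause.
    const values = this.rows
      .map(r => '(' + cols.map(c => JSON.stringify(r[c])).join(',') + ')')
      .join(',');
    const sql = `INSERT INTO ${this.table} (${cols.join(',')}) VALUES ${values}`;
    this.rows = [];
    this.execute(sql); // one round-trip instead of N
  }
}

// Demo: three separate add() calls become a single statement.
const sent = [];
const batcher = new InsertBatcher('users', 10000, sql => sent.push(sql));
batcher.add({ id: 1, name: 'a' });
batcher.add({ id: 2, name: 'b' });
batcher.add({ id: 3, name: 'c' });
batcher.flush(); // force the flush instead of waiting for the 10s window
console.log(sent.length); // 1
```

Swapping the in-memory `rows` array for a Redis list (`RPUSH` on add, drain on flush) gives the "large volume" variant from the comment, and also survives a process restart.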
I handled a similar NoSQL scenario before; here is roughly the approach I took at the time.
You said you would publish an answer — where can I find it?
Questions like this, you know, sound impressive and dazzling, and big companies love to ask them.
Posturing aside, for FEs who are genuinely capable and have actually handled these high-concurrency situations, this question is a great opportunity to show it.
My former mentor implemented intelligent failover under high concurrency in Node.js. I still remember it vividly, and it earned him a top performance rating that year.
Want to give it a try?