How to read millions of records efficiently with many joins?

by RAJENDIRAN   Last Updated October 13, 2019 16:06 PM

I'm developing an enterprise application for a telecommunications company, with a Node.js backend and a MySQL database. I'm currently working on a rewards module (grabbing and redeeming vouchers). For this I have to inner- and left-join roughly 9 to 13 tables, and 5 of those tables hold about 3 million records each. The query is extremely slow: a single request takes 1 to 2 minutes to process, and when I load-test with 50 concurrent requests the app goes down.
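One common reason an app "goes down" under 50 concurrent requests is that every request fires its own heavy query at once. A minimal sketch of capping concurrent work, assuming nothing about the actual schema: the `createLimiter` helper below is hypothetical, and in a real Node.js + MySQL app the same effect usually comes from a connection pool with a bounded `connectionLimit` (e.g. in the `mysql2` driver), so excess requests queue instead of piling onto the database.

```javascript
// Hypothetical concurrency limiter: at most `maxConcurrent` tasks run at
// once; the rest wait in a FIFO queue. This simulates what a bounded
// database connection pool does for query load.
function createLimiter(maxConcurrent) {
  let active = 0;
  const waiting = [];
  const next = () => {
    if (active < maxConcurrent && waiting.length > 0) {
      active++;
      waiting.shift()(); // start the oldest waiting task
    }
  };
  return function runLimited(task) {
    return new Promise((resolve, reject) => {
      waiting.push(() => {
        task()
          .then(resolve, reject)
          .finally(() => {
            active--;
            next(); // free the slot, start the next waiter
          });
      });
      next();
    });
  };
}
```

With this in place, 50 simultaneous requests become (for example) 5 queries in flight and 45 queued, which degrades response time gracefully instead of crashing the process.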

I then split the large tables out of the join and ran them as subqueries, which reduced the response time dramatically, to about 1 second. But as the data grows, the response time climbs again. I really need a permanent solution for this. Please help.
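The splitting described above is essentially join decomposition: query the small driving tables first, then hit each large table by primary key with an `IN (...)` list, and merge the rows in application code. A minimal sketch under assumed table names (`vouchers`, `redemptions`, none of which come from the question), with the query functions mocked over in-memory arrays so the sketch is self-contained; with a real driver each would be something like `pool.query('SELECT ... WHERE voucher_id IN (?)', [ids])`:

```javascript
// Mock data standing in for two tables; `redemptions` plays the role of
// one of the 3-million-row tables.
const vouchers = [
  { id: 1, code: 'A100', customerId: 10 },
  { id: 2, code: 'B200', customerId: 11 },
];
const redemptions = [{ voucherId: 1, redeemedAt: '2019-10-01' }];

// Mocked queries; real versions would be indexed SELECTs against MySQL.
async function queryVouchers(customerIds) {
  return vouchers.filter((v) => customerIds.includes(v.customerId));
}
async function queryRedemptions(voucherIds) {
  // Real SQL: SELECT * FROM redemptions WHERE voucher_id IN (?)
  return redemptions.filter((r) => voucherIds.includes(r.voucherId));
}

// Decomposed "join": small result set first, then the big table by key,
// then an application-side LEFT JOIN via a Map lookup.
async function vouchersWithRedemptions(customerIds) {
  const vs = await queryVouchers(customerIds);
  const rs = await queryRedemptions(vs.map((v) => v.id));
  const byVoucher = new Map(rs.map((r) => [r.voucherId, r]));
  return vs.map((v) => ({ ...v, redemption: byVoucher.get(v.id) || null }));
}
```

The win comes from each large table being touched only through an indexed key list instead of participating in one 13-table join plan, so the optimizer cannot pick a bad join order for it. The trade-off is more round trips, which is why keeping the `IN` lists small (filter early in the driving query) matters as the data grows.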
