What you will normally find is that the speed at which the table can be modified trends down as its size increases. A job that has been running and loading data just fine suddenly becomes too slow and turns into a bottleneck.
It shows the performance degradation of inserting one billion rows into a table with the insert buffer disabled (not recommended; done here for demonstration purposes only).
So we should expect some performance degradation due to the structure of the index, but there are ways we can try to stretch out the curve so that performance does not degrade as quickly.
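A back-of-the-envelope model helps explain the shape of that curve. Once the index grows past the buffer pool, a random-key insert touches a leaf page that is cached only with probability `buffer_pool / index_size`; the rest of the time it pays a disk read. The sketch below is a toy model with illustrative numbers (pool size, memory and disk costs are all assumptions, not measurements):

```python
# Toy model: why random-key insert throughput falls once an index
# outgrows memory. All constants are illustrative assumptions.

def inserts_per_second(index_gb, buffer_pool_gb=8.0,
                       memory_insert_us=5.0, disk_read_us=5000.0):
    """Estimate sustained insert rate for uniformly random keys.

    While the index fits in the buffer pool every insert is a memory
    operation; beyond that, a fraction of inserts miss the cache and
    each miss costs a synchronous page read.
    """
    hit_ratio = min(1.0, buffer_pool_gb / index_gb)
    avg_cost_us = memory_insert_us + (1.0 - hit_ratio) * disk_read_us
    return 1_000_000 / avg_cost_us

for size_gb in (1, 8, 16, 64, 256):
    print(f"index {size_gb:>3} GB: ~{inserts_per_second(size_gb):,.0f} inserts/s")
```

The model is crude, but it shows the key property of the real curve: throughput is flat while the working set is cached, then drops sharply as the cache-miss fraction climbs toward one.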
I have had some experience with the interesting Solaris OS, which is a 64-bit OS that runs a lot of 32-bit applications. One nice thing about Solaris is that it allows a 32-bit application to allocate close to 4 GB of heap memory, instead of the 2 GB (or 3 GB) available on Windows, Linux, and AIX.
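The difference comes from how much of the 32-bit address space the kernel reserves for itself. The arithmetic below is a rough sketch; the per-OS user/kernel splits are common defaults quoted from memory, so treat them as assumptions rather than verified limits for any particular OS version:

```python
# Rough 32-bit address-space arithmetic (assumed common defaults).
GiB = 2 ** 30
addressable = 2 ** 32  # a 32-bit pointer spans at most 4 GiB

user_space = {
    "Windows (default)":   2 * GiB,  # kernel reserves the upper 2 GiB
    "Windows (/3GB boot)": 3 * GiB,  # boot option shrinks kernel share
    "Linux (3G/1G split)": 3 * GiB,  # kernel mapped into the top 1 GiB
    # On Solaris the kernel does not carve a large chunk out of the
    # process's address space, so nearly the full 4 GiB is usable.
    "Solaris":             4 * GiB,
}

for os_name, usable in user_space.items():
    print(f"{os_name:<20} ~{usable / GiB:.0f} GiB usable by a 32-bit process")
```

So a 32-bit process on Solaris can grow its heap roughly twice as large as the Windows default allows, without any of the costs of switching the application itself to 64-bit pointers.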