Thank you for correcting the information. I was able to debug the process, and here is what I found.
The issue is not the indexing process itself, which is very good news. The slowness comes from when the indexer tries to calculate how many rows are already indexed and how many more need to be indexed. I’ve done some research on this, and it turns out that some database engines do not store row counts; instead, they have to run a count query every time, which takes a while on such huge data sets.
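To illustrate the point above: some engines keep no cached row count, so every `SELECT COUNT(*)` has to scan the table again. This is a minimal sketch using SQLite (which behaves the same way here); the table name and sizes are hypothetical, not from the actual plugin.

```python
import sqlite3

# Hypothetical demo table: engines like SQLite (and InnoDB in MySQL)
# do not store an exact row count, so COUNT(*) rescans the table
# on every call -- slow on huge data sets.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, indexed INTEGER)")
conn.executemany(
    "INSERT INTO posts (indexed) VALUES (?)",
    [(i % 2,) for i in range(100_000)],
)
conn.commit()

# Each of these runs a fresh scan -- there is no cached total to reuse.
total, = conn.execute("SELECT COUNT(*) FROM posts").fetchone()
print(total)  # 100000
```

Running this count once is cheap, but the progress display was repeating it constantly, which is where the time went.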
The good news is that most of this counting is informative only; it has no effect on the indexing outcome. I have disabled these counters programmatically, and that alone made a huge difference in performance. One table operation must remain, however, so the indexer can select which posts to index next; without it, indexing would not work.
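The one remaining operation can be sketched roughly like this, again with a hypothetical simplified table (the real plugin’s schema and query will differ): fetch the next batch of not-yet-indexed posts, up to the configured limit, and mark them as done.

```python
import sqlite3

# Hypothetical simplified schema: 120 posts, none indexed yet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, indexed INTEGER)")
conn.executemany("INSERT INTO posts (indexed) VALUES (?)", [(0,)] * 120)

BATCH = 50  # assumed indexing limit per run

# The one query that must stay: pick the next posts to index.
batch = [row[0] for row in conn.execute(
    "SELECT id FROM posts WHERE indexed = 0 LIMIT ?", (BATCH,))]

# Mark the batch as indexed so the next run picks up where this one stopped.
conn.executemany("UPDATE posts SET indexed = 1 WHERE id = ?",
                 [(i,) for i in batch])

remaining, = conn.execute(
    "SELECT COUNT(*) FROM posts WHERE indexed = 0").fetchone()
print(len(batch), remaining)  # 50 70
```

The key design point is that this query touches only a small `LIMIT`-bounded slice per run, unlike the full-table counts that were disabled.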
The ‘items not indexed’ counter is still active, so you can check whether any items remain to be indexed: https://i.imgur.com/wAEqoCs.png
Setting the indexing limit to 50 worked correctly, with occasional stops, but I was able to finish the indexing. Every post is now indexed, and new ones are indexed automatically when they are saved.
Everything should now be indexed, and I’ve optimized the configuration as much as possible.

Best,
If you like my products, don't forget to rate them on codecanyon :)