Here we are using LIMIT for pagination.
This was my scenario:
I have posts with likes in my database.
I am fetching the data ordered by the highest number of likes, with LIMIT 0,5.
The problem is:
we show the data ordered by maximum likes with LIMIT 0,5 for the first page, but the like counts (and therefore the order) can change before the user requests the next page.
You should realize one thing:
this is probably what you want.
Consider loading page 1 at time X. How long will the user spend reviewing page 1 before going to page 2? Is it a few microseconds? Then it is very unlikely the order will change in that time. Is it a few minutes? Then the results would already be outdated when viewing the next page anyway. And how would you decide when it is time to refresh the search and show new, updated results?
If you still insist on solving this, I suggest creating a new table:

CREATE TABLE likes_search_orders (
    search_id varchar(255) NOT NULL,
    post_id bigint(20) NOT NULL,
    order_id bigint(20) NOT NULL
);

Add a unique index on (search_id, post_id) and proceed with the following algorithm:
Create a random search_id (for example md5(rand(0,1000000)."-".time()) in PHP).

Snapshot the current ranking under that search_id (ROW_NUMBER() requires MySQL 8.0+):

INSERT INTO likes_search_orders (search_id, post_id, order_id)
SELECT 'search_id', id, ROW_NUMBER() OVER (ORDER BY likes DESC)
FROM table_with_likes;

Then page through the snapshot. Since order_id 1 is the most-liked post, sort ascending:

SELECT l.* FROM table_with_likes l
INNER JOIN likes_search_orders o ON o.post_id = l.id AND o.search_id = 'search_id'
ORDER BY o.order_id
LIMIT 5,5;
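The steps above can be sketched end to end. This is a minimal demonstration using Python's sqlite3 in place of MySQL (SQLite 3.25+ also supports ROW_NUMBER); the table names follow the answer, while the sample post data is made up for illustration:

```python
import hashlib
import random
import sqlite3
import time

# In-memory stand-in for the real database (hypothetical sample data).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE table_with_likes (id INTEGER PRIMARY KEY, likes INTEGER NOT NULL);
CREATE TABLE likes_search_orders (
    search_id TEXT NOT NULL,
    post_id   INTEGER NOT NULL,
    order_id  INTEGER NOT NULL,
    UNIQUE (search_id, post_id)
);
""")
db.executemany("INSERT INTO table_with_likes (id, likes) VALUES (?, ?)",
               [(1, 50), (2, 10), (3, 40), (4, 30), (5, 20)])

# Step 1: create a random search_id, mirroring md5(rand(0,1000000)."-".time()).
search_id = hashlib.md5(
    f"{random.randint(0, 1_000_000)}-{time.time()}".encode()).hexdigest()

# Step 2: freeze the current ranking under that search_id.
db.execute("""
    INSERT INTO likes_search_orders (search_id, post_id, order_id)
    SELECT ?, id, ROW_NUMBER() OVER (ORDER BY likes DESC)
    FROM table_with_likes
""", (search_id,))

# Step 3: page through the frozen ranking; order_id 1 is the most-liked post.
def fetch_page(page, per_page=2):
    return db.execute("""
        SELECT l.id, l.likes
        FROM table_with_likes l
        JOIN likes_search_orders o ON o.post_id = l.id AND o.search_id = ?
        ORDER BY o.order_id
        LIMIT ? OFFSET ?
    """, (search_id, page * per_page if False else per_page, page * per_page)
    ).fetchall()

print(fetch_page(0))  # top-liked posts
print(fetch_page(1))  # next page, same frozen order even if likes changed
```

Because every page reads order_id from the snapshot rather than the live likes column, updating likes between page loads no longer shifts posts between pages.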
Every time you create a new search_id, you "freeze" the database in the state it was in at that instant (at least the ordering; you could also store the likes count in likes_search_orders, and then you would even have the exact likes count at that moment).
I would also suggest creating a searches table where you store each search_id and the date it was created. You can then set up a cron job to clean up old searches so they don't bloat the database.
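The cleanup step might look like this. A sketch under the assumption that the searches table stores a Unix timestamp and that snapshots older than an hour can be discarded (both the table layout and the one-hour cutoff are illustrative choices, not from the answer):

```python
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE searches (search_id TEXT PRIMARY KEY, created_at INTEGER NOT NULL);
CREATE TABLE likes_search_orders (
    search_id TEXT NOT NULL, post_id INTEGER NOT NULL, order_id INTEGER NOT NULL
);
""")

now = int(time.time())
# One fresh search and one stale search from two hours ago (hypothetical data).
db.executemany("INSERT INTO searches VALUES (?, ?)",
               [("fresh", now), ("stale", now - 7200)])
db.executemany("INSERT INTO likes_search_orders VALUES (?, ?, ?)",
               [("fresh", 1, 1), ("stale", 1, 1)])

def purge_old_searches(max_age_seconds=3600):
    """Delete snapshot rows, then the search records, past the cutoff."""
    cutoff = int(time.time()) - max_age_seconds
    db.execute("""
        DELETE FROM likes_search_orders
        WHERE search_id IN (SELECT search_id FROM searches WHERE created_at < ?)
    """, (cutoff,))
    db.execute("DELETE FROM searches WHERE created_at < ?", (cutoff,))

purge_old_searches()
```

Run from cron (or MySQL's event scheduler), this keeps the snapshot table bounded: only searches still within the cutoff keep their frozen orderings.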