Thermatix - 1 year ago

Ruby Question

Sequel query causes script to hang and causes computer to slow down

I have the following code:

team_articles = user.npt_teams.to_a.inject({}) { |arts, team|
  arts.merge({ team.id =>
    NptArticle.join(:npt_authors).join(:users).join(:npt_teams).where(:npt_teams__id => team.id) })
}

It causes my terminal to stop responding and my Macbook to slow down.

In MySQL Workbench the same query gets a response instantly.

A suggestion was to create a lighter version of the NptArticle object, but I'm not quite sure how to create a version that pulls fewer columns, so any suggestion to fix this issue would be great.

This is the table (schema screenshot not reproduced here).

The generated SQL is:

SELECT * FROM `npt_articles` INNER JOIN `npt_authors` INNER JOIN `users` INNER JOIN `npt_teams` WHERE (`npt_teams`.`id` = 1)

I'd love to upgrade the Ruby version but I can't. I'm working off an old code-base and this is the version of Ruby it uses. There are plans to re-build in the future with more modern tools but at the moment this is what I have to work with.

Results from:

EXPLAIN SELECT * FROM npt_articles INNER JOIN npt_authors INNER JOIN users INNER JOIN npt_teams WHERE (npt_teams.id = 1);

Answer Source

So for npt_teams.id = 1 you are performing a cross join across all of:

SELECT * FROM `npt_articles` INNER JOIN `npt_authors` INNER JOIN `users` INNER JOIN `npt_teams` WHERE (`npt_teams`.`id` = 1)
If the number of articles, authors and users is even moderate, you will get a huge number of results because the joins aren't restricted. Normally, you would use something like:

INNER JOIN `npt_authors` ON (npt_articles.ID=npt_authors.articleID) 

(it depends on how your database relates).
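The difference between an unrestricted and a restricted join can be sketched in plain Ruby with toy data (the names are illustrative, not the real schema):

```ruby
# Toy data standing in for the real tables
articles = [{ id: 1, title: "A" }, { id: 2, title: "B" }, { id: 3, title: "C" }]
authors  = [{ id: 10, article_id: 1 }, { id: 11, article_id: 3 }]

# Cross join: no ON condition, so every article pairs with every author
cross = articles.product(authors)
puts cross.size  # 3 * 2 = 6 pairs

# Inner join with a condition: only matching pairs survive
inner = cross.select { |article, author| author[:article_id] == article[:id] }
puts inner.size  # 2 pairs
```

With four tables the cross-join size is the product of all four row counts, which is why the query never comes back.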

In addition, you would need indexes on the fields that relate the tables to each other, which will speed things up as well.

Look at the rows column of the EXPLAIN SELECT output. That is how many rows are being processed for each part of the join. To get an estimate of the total number of rows processed, multiply these numbers together: 1 x 657 x 269,723 x 956,188 is roughly 1.7 × 10^14, which is rather a lot.
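That multiplication, using the rows figures from the EXPLAIN above:

```ruby
# rows column from each step of the EXPLAIN output
rows = [1, 657, 269_723, 956_188]

# Rough upper bound on the rows the unrestricted join has to process
total = rows.inject(:*)
puts total  # 169_444_173_622_068, on the order of 10**14
```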

I'm not a Ruby wiz, so perhaps somebody else can post how you do this.