How to work with long processing of information that will be consumed by the FRONT-END


Hey guys! How are you? I'm just looking for a pointer on which approaches I could follow for this scenario, I'd appreciate the help:
- I'm using Tabulator on my front-end to present information about sales. The table shows only 100 records per page, and I'm paginating through the back-end using pagy (rough sketch of the setup right after this list);
- I'm generating the columns and rows in my back-end, and many of the row values are the result of calculations I need to perform; even limited to 100 rows, building the table is taking a lot of time;
- The sales values are fairly 'static' (they aren't updated frequently), but I have a lot of filter options that can be applied to the table, which I believe makes caching a little harder. Please correct me if I'm wrong.
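For context, this is roughly the shape of the current setup (the Sale model, column names, and per-row math below are simplified placeholders, just to illustrate):

```ruby
# app/controllers/sales_controller.rb
# Rough sketch of the current setup: Tabulator asks for a page, pagy limits
# the query, and every row value is computed at request time.
class SalesController < ApplicationController
  include Pagy::Backend

  def index
    scope = Sale.all
    scope = scope.where(region: params[:region]) if params[:region].present?

    # 100 rows per page; the option is `items:` in older pagy versions, `limit:` in newer ones.
    @pagy, sales = pagy(scope, items: 100)

    # Tabulator's remote pagination expects `last_page` and `data` by default.
    render json: {
      last_page: @pagy.pages,
      data: sales.map { |sale| row_for(sale) }
    }
  end

  private

  # This per-row calculation at request time is the slow part.
  def row_for(sale)
    {
      id: sale.id,
      customer: sale.customer_name,
      total: sale.line_items.sum { |li| li.quantity * li.unit_price },
      margin: sale.revenue.to_f - sale.cost.to_f # stand-in for the expensive math
    }
  end
end
```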

I was thinking about doing this processing in a background job and then fetching the result from my front-end. Would that be a good solution? If so, what would a Rails-way approach look like here?
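To make that idea concrete, here's the rough shape I had in mind, just a sketch with made-up names: an ActiveJob computes the rows and writes them to the Rails cache keyed by the filter combination, and the front-end polls a small endpoint (or gets the result pushed via Turbo Streams/ActionCable) until it's ready:

```ruby
# app/jobs/sales_report_job.rb
# Hypothetical sketch: do the heavy row calculation off-request and cache the
# result keyed by the filter combination.
class SalesReportJob < ApplicationJob
  queue_as :default

  def perform(filters)
    rows = Sale.where(filters).map do |sale|
      { id: sale.id, total: sale.revenue.to_f - sale.cost.to_f } # stand-in for the real math
    end

    Rails.cache.write(self.class.cache_key(filters), rows, expires_in: 12.hours)
  end

  def self.cache_key(filters)
    "sales_report/#{filters.to_query}"
  end
end

# app/controllers/sales_reports_controller.rb
class SalesReportsController < ApplicationController
  # POST: enqueue the calculation for the current filter set.
  def create
    SalesReportJob.perform_later(report_filters)
    head :accepted
  end

  # GET: the front-end polls this until the cached rows show up,
  # then hands them to Tabulator.
  def show
    rows = Rails.cache.read(SalesReportJob.cache_key(report_filters))

    if rows
      render json: { status: "ready", data: rows }
    else
      render json: { status: "pending" }
    end
  end

  private

  def report_filters
    params.permit(:region, :status).to_h
  end
end
```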

If you have other solutions, I'm all ears! Thank you in advance.

Hey Allan,

Can you cache the calculation results to the database? That would make retrieving and filtering faster since you wouldn't have to re-calculate every time (which I assume is the slow part?).
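Something along these lines is what I mean, as a sketch with placeholder model and column names: persist the computed values as columns on sales, backfill them once, and keep them fresh with a callback. Then filtering and sorting happen on plain SQL columns and pagy stays fast:

```ruby
# db/migrate/xxxx_add_cached_totals_to_sales.rb
# Hypothetical: persist the computed values as plain columns so listing and
# filtering become straight SQL instead of per-row Ruby math.
class AddCachedTotalsToSales < ActiveRecord::Migration[7.1]
  def change
    add_column :sales, :cached_total,  :decimal, precision: 12, scale: 2
    add_column :sales, :cached_margin, :decimal, precision: 12, scale: 2
    add_index  :sales, :cached_total
  end
end

# app/models/sale.rb
class Sale < ApplicationRecord
  has_many :line_items

  after_save :refresh_cached_values

  def refresh_cached_values
    # update_columns skips callbacks, so this won't loop back into after_save.
    update_columns(
      cached_total:  line_items.sum("quantity * unit_price"),
      cached_margin: revenue.to_f - cost.to_f # stand-in for the real calculation
    )
  end
end

# One-off backfill, e.g. via rails runner or a rake task:
# Sale.find_each(&:refresh_cached_values)
```

With that in place, the filter options just become where clauses on real columns, and the 100-row page renders without any per-row Ruby math.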
