Love Rails Assets. I first found out about it about a year ago, and it quickly became one of my favorite ways to grab front-end libs. :)
Yeah, I've used that, which is just Babel under the hood. I've also used Traceur, which I believe is Google's transpiler.
ES6 has basically a lot of the features of CoffeeScript, just without the significant whitespace.
I still like CoffeeScript in general and use it when I can. It still feels the closest to writing Ruby on the front end.
This is actually similar to how I have been writing my jQuery.
Recently, though, I've been using ES6 more than CoffeeScript, and being able to use classes on the front end has been so much nicer than the jQuery soup I used to deal with.
Posted in My Development Environment Discussion
Posted in PDF Receipts Discussion
I love wkhtmltopdf. It has saved my ass on a bunch of projects in the past, and it's become my go-to for PDF generation in almost any language I use. Prawn is a pain in the butt; I've used it on two projects and it was very painful, but you do get a lot of control.
It was awesome meeting you as well Chris! I look forward to next year's RailsConf!
This is awesome! I'm glad you did a screencast on this. The talk from Sandi was just way too good not to share and proliferate those ideas!
That would be awesome!
Very nice video Chris! I really like the idea of breaking things out into POROs. I think the next refactoring I would do to this app is a Subscription class to house the logic for adding to and updating the Mailchimp list, since multiple parts of your app could end up needing to add someone to Mailchimp. It also feels like it's violating SRP there.
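A rough sketch of what that extraction could look like: a plain Ruby object that owns the list logic, with a hypothetical `MailchimpClient` standing in for whatever API wrapper (Gibbon, etc.) the app actually uses.

```ruby
# Hypothetical PORO that owns all the Mailchimp list logic, so
# controllers, jobs, and callbacks don't each talk to the API
# directly. MailchimpClient is a made-up stand-in for whatever
# API wrapper the app actually uses.
class Subscription
  def initialize(user, client: MailchimpClient.new)
    @user = user
    @client = client
  end

  # Add the user to the list.
  def subscribe
    @client.add_to_list(@user.email)
  end

  # Push updated details for an existing list member.
  def update
    @client.update_list_member(@user.email)
  end
end
```

Anywhere in the app that needs to touch the list then goes through `Subscription.new(user).subscribe`, and the Mailchimp details stay in one place.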
It depends on the technology stack I have available. I have dropped into command-line utilities for working with large CSVs in Ruby in the past; they are just so much faster to work with from the raw CLI. The other option is to push the task off to a service written in a language much better suited to massive data crunching. It also depends on the servers I have to work with: with sufficiently beefy, compute-heavy servers, Ruby can churn through data at a pretty decent clip.
As long as the client is OK with it, I tend to push this processing into delayed jobs and let it run in the background. The larger the file, the less likely I am to make it real time. Most users are fine with a message saying "we are currently processing your request, we will send you an email when it's done", or even just a check-back-later kind of message. That way it's OK if your process takes 10 or 20 minutes to churn through the data.
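A minimal sketch of that pattern, assuming a hypothetical `CsvImportJob` (the name and batch size are made up): in a real app this would be a delayed_job/Active Job class that emails the user when it finishes, but the batching idea is the same.

```ruby
require "csv"

# Hypothetical background job for a large CSV import. Processing
# rows in batches keeps memory usage flat even when the whole
# thing takes 10 or 20 minutes.
class CsvImportJob
  BATCH_SIZE = 500

  def initialize(csv_data)
    @csv_data = csv_data
  end

  # Walk the parsed rows in slices and return how many were handled.
  def perform
    processed = 0
    CSV.parse(@csv_data, headers: true).each_slice(BATCH_SIZE) do |batch|
      batch.each { |row| import_row(row) }
      processed += batch.size
    end
    processed # real app: notify the user the import finished here
  end

  private

  # Placeholder for creating/updating a record from a row.
  def import_row(row); end
end
```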
No worries :)
Minor inconsistency: at the top of the post you recommend 14.10 Trusty Tahr. The 14.10 release is actually Utopic Unicorn, and 14.04 LTS is Trusty Tahr.
Posted in Exporting Records To CSV Discussion
Awesome! I know RailsCasts already did this a long time ago, but it's great to see it redone with the updates to the syntax.
Yeah, it was fun to work on. It taught me a lot about performance and mass data processing.
I would usually handle it like this: either you state a limit on the upload page, like "you may only upload 500 or 1000 records at a time" (typically a safe-ish number), so they need to break their file into multiple smaller files; or, if they upload a file that's too large, you send back an error like "Oops, your file appears to exceed the maximum records allowed per upload, please split your file into smaller uploads of 500, or contact support." I would normally double up on this kind of thing and do both.
Beyond that, I have run into some cases where you need to handle massive data, and that requires significant engineering to manage thousands of records flooding in. One of the largest imports I worked on had well over 1.5 million data points to be processed through a single import.
Awesome job explaining imports in a simple way. Some of the most complex chunks of code I have worked on are imports, especially when the data is inconsistent and every edge case needs to be accounted for, or the data has to be massaged into the correct format. One thing I would suggest for anyone implementing an import like this is a max record limit per import, so a user cannot just shove 50,000 records down the app's throat. That kind of thing can crash an app server really quickly.
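As a sketch, the limit check can be as simple as counting rows before the import runs. The 500 cap, helper name, and error wording here are just placeholders:

```ruby
require "csv"

MAX_IMPORT_ROWS = 500 # placeholder cap; tune per app

# Returns an error message when the upload exceeds the cap,
# or nil when the file is safe to hand off to the importer.
def import_size_error(csv_data)
  rows = CSV.parse(csv_data, headers: true).size
  return nil if rows <= MAX_IMPORT_ROWS

  "Oops, your file has #{rows} records but the maximum allowed per " \
    "upload is #{MAX_IMPORT_ROWS}. Please split it into smaller uploads."
end
```

Running the check in the controller before queueing the import means a 50,000-row file is rejected with a friendly message instead of tying up an app server.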
https://github.com/skwp/dot... is another great set of dotfiles to get you started with Vim. It's saved me a ton of time, and has nearly all the plugins I need, minus a few gems here and there.
I highly recommend learning Vim; it can make editing code insanely fast.
Awesome demonstration of refactoring.
Love unraveling the magic of some of these really useful gems :)