Refactoring CSV Uploads with ActiveModel::Model Discussion

Awesome episode, Chris! It would be nice to continue by moving the CSV parsing to a background job and adding an AJAX progress bar to provide visual feedback on import progress.

That's a great idea. I'll add it to my list!

I think I just asked this question today. Exactly what I am trying to do.

I built that w/ActionCable just last week. Should I write up a blog post?
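For anyone curious, here's a rough sketch of how that can look: a background job that broadcasts progress over ActionCable as it imports. The job, stream name, and percentage math here are illustrative, not from the episode.

require "csv"

# Rough sketch: a job that imports rows and broadcasts progress.
class CsvImportJob < ApplicationJob
  queue_as :default

  def perform(user, file_path)
    rows = CSV.read(file_path, headers: true)
    rows.each_with_index do |_row, index|
      # ... create a record from the row here ...
      ActionCable.server.broadcast(
        "csv_import_#{user.id}", # the stream the client subscribes to
        progress: ((index + 1) * 100.0 / rows.size).round
      )
    end
  end
end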

Chris, I'm having trouble figuring out how to convert the date field in my CSV file (m/d/Y) to the proper format. It's reading the year correctly, but it's swapping the day and month on import instead of mapping day to day and month to month. Where should that code live, and what does it look like? I tried a bunch of things and just can't get it to work. I looked at smarter_csv and thought that might be the answer, but it didn't work for me.

I believe there's a "converters" option that you can pass in. It basically tries to convert every column to a Date object and, if it's successful, it will use that. You'd pass in the :date option for it (something like converters: [:date], I believe). You can read a bit more about that here: http://ruby-doc.org/stdlib-...

Plus if you have something custom that doesn't work with that, you can write your own converter. I just did this for a project and figured I should make an episode on it. Pretty nifty and easy to do, but really poorly documented.
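For the m/d/Y case above, a custom converter can parse the date explicitly instead of letting Date.parse guess. A minimal sketch; the :us_date name and the column name are made up for illustration:

require "csv"
require "date"

# Register a custom converter that parses US-style m/d/Y dates explicitly.
CSV::Converters[:us_date] = lambda do |field|
  begin
    Date.strptime(field, "%m/%d/%Y")
  rescue ArgumentError, TypeError
    field # leave fields that aren't m/d/Y dates untouched
  end
end

CSV.foreach("imports.csv", headers: true, converters: [:us_date]) do |row|
  puts row["purchased_on"] # => a Date with month and day in the right order
end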

Tabish Iqbal

Hey Chris - awesome stuff. However, could I ask for a favor? I know some videos link to the GitHub repo, but would it be possible to have the code pasted as well, sort of like how Ryan Bates had it? When you're trying to go through and connect the different parts or figure out how things work, it's harder to keep going back and forth on the video trying to find the exact location.

That's absolutely on my list. I need to have someone help me with that part, as making the videos on their own is time-consuming enough. Expect to have transcripts with code snippets sometime in the near future if everything goes well. :)

Tabish Iqbal

Thoughts on the following gem: https://github.com/continuu...

This looks pretty cool. It might need some updates for newer versions of Rails since it hasn't been updated in a while, but you might check to see if anyone has been maintaining a fork.

I just installed it, and so far so good with CSVs.

Like I said, it works fine -- until you attempt to import very large CSV files. That's because it uses the roo gem to open the file. With large files, it's better to have roo stream them, and this gem does not stream.
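For what it's worth, plain CSVs can be streamed with Ruby's own CSV.foreach, which reads one row at a time; a minimal sketch (file name assumed):

require "csv"

# CSV.foreach reads the file row by row, so memory use stays flat even for
# very large files, unlike CSV.read, which loads everything at once.
CSV.foreach("huge_export.csv", headers: true) do |row|
  # process each row as it is read
end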

Those four episodes are great!

Hey Chris, I made it all the way through and implemented this in EntityCloud, but my only issue seems to be that no matter which row in the CSV my error is on, $. prints out "1". Any idea why this is not printing the correct line number? If there are multiple rows with errors, they all print out as "1". Thanks!

Apparently we could use "$." for this before Ruby 2.6, but not anymore. The best alternative I've found is the following:

require 'csv'

# with_index(1) yields a 1-based line number alongside each parsed row
CSV.foreach('example.csv', headers: true).with_index(1) do |row, index|
  puts index
end

What is the best approach to deal with CSV headers that don't match 1:1 with the model's columns?

I used Structs to build a hash, but I'm thinking of refactoring to ActiveModel::Model.

Any suggestions?

Once you have the row as a Ruby hash, you can use transform_keys to rename the CSV headers (keys) to the names you want.
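A minimal sketch of that idea; the header names, the mapping, and the User model are assumptions for illustration:

require "csv"

# Maps headers as they appear in the CSV file to the model's column names.
HEADER_MAP = {
  "Full Name" => "name",
  "E-mail"    => "email"
}.freeze

CSV.foreach("users.csv", headers: true) do |row|
  attributes = row.to_h.transform_keys { |key| HEADER_MAP.fetch(key, key) }
  User.new(attributes) # keys now match the model's columns
end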

That's awesome 🙌

But I still need to do some value casting, like 'Yes' to true.

Using transform_keys would work if it weren't for that. Thanks a lot, Chris!
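One illustrative way to layer that casting on top of the same hash (the TRUTHY list and the boolean column names are assumptions):

require "csv"

TRUTHY = %w[yes y true 1].freeze
BOOLEAN_COLUMNS = %w[active].freeze # columns that hold Yes/No values

CSV.foreach("users.csv", headers: true) do |row|
  attributes = row.to_h.map do |key, value|
    value = TRUTHY.include?(value.to_s.downcase) if BOOLEAN_COLUMNS.include?(key)
    [key, value]
  end.to_h
  # attributes now holds real booleans instead of "Yes"/"No" strings
end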
