Chris Oliver

291,530 Experience
86 Lessons Completed
296 Questions Solved

Activity

Posted in Disaster recover plan?!

Yeah, that's what I was thinking as a first version. 🙌

Posted in Disaster recover plan?!

Definitely agreed. That's on the todo list, but there are a few things that are higher priority right now. Plus there are SO many configuration options for the Backup gem that it'd be hard to verify they're all correct and test them in the UI.

Posted in Disaster recover plan?!

This is a great question Karim! Exciting to get your app in production for a customer! 🍻

So a couple things to consider here:

  1. Database backups are safest when you upload them somewhere third-party like S3. You can edit your Backup config on Hatch via SSH to use your S3 credentials for uploads (see the config sketch after this list). That will keep them safe in case anything happens to your server.
  2. DigitalOcean backups are handy for quickly restoring a server. You can create a new server based on your previous one in a minute or two this way. If you didn't have those, you'd need to set up a new server with Hatch, which takes a few minutes. So they can speed recovery up a little bit, which is good.
  3. You'll want to make sure that you have practiced database restoration at least once before you go to production to be safe. Download one of the db backups from your app to your local computer and try restoring it to Postgres locally. This command should import the backup into a database: psql -U <username> -d <dbname> -1 -f <filename>.sql Once it's imported, you can check that all the same data is available locally as it was on your server.
  4. If you have file uploads that are stored locally, you'll want to include those in your backup and restore process. If they're uploading to S3, you don't have to worry about it.
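
As a reference point, the S3 piece of a Backup config looks roughly like the sketch below. This is only a sketch: the trigger name, database credentials, bucket, and region are placeholders, and it's worth double-checking the option names against the Backup gem's docs.

    # A sketch of a Backup model config that dumps Postgres and uploads to S3.
    # All names and credentials here are placeholders.
    Model.new(:my_app_backup, 'Nightly Postgres backup') do
      database PostgreSQL do |db|
        db.name     = "my_app_production"
        db.username = "deploy"
        db.password = ENV["DB_PASSWORD"]
      end

      store_with S3 do |s3|
        s3.access_key_id     = ENV["AWS_ACCESS_KEY_ID"]
        s3.secret_access_key = ENV["AWS_SECRET_ACCESS_KEY"]
        s3.region            = "us-east-1"
        s3.bucket            = "my-app-backups"
        s3.path              = "/postgres"
      end

      compress_with Gzip
    end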

That's most of it. You should also set up the Backup config to notify you on Slack or email when a backup fails (something like the notifier sketch below). If anything goes wrong, you'll want to address it right away.
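
For the notifications, the notifier goes in the same Model.new block. Roughly something like this, with a placeholder webhook URL and channel (check the Backup gem docs for the exact option names):

    # Inside the same Model.new block: only ping Slack when something goes wrong.
    notify_by Slack do |slack|
      slack.on_success  = false
      slack.on_warning  = true
      slack.on_failure  = true
      slack.webhook_url = "https://hooks.slack.com/services/REPLACE_ME"
      slack.channel     = "#backups"
    end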

Hey Liam!

So I use Janus for Vim. I'm guessing it's a config option Janus sets by default, if yours behaves differently. They ship a lot of good defaults, and I originally used it just to replicate my workflow from Sublime in Vim. I actually don't even know all the details of how they configure everything since it all worked out really nicely from a fresh install haha.

Maybe you can find their NERDTree config in the repo or you might like to just try out Janus itself.

Posted in Do I need rails-ujs and jquery_ujs?

I did an episode on the new Rails UJS library. It replaces everything jquery_ujs does, just without requiring jQuery: https://gorails.com/episodes/rails-ujs-primer

Feel free to add jQuery in, but you won't need the jquery_ujs library. Everything will still function exactly the same as before; the Rails team just wanted to reduce dependencies.
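
If it helps, on Rails 5.1+ the change usually boils down to something like this. It's a sketch under the assumption that you're on the asset pipeline; the gems and manifest lines below are just an example:

    # Gemfile: rails-ujs ships with Rails 5.1+, so jquery_ujs isn't needed.
    # Keep jquery-rails only if your own code still uses jQuery.
    gem "rails", "~> 5.1"
    gem "jquery-rails" # optional

    # app/assets/javascripts/application.js (shown here as comments):
    #   //= require rails-ujs   <- replaces //= require jquery_ujs
    #   //= require jquery      <- only if you kept jquery-rails above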

Very cool. Like the built-in Rails mailer previews but with a UI. http://guides.rubyonrails.o...

Since ActiveJob is just a wrapper around whatever background processor you use, all you have to do is write your code against ActiveJob. Your mailers already support it, so to send emails in the background you can just say "UserMailer.notification(user).deliver_later" and the deliver_later call will create a job for it.

As for everything else, like notifications and CSV downloads, you'd just write the code in a job and have your controllers start the job (something like the sketch below). Nothing too fancy.
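
Here's a rough sketch of that pattern. The class names (CsvExportJob, ReportMailer, Reports::CsvBuilder) are all made up for the example; swap in whatever your app actually does:

    # app/jobs/csv_export_job.rb
    class CsvExportJob < ApplicationJob
      queue_as :default

      def perform(user)
        # Build the CSV (hypothetical service object), then email it, also in the background.
        csv = Reports::CsvBuilder.new(user).to_csv
        ReportMailer.export_ready(user, csv).deliver_later
      end
    end

    # app/controllers/reports_controller.rb
    class ReportsController < ApplicationController
      def create
        # Enqueue the work and return right away.
        CsvExportJob.perform_later(current_user)
        redirect_to reports_path, notice: "Your export is being generated."
      end
    end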

Is there anything specific I could cover for you that would help wrap your head around it?

I think most services confirm your email, but I don't know for sure if they force you to do that before using OAuth. It's definitely something to think about. Automatic merging is probably okay as long as you can trust that those accounts were already confirmed. I'm just not 100% sure that they are. I would guess that every service is different, so it's probably safest not to auto-merge. If you find out more details on that, let us know!

Posted in What do you like to do away from the computer?

Roadtrippppps. 🚗

Posted in Hatch / Updates

Hey Louis-Philippe!

The goal is certainly to help you upgrade Postgres and Redis. One of the tricky bits is that your code will often be affected by the upgrade, so I don't want to accidentally break your apps.

Once there's a major version update, I'm going to be working on a script to help you do the migration as seamlessly as possible. The process is usually pretty simple for Postgres where you only have to run a couple commands to stop the old database, migrate it to the new version, and start the new version. You can see the process here: https://gist.github.com/delameko/bd3aa2a54a15c50c723f0eef8f583a44

Redis is similar, although it's probably easier to upgrade than Postgres, which is great.

Hatch won't take over full control of your servers like Heroku does, so you're free to upgrade things at any time or make changes as you like. At the end of the day, I want you to have full freedom to run what you want and have Hatch just make your life a bit easier so you're not dealing with the hassle of server management constantly.

Hey Karim! Excited you're moving to Hatch!

Jack's answers are all spot on.

  1. There's a gem called Backup that Hatch helps you set up for each app. You'll need to log in to your server to edit the config if you want the backups to upload to S3 or something, but it's really easy to use.
  2. I haven't used delayed_job on Hatch yet myself, but I believe someone already has. The important thing will be to add these commands to the deploy script (and make sure you've got the daemons gem installed): https://github.com/collectiveidea/delayed_job#running-jobs If you run into any trouble with this, let me know and I'll help you get it sorted. I generally recommend Sidekiq over delayed_job, resque, etc. these days, so this feature is a little less tested.
  3. Like Jack mentioned, RAM is going to be the most important thing here. You'll generally want a 1GB or 2GB server. If you don't know, you can start with a 1GB server and DigitalOcean makes it super easy to migrate up to a larger size if you start getting memory errors.
  4. Hatch has two cool features around logs. First, you can view the logs in the web UI by clicking the "Rails Logs" tab of your app. It will retrieve the logs for you and show roughly the last 300 lines. Second, because logs can get unwieldy, your logs on the server are rotated daily so that you don't run out of disk space. Every day they get compressed and labeled with the date, so if you ever need to dig back into the logs from a couple of days ago, you can SSH in and find them.
  5. A little bash script or alias will do the trick for you like Jack mentioned. This would be nice to add to Hatch's UI so you could copy and paste that into your Bash or ZSH config. 👍

Posted in How to redirect batch of urls from routes.rb

I like that approach. You could certainly simplify your routes file, and that would take care of things in a much nicer way. Loading the YAML file in an initializer and looping through it with some Ruby in the routes file should do the trick (see the sketch below). And it's super flexible, so you can easily change the format or whatever you like in the future.
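
Something along these lines, assuming a config/redirects.yml file that maps old paths to new ones (the file name and constant are just placeholders):

    # config/redirects.yml
    #   /old-page: /new-page
    #   /old-blog/foo: /blog/foo

    # config/initializers/redirects.rb
    REDIRECTS = YAML.load_file(Rails.root.join("config", "redirects.yml"))

    # config/routes.rb
    Rails.application.routes.draw do
      REDIRECTS.each do |old_path, new_path|
        get old_path, to: redirect(new_path, status: 301)
      end
    end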

Posted in Using will_paginate and .limit(x) together.

That works. The other thing you can do is look at @articles.total_entries with will_paginate, I believe, which gives you the count of all the results, not just the current page. It's total_entries in will_paginate or total_count in Kaminari.
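
A quick sketch, with a made-up query just for illustration:

    @articles = Article.order(created_at: :desc).paginate(page: params[:page], per_page: 10)
    @articles.total_entries # will_paginate: count of all matching records, not just this page
    # Kaminari equivalent: Article.page(params[:page]).per(10).total_count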

Your solution is simple enough though so I would stick with that. 👍

Posted in Using will_paginate and .limit(x) together.

You're definitely doing something that is outside the scope of most pagination gems. What I would probably do is have the controller verify the page number is between 1 and 5 so users can't go outside those boundaries, and customize the view template so it only shows links for those 5 pages. Visually, that means users only ever see the first X items (see the sketch below).
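
Roughly, the controller side could look like this. The model, per-page count, and the 1..5 range are placeholders for whatever you're actually using:

    class ArticlesController < ApplicationController
      MAX_PAGES = 5

      def index
        # Clamp the requested page into the allowed range before paginating.
        page = params.fetch(:page, 1).to_i.clamp(1, MAX_PAGES)
        @articles = Article.order(created_at: :desc).paginate(page: page, per_page: 10)
      end
    end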

What does your solution look like right now?

Posted in Using will_paginate and .limit(x) together.

Hey Simon,

One super important thing here is that pagination uses both limit and order under the hood to build pages. When you add another limit on top of that, you're likely to break pagination.

Are you trying to limit the number of pages displayed?

Posted in Just want to say thanks!

Congrats on releasing your app Adrian, that's incredible! I can't wait to check it out. Looks really well done.

Thanks so much for the kind words guys. Couldn't make me happier to see that my little screencasts have helped so much. You guys are awesome. 🍻

Hah, good catch. Thanks for the heads up on that.

Posted in Direct File Uploads to S3: Part 3 Discussion

The cache directory is exactly what you would imagine. When you're uploading a file with a form and the form validation fails, you want to keep a cached copy of that file temporarily until the user completes the form. You don't want to permanently store the file since the form wasn't valid, so you just need it temporarily stored somewhere. Hence the cache directory. When the validations pass, the file is simply copied from cache to store rather than re-uploading the file.

Posted in How do you charge for building web app projects?

Yeah, the proof of concept part helped me for two reasons:

  1. It greases the wheels. The clients know you're not trying to gouge them for all they've got.
  2. You get to back out early if the client isn't good to work with.

Roughly, my daily and weekly pricing was based on hourly pricing. Say $100/hr, which equates to $4k for a 40-hour week. Chances are you're not going to be booked every single week of the year, so the weekly price is high enough to account for the time you spend on sales, plus taxes and other expenses you may have. If all goes well, you should be making a six-figure salary doing that.

The daily and weekly stuff was usually for ongoing projects. Say a couple months down the line, after finishing a project, the client comes back with some maintenance problems or revisions. This pricing makes it fairly easy to say okay cool, I'll tackle those things and it'll cost $X per day. It worked best when you couldn't quite judge how much time something was going to take, which is where a flat fee is riskier.

You're definitely right that picking a number for how much you want to make is the best way to decide what to charge. I think the easiest way to break things down is to work backwards. If you want to make $100k/yr (keep taxes in mind), you're going to want to make roughly $8,300 a month. That's $2,075 a week, $415 a day, or roughly $50/hr. Things you have to pay for yourself that an employer would normally cover (health insurance, self-employment taxes, office, internet, computer, etc.) push that number up, so you'll want to add those into the calculation. It's also good to budget for some new computer equipment each year, and so on, so tack those on top as well.
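
Spelling that arithmetic out (rounded, and assuming roughly 4 billable weeks a month, 5 days a week, 8 hours a day):

    target_per_year = 100_000
    per_month = target_per_year / 12.0 # ~ $8,333 (the $8,300 above)
    per_week  = per_month / 4          # ~ $2,083
    per_day   = per_week / 5           # ~ $417
    per_hour  = per_day / 8            # ~ $52, i.e. roughly $50/hr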

Another thing to think about is that oftentimes you're going to be in one of two modes: sales mode or execution mode. Usually you won't be landing many new projects while you're doing work for someone. I generally got new projects when I was nearing the end of a project and taking more time out to go to meetups and talk with leads. When I had a project I was usually heads down working, so things tended to go in phases to keep the pipeline of work going. Referrals are always the best source, but you have to do really good work to earn them, and your clients need to know other potential clients.

And of course, you have to actually be worth the prices you want to charge. It's not hard to work your rates up with experience, but a junior dev trying to charge $100/hr is going to struggle to convince clients they're worth it. That's a lot of money. Flip it around: if you were paying someone $100/hr to build one of your apps, you'd want to make sure they aren't slow or mediocre at what they do. The way I dealt with that was starting out at $25/hr, taking on tons of work, and raising my rate by $5/hr or so with each new client. My slowness was accounted for (or I just didn't bill some hours), the clients were happy, and while I didn't make as much money as I wanted, it was exactly what I needed to work my way up to higher pricing.

This makes me want to do like a course on consulting or something. I think that would be really fun to do.