Activity
Posted in Cookies vs token for authentication
Typically people will store JSON Web Tokens in localStorage so they're persisted across requests in the browser. This has the (somewhat major) downside that tokens can be stolen by any JavaScript that runs on the page, which is why session cookies are still the right answer for 95%+ of apps. You're far less likely to screw up the security of sessions with the traditional approach.
Token authentication works best for mobile apps because you aren't running anyone else's code there. You can use the native libraries to save the token in an encrypted store accessible only by the app and know your token is secure. On the web, it's not so easy.
Here's some good food for thought on JWTs:
From this article: http://cryto.net/~joepie91/blog/2016/06/19/stop-using-jwt-for-sessions-part-2-why-your-solution-doesnt-work/
Posted in Setup Windows 10 Discussion
Go for it. Like it says at the top of the post, this is for documenting it for Bash on Windows, not other approaches like VirtualBox.
Well, the real worst-case scenario is that your JWT is out in the wild and still valid; then someone just changes your account details to their own and hijacks the account from you without you knowing.
Obviously, letting people know that your email address has an account is not great, but it's no different than someone attempting to register a new account with your email and it saying "This email address has already been taken".
Posted in Setup Windows 10 Discussion
Nope, wrote it myself, but things are constantly changing with Bash on Windows so these instructions break often.
That's not much to worry about because if someone gets your token, they have full access to your account and can do anything they want. You've got much bigger problems in that situation because your JWTs should never be exposed.
Hmm, weird. They changed that to be the default URL in Rails 5, I believe, so it shouldn't be necessary. The project I'm using on 5.0.1 doesn't have it and it finds ActionCable just fine. Not sure what's up there.
Posted in Rails Application Structure | GoRails
When you configure your database for the first time, you're asked what user and password own it. This is how it gets protected. Almost all services that configure the database for you, like Heroku, will have generated a username and password for it. They're inside the URL at the beginning, before the @ and separated by a :. You just use the URL they give you, but when you're using your own local database, you must use the username and password you set while setting up the database. You almost always want a password to lock it down so no random people can access your data.
For your own database, you'll just modify database.yml to use those separate username and password parameters instead of a url, since the formatting is easier.
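If it helps to see exactly where those pieces live, here's a quick sketch (the URL itself is made up) pulling apart a DATABASE_URL with Ruby's URI:

```ruby
require "uri"

# A made-up URL in the same shape Heroku generates
url = URI.parse("postgres://myuser:s3cretpass@db.example.com:5432/myapp_production")

url.user     # => "myuser"         (the part before the :)
url.password # => "s3cretpass"     (between the : and the @)
url.host     # => "db.example.com"
url.path     # => "/myapp_production"
```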
Hey Aime,
I'd probably recommend making a cron job for this case instead of scheduling jobs ahead of time. This way you can write a worker that runs every day or so, and checks for events that happen in the next 24 hours and sends out those email reminders.
By doing this you don't have to keep track of which events were canceled or rescheduled, and you're always looking at accurate data when the job runs so you don't have to worry about any of the cancellations or reschedules.
I made an episode on the whenever gem that you can use to build cron jobs that you might like to watch: https://gorails.com/episode...
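Just as a rough sketch of what I mean (the Event model, starts_at column, attendees association, and mailer here are all placeholder names for whatever your app actually uses):

```ruby
# config/schedule.rb -- the whenever gem's DSL
every 1.day, at: "8:00 am" do
  runner "EventReminderJob.perform_now"
end

# app/jobs/event_reminder_job.rb
class EventReminderJob < ApplicationJob
  def perform
    # Query fresh data on every run, so cancellations and reschedules just fall out naturally
    Event.where(starts_at: Time.current..24.hours.from_now).find_each do |event|
      event.attendees.find_each do |attendee|
        ReminderMailer.upcoming_event(attendee, event).deliver_later
      end
    end
  end
end
```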
Posted in Setup MacOS 10.10 Yosemite Discussion
As long as you followed all these instructions you should be fine. It sounds like you overwrote files inside a Rails app, which isn't a problem. You can always try creating a new Rails app and seeing if that works. Then you'll know you're set up correctly.
That seems like a pretty decent cross-domain solution. Since you're sharing the database between the two apps, you can verify the token is only allowed for the user it was generated for, and your expiration can be something like 30 seconds so the chance of that token leaking is very small.
You can also scope that AuthRequestsController so only admin users can access it, which gives you the same security around these tokens that devise masquerade has when it's only accessible from the admin.
Sounds like that'll work pretty nicely.
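Just to sketch what I'm picturing (the AuthRequest model, the admin check, and the Devise-style sign_in helper are all assumptions here, so adapt them to however your apps are set up):

```ruby
# Both apps share the database, so both can see this table
class AuthRequest < ApplicationRecord
  belongs_to :user
  has_secure_token   # fills in a random value for the `token` column

  scope :active, -> { where("created_at > ?", 30.seconds.ago) }
end

# App A: only admins can mint tokens
class AuthRequestsController < ApplicationController
  before_action :require_admin!   # your own admin check

  def create
    auth_request = AuthRequest.create!(user: User.find(params[:user_id]))
    redirect_to "https://other-app.example.com/token_sign_in?token=#{auth_request.token}"
  end
end

# App B: inside whatever controller handles that incoming link
auth_request = AuthRequest.active.find_by(token: params[:token])
auth_request ? sign_in(auth_request.user) : head(:unauthorized)
```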
Hey Toshiki, this is a great question!
You're right on the tradeoffs here. Here are some thoughts I have on it:
STI is great when you have very similar models that are unlikely to change or won't have many different columns. For example, you wouldn't want to do this if you had two types of User models and each one had 15 attributes unique to it. Then you'd have 30 columns on one table and only ever use half of them at a time. That means you've got a mess of attributes you don't use, which makes for a confusing time later on.
In your case, the models are pretty similar and you don't have that many additional columns so STI is not a bad fit. Yes, you'll have a couple columns that are null, but that's not a big deal.
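For reference, the STI version ends up looking something like this (the item class names are just examples):

```ruby
# One invoice_items table with a `type` column plus every column the three models need
class InvoiceItem < ApplicationRecord
  belongs_to :invoice
end

class ServiceItem < InvoiceItem
  # uses hours/rate, leaves the product columns null
end

class ProductItem < InvoiceItem
  # uses quantity/unit_price, leaves the service columns null
end
```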
As for polymorphic associations, they've absolutely got their downsides too, the primary ones being additional complexity and less efficient querying. Since every record stores the class name inside the type column's value, you can't add foreign keys, which is very unfortunate. They're great for things like comments, where you'll often query for Movies, Actors, or other models and their associated comments, and never the reverse of starting from comments and trying to find their "commentable" records. You can index the commentable_type and commentable_id columns together to make finding a Movie and all its comments a fast query, but there's no way to enforce uniqueness or anything like that at the database level.
Another downside with the polymorphic associations is that you now have a whole bunch more tables, and you can already see it feels like overkill in this situation. As any project grows, more tables means it's harder and harder to wrap your head around how things work as time goes on.
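For comparison, the comments example above boils down to roughly this:

```ruby
class CreateComments < ActiveRecord::Migration[5.0]
  def change
    create_table :comments do |t|
      t.text :body
      # Adds commentable_type + commentable_id with a combined index,
      # but no real foreign key is possible here
      t.references :commentable, polymorphic: true, index: true
      t.timestamps
    end
  end
end

class Comment < ApplicationRecord
  belongs_to :commentable, polymorphic: true
end

class Movie < ApplicationRecord
  has_many :comments, as: :commentable
end
```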
So what I would say is this:
- If you imagine your models are hardly going to change at all in the future (it seems like they won't), then STI doesn't seem like too bad of a solution here. You've got very similar models, there are only 3 of them, and each only adds a couple unique columns.
- If there's a possibility you might be adding more columns or several new Invoice Item models...then you might be better off going with polymorphic associations instead so you don't have one bloated table with 60 columns, most of them null values.
Then the real conclusion here I would probably give you is: start simple and just pick one that seems most intuitive to you right now.
You honestly can't go too wrong with either approach here. The one thing that you're trying to optimize for is the future. You can have a best guess as to what will change in the future, so use the path that seems most in line with that. If something changes and it turns out the solution you chose wasn't ideal, then you can always write migrations to move data into another structure and change your models along with it.
You will always have that option to reorganize your data so even if you pick the wrong solution, you can always fix it and refactor.
Great stuff! I would do the same thing. Since I hadn't paid too close attention that you were using a gem, this makes more sense for that situation. My bad! :)
You'll find out pretty quick, and sometimes you'll get exceptions you want to ignore, like a bot typing weird crap into a URL that Rails can't parse. There are always those odd things to deal with. I've been really happy with it on similar apps where I want to know if something goes wrong, but it definitely doesn't justify paying for or running another service.
Let me know how it goes! And now I really want to check out Errbit myself...
I haven't used Errbit, but I've deployed exception_notification to like 30 apps and it's always worked fantastically. Errbit looks cool, but doesn't it require you to run a separate app?
The nice part about exception_notification is that it's super simple and can post to email or Slack or whatever is convenient for you without much setup. The downside is it doesn't collect messages anywhere, so you'd have to track them inside Slack or email, which can be a fairly big drawback if you have to manage a lot of exceptions.
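For reference, the setup is roughly this, going off the gem's README (the addresses and webhook URL are placeholders):

```ruby
# config/environments/production.rb (or an initializer)
Rails.application.config.middleware.use ExceptionNotification::Rack,
  email: {
    email_prefix: "[MyApp ERROR] ",
    sender_address: %("Notifier" <notifier@example.com>),
    exception_recipients: %w[you@example.com]
  },
  slack: {
    webhook_url: "https://hooks.slack.com/services/XXX/XXX/XXX",
    channel: "#exceptions"
  }
```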
One of the downsides of doing that is you're faking out objects so not all your code is fully tested there. VCR would only fake out the HTTP response and you could fully test all of your code instead. Sometimes that can save you from some future troubles where a mock object wouldn't.
On the test double side, I've used rspec-mocks before and it worked pretty well. For most of what I was doing, I just followed the readme: https://github.com/rspec/rspec-mocks#test-doubles
Basically creating the double and then passing that object into the classes I wanted to test. Then you can expect certain methods to be called and mock out their return values, which is pretty convenient.
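Something along these lines (the Checkout and PaymentGateway names are just for illustration):

```ruby
RSpec.describe Checkout do
  it "charges the gateway for the order total" do
    # A double standing in for the real gateway object
    gateway = double("PaymentGateway")
    expect(gateway).to receive(:charge).with(1000).and_return(true)

    checkout = Checkout.new(gateway: gateway)
    checkout.complete(total: 1000)
  end
end
```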
Those are what can get you into trouble if your object's API changes in the app but you forget to update your test doubles to match. That's why I usually recommend using them as little as you can, so you know you're always operating on the exact same objects the app will have.
One thing that comes to mind here is actually the VCR gem. You can use it to record network requests and save them so that your tests can just replay a real request. It sounds like you're describing something that might be a perfect fit for VCR: https://github.com/vcr/vcr
Have you looked into that at all?
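The basic setup is pretty small. Something like this (the cassette name and the API being hit are just placeholders):

```ruby
# spec/support/vcr.rb
require "vcr"

VCR.configure do |c|
  c.cassette_library_dir = "spec/cassettes"
  c.hook_into :webmock
end

# In a spec, the first run records the real HTTP response and later runs replay it
require "net/http"

it "fetches the forecast" do
  VCR.use_cassette("weather_api") do
    response = Net::HTTP.get(URI("https://api.example.com/forecast"))
    expect(response).to include("temperature")
  end
end
```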
I was going to say, I don't think you can do that because ActionCable doesn't go through Rails controllers. You'd want to set self.current_user in the connection class using whatever logic finds the user from the session.
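For reference, that lives in the connection class. This assumes you're setting a signed :user_id cookie at login (Devise doesn't do that out of the box), so adjust the lookup to however your app stores the session:

```ruby
# app/channels/application_cable/connection.rb
module ApplicationCable
  class Connection < ActionCable::Connection::Base
    identified_by :current_user

    def connect
      self.current_user = find_verified_user
    end

    private

    def find_verified_user
      if (user = User.find_by(id: cookies.signed[:user_id]))
        user
      else
        reject_unauthorized_connection
      end
    end
  end
end
```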
What was your solution? I'm sure other people would love to know it as well!
Posted in rails5 + heroku + cloudfront + fonts
Awesome post, thanks for sharing Michael! :D I need to do an episode on Cloudfront sometime.
Posted in Search all models gem
Like Andrew said, Searchkick is perfect for this. Each model gets an index inside ElasticSearch and then searchkick lets you query over those by simply doing this: https://github.com/ankane/searchkick#multiple-indices All that should make sense after watching the screencast. 😎
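Roughly, with placeholder models, it looks like this (the models: option is what that README section shows for searching across indices):

```ruby
# Each model indexes itself into Elasticsearch...
class Movie < ApplicationRecord
  searchkick
end

class Actor < ApplicationRecord
  searchkick
end

# ...then one query can span both indices
results = Searchkick.search("tom hanks", models: [Movie, Actor])
results.each { |record| puts "#{record.class.name}: #{record.id}" }
```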
Ransack isn't quite the same in that it doesn't index your records in a way that lets you search multiple models together. You'd actually have to search each table separately and then combine the results in your view. It's nothing fancy since it just does database queries, which makes it really easy to implement without any separate services, but it's more rudimentary in a sense.
Posted in Freelance advice
This is a great question Karim. :D Here's some thoughts from when I was consulting:
- The most important thing for me was to just get work at first. Any work at all would lead me to more people and build up my portfolio. Over time I'd "trade up" and get better and better jobs, ignoring a lot of the ones I probably would have picked up before that were bigger time sinks than I wanted.
- I actually just emailed a random guy on a job board and got my first consulting project that way. The majority of my work afterwards was actually from spending a lot of time in the startup community in town. All kinds of people want to start a business and need a developer to help them build it. There is always a lot of work available if you're around those people.
- I'd charge pretty much any rate you feel comfortable with. The ideal is to know you can deliver on their project for a fixed cost, but that's rarely possible to do well; too much changes in most projects, so to stay safe you can always charge hourly. To make things a bit easier on yourself, consider charging per day or per week, where they have to book either a full day or a full week so you're not losing out. You'll do well to find out the client's budget and then work with that (see what you think you can deliver for it; it might not be the full project if the budget is too small, but you can help them start). Raise your rate by $5/hr or so with each new client as you get more experienced.
- Working on lots of different projects was great for me to keep improving skill-wise. And then for each new client I'd try different things. I went from hourly to daily to weekly and then tried some fixed-fee projects, so with each new client I'd tweak one piece of the way I worked to see how much it improved my life and whether I liked it or not.
- I didn't really do any marketing because the best clients I got were from referrals, so I focused on that. Good work rarely showed up randomly, so skipping marketing worked out nicely for me. A lot of people will do marketing for this, but you'll need to do something totally unique to stand out from all the other developers who claim they can "take your ideas and make them real" or any of the thousands of variations every consultant's website has. 😛
Happy to answer some more questions on this and I would love to hear about other people's experiences. I spent like 7 years consulting and made a really good living doing it.