You can improve the security of your Go web application by cryptographically signing cookies using sugarcookie, a small library for signing cookie values so that you can verify and trust them on a user's return visit. This gives you strong trust and lets you decouple the cookie from a session on a specific server.
The signed cookie consists of a secret value, a timestamp, and a unique identifier. These are one-way hashed together, and the hash is then concatenated with the timestamp and unique identifier. Everything is then base64 encoded, making it suitable for storing in a cookie.
You verify that the cookie is valid by replaying the hash with the secret and the supplied timestamp and unique ID.
Including the time as part of the hash means you can use it as a secondary mechanism to invalidate the cookie. If it's too old, say more than half an hour, you know it's no longer good; no need to test the hash, just invalidate it.
Death of a thousand cuts, simple and deadly. The victim is bound and cut slowly, with small cuts. It's excruciatingly painful and takes a long time for the victim to die. The torture is far removed from our civil society, and it may be crass to use it as a description here, but it fits. Using tools and services, we make decisions about what to use and when. It's a balancing act between the service provider asking too much versus what we give up.
So it is with Feedly. They are taking too much, throwing our relationship out of balance. I signed up for Feedly after Google Reader shut down. I gave it a shot because of its good cross-platform support and the ability to add feeds in the app on my phone.
It might be that their service can't survive without the click-jacking, but then maybe it shouldn't exist. So, while I don't pay for the service, I'm doing the only moral thing I can: canceling my account and moving on to a different service.
My web development workflow in Go is not much different than how I used to work with PHP, but it is slowly shifting in subtle ways.
When I first started learning how to program, using PHP years ago, I would work in short feedback loops. I would hit F5 to refresh my work all the time. Often with the smallest change I would refresh just to see what would happen. Such a short feedback loop helped me learn specifically what each little change would do and how it would affect my website. As I progressed in learning I would code longer between F5 refreshes.
In PHP I work in a short feedback loop. I look at the outcome requirements and plan a way to get there. What code will I have to update, what new code will I have to write, and how will they interact? For larger blocks of code I plan how to modularize them and what their inputs and outputs are, then I code up a working example as fast as possible.
Once any new business logic code has been completed I wire it into the front end. I code, tweak, and adjust, then F5 to see my website update. I love seeing how my application shifts and morphs as I refresh and refresh while I update code. The short feedback cycle helps guide me. The positive and negative emotions I feel as things work or fail nudge me ever so slightly in the right direction.
With Go my feedback loop isn't too different. I develop using vim and tmux. I keep my compile-and-run command in window 0 and edit code in the rest. I have Apache set up to proxy requests for non-existent files to localhost, where my Go binary is listening. DEV listens on a different port than LIVE.
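Roughly, the relevant Apache fragment looks like this. It's a sketch, not my actual config: the port is a placeholder, and it assumes mod_rewrite and mod_proxy are enabled.

```apache
# Hand anything that doesn't match a real file on disk to the Go binary.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^/(.*)$ http://127.0.0.1:8080/$1 [P]
ProxyPassReverse / http://127.0.0.1:8080/
```

Pointing the DEV vhost at a different backend port than LIVE means both can run on the same box.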
With Go I have the same requirements process as with PHP. I start coding on the core functionality that is changing the most. Then I move onto wiring it into the existing functionality.
I don't refresh in Go as often as I do with PHP, but it's just about as easy. I switch to tmux window 0, ^C to kill my running process, then go install, and run the new binary. Then F5 in my browser and I'm looking at my updated changes.
Most of the time Go is done compiling by the time I switch to my browser and before I even have a chance to refresh.
The feedback loop is not much longer than when using PHP. There is the extra step of killing and recompiling the binary, but it doesn't slow me down. Which is why I find it curious that I use this refresh/feedback loop less and less. With Go I find that I plan more and wait longer between feedback refreshes. There are also times when all the feedback I want is to see whether the binary will compile.
The similarities in feedback loop between Go and PHP helped me become comfortable with learning Go. I was able to see my changes as fast as I had before, which helped me learn the eccentricities of Go. Without having to greatly modify my development process I was able to learn Go.
I have just recently switched my backend database from MySQL to Postgres. I switched for political and technical reasons.
Oracle now owns MySQL. They acquired it as part of their purchase of Sun Microsystems, which had bought it outright when it acquired MySQL AB. Oracle owning MySQL is a bad thing.
It is in Oracle's best interest to keep MySQL from growing and progressing. Oracle's money maker is the Oracle Database. They make a ton of money selling it and they make a ton of money supporting it. During the Internet boom MySQL was an alternative to Oracle's database on the low end. Small projects and poor web developers could install it and have access to a database that, while nowhere near as powerful as Oracle, was powerful enough.
MySQL AB made money by supporting and developing MySQL. Now that Oracle owns it they don't need the money that MySQL would bring in. In fact they would be better off financially if they upsold existing MySQL users into Oracle database contracts.
Under Oracle's stewardship MySQL has stagnated: fewer features and fewer bug fixes. Many of the original MySQL developers have left since Oracle took ownership and formed their own companies pushing their own versions of MySQL.
I don't trust Oracle to nurture MySQL. I don't trust them to fix bugs, add features or move MySQL forward. To me MySQL is now a dead end.
The migration process
Migrating to Postgres was more troublesome than I would have liked. There is no one-to-one export/import process. MySQL uses a few non-standard data types and syntax constructs, which makes it impossible to import data directly from a data dump, even if you are using "--compatible=postgresql".
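For example, a table definition from a MySQL dump typically needs hand edits like these before Postgres will accept it (the table itself is a made-up illustration, not one of my CMS tables):

```sql
-- What a MySQL dump emits, even with --compatible=postgresql:
CREATE TABLE `posts` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `published` tinyint(1) NOT NULL DEFAULT '0',
  PRIMARY KEY (`id`)
) ENGINE=InnoDB;

-- The hand-edited equivalent Postgres expects:
CREATE TABLE posts (
  id serial PRIMARY KEY,
  published boolean NOT NULL DEFAULT false
);
```

Backtick quoting, AUTO_INCREMENT, tinyint(1)-as-boolean, and the ENGINE clause are the usual offenders.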
Instead I narrowed down the old CMS's tables to the ones I specifically needed. Then I dumped them and hand edited them, making them compatible with Postgres.
Then the import was easy. It just took longer than I would have liked.
Overall I'm happy with the change. Getting familiar with Postgres has been easy, if not without a couple of moments where I had to dig into the documentation. I feel more secure with Postgres than with MySQL because of its independent nature, and I'm comfortable with its technological underpinnings.