A question came across the Drupal Developer's list today asking whether Drupal could auto-update itself, like WordPress. As someone who thinks about security a lot, the very thought of this horrifies me.

It's a bad idea for several reasons, but here's the biggest:

It could easily lead to the biggest, most powerful botnet on the planet.

This could just as easily happen to WordPress, too. It already has, in fact, to a small extent.

See, here's the thing. Criminals are lazy. They want the biggest possible payoff for the smallest amount of work. A single site that can be compromised to take control of millions of web sites -- well, that's an awfully big payoff for a pretty easy attack.

And here is one Achilles' heel of open source -- the crown jewels are out there in public, with minimal security.

Think of Wikipedia. On the whole, it has become the largest, best source of human knowledge out there. But if somebody wants to trick you into believing something, all they have to do is modify a single page, and get you to read it (and believe it) before somebody changes it back.

The repository of an open source project is a bit better protected -- access to make these kinds of changes is only given to trusted individuals, at least for core parts of the program. But that just means it's up to the security practices of those individuals -- if an attacker can get the credentials of one of those users, they can get malicious code into the repository. It has happened to several large open source projects in addition to WordPress -- Debian, the Free Software Foundation, and yesterday, Kernel.org, which houses the main Linux kernel.

All of these attacks were detected relatively quickly -- there are lots of ways of identifying malicious code that makes it into software repositories. However, there is some delay between the repository getting hacked and the malicious code getting identified -- a window of time during which people updating their software may download the nasty stuff.

The problem with web projects like Drupal and WordPress is that a huge number of people run them in production environments, live on the web. If a large number of these all got updated with compromised code, how many site administrators would notice? How long would it take to get millions of hijacked web sites cleaned up? If a site administrator is not savvy enough to apply updates manually -- if all they want to do is push a button to auto-update -- why is it a good idea to expect them to clean up after their site has been hacked?

This is wrong on so many levels.

So. Recognizing that site updates can be critically important for closing vulnerabilities that might otherwise lead to your site getting hacked, how do you apply updates safely? Here's our checklist:

  1. Evaluate each update to assess its priority. Does it fix something that, left unfixed, could lead to your site getting broken into today? Is that likely to happen? If so, drop everything to evaluate and rush out the fix. If not, schedule the remaining steps into your normal workflow.
  2. Wait a few days to see if others experience problems upgrading, and review new issues on the project's issue queue.
  3. Evaluate the contents of an update to see if the code changes match what's in the change log.
  4. Back up the site and database entirely, in case something goes wrong.
  5. Apply the update to a test copy of the site, and see if we can find anything that breaks.
  6. Commit the new code to source control, so we have the capability of rolling it back.
  7. Ask the client to review the updates on the test site, if the client is heavily involved with site administration.
  8. Roll out the update to production, and notify the client that it's live.
  9. Verify that the updated code operates correctly.

While you might be able to script a backup job to run before an automatic upgrade, and while you might also be able to kick off some automated tests (if you have any -- for Drupal, test support was added to core in version 7), the rest of the above checklist gets bypassed entirely by automatic updates.
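
For instance, the backup step is easy to automate. Here's a minimal sketch of a pre-update backup job -- the docroot, database name, and backup location are placeholders to adjust for your own site, and it assumes MySQL credentials are readable from ~/.my.cnf and that the backup directory already exists (outside the docroot!):

    #!/usr/bin/env python
    # Minimal pre-update backup sketch -- paths and database name are
    # placeholders, not a drop-in tool. Adjust them for your own site.
    import subprocess
    import tarfile
    from datetime import datetime

    DOCROOT = "/var/www/example.com"         # hypothetical site docroot
    BACKUP_DIR = "/var/backups/example.com"  # keep backups outside the docroot
    DB_NAME = "drupal"                       # hypothetical database name

    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")

    # Dump the database first.
    with open("%s/db-%s.sql" % (BACKUP_DIR, stamp), "w") as out:
        subprocess.check_call(["mysqldump", DB_NAME], stdout=out)

    # Then archive the entire file tree, settings.php and uploads included.
    with tarfile.open("%s/files-%s.tar.gz" % (BACKUP_DIR, stamp), "w:gz") as tar:
        tar.add(DOCROOT)

    print("Backup complete: %s" % stamp)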

Skip those checks, and sooner or later an update will break parts of your site. Even doing those checks, an update might break something -- but the code management can make tracking down the source of the problem much easier. Making it easy to apply updates is a good thing -- but it's not a replacement for keeping track of what's going on with the software you're using. Remember that if your site gets hacked, it's usually your visitors (customers?) who suffer -- currently the most common use for hacked sites is turning them into hosts for spreading viruses and malware to Windows machines. Do you really want to put your customers through that?

Have a different view? Please share in our comments below!

John, I am going to disagree with you on this one, for the following reasons:

1. You can evaluate each WordPress update before you apply it. WordPress's auto-update doesn't run by itself; you still need to click the update button manually to update WordPress.
2. You don't need to update WordPress as soon as the update is available; you can wait a few days / weeks to see if the update caused problems for other users.
3. You can still evaluate the contents of an update to see if the code changes match what's in the change log.
4. You should back up any CMS and its database before doing any update, regardless of whether you do the upgrade manually or automatically.
5. If a backup was made and the upgrade broke your website, you can restore the backup to "roll back" your website to the previous version.
6. How many webmasters really commit the update to source control? If they don't do it with a manual update like Drupal's, why would they do it with, say, WordPress?
7. Updates can be done on a test site for the client to evaluate before doing it on the live site -- both with Drupal and WordPress.
8. Roll out the update to production, notify the client that it's up -- also both with WordPress & Drupal.
9. Verify that the updated code operates correctly -- also both with WordPress & Drupal.

Wordpress' "auto update" merely downloads the update directly from the server and "installs" it directly onto the website, instead of you downloading a tar.gz file and uploading it via FTP. Most *lazy* web masters don't follow half of your suggested steps when they apply updates to any script, let alone Drupal.
Hackers could as easily infect the Drupal / Joomla / {name-your-CMS} repositories as they could with Wordpress and as many people's websites could have the infected code if they just download, unzip and upload the updates to their websites.

Hi,

Thanks for your comment! I'm not sure we disagree, though. You're pointing out that you can manage WordPress sites manually, just like Drupal, and that's definitely true (Drupal does have a few slight advantages -- the ability to remove write access to executable code entirely, and the fantastic Drush shell utility). But that's not the point I was making in this post.

The point is, auto-updates are a bad idea because they promote lazy, bad practices that will come back to bite you, sooner or later. Auto-updates are now available in Drupal, and they're just as bad an idea for Drupal as for WordPress, for exactly the same reasons.

This boils down to risk management. Most of the projects we work on have high value for their owners, and our customers recognize that putting in some protections to keep from losing everything is well worth the added cost -- if your site gets hacked and you haven't been diligent about keeping backups, what is that going to cost your business? I just think auto-updates are extremely risky if you haven't been diligent about keeping solid backups, and worse, they entice you into easy shortcuts that make you less likely to do proper backups in the first place!

Oh, and one other thing: supporting the ability to do auto-updates can itself open a security hole in the site.

These generally work either by copying over executable files as the web server user, or by connecting via FTP/SFTP on your behalf, using your credentials. If you go the web-server-user route, the web server must be able to write to the executable files that make up the application. That means any random hacker who finds a small exploit on your site can leverage it to upload code and then run it on the server itself -- turning small vulnerabilities into potentially huge ones.
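
If you want to see whether your own site has this exposure, a quick audit for web-server-writable code will tell you. A rough sketch -- it assumes a Unix host where the web server runs as www-data and a hypothetical docroot (adjust both), and it glosses over group permissions:

    #!/usr/bin/env python
    # Rough audit: flag PHP code the web server user could overwrite.
    # WEB_USER and DOCROOT are assumptions -- adjust for your own host.
    import os
    import pwd
    import stat

    DOCROOT = "/var/www/example.com"   # hypothetical docroot
    WEB_USER = "www-data"              # your web server's user may differ

    web_uid = pwd.getpwnam(WEB_USER).pw_uid

    for root, dirs, files in os.walk(DOCROOT):
        for name in files:
            if not name.endswith((".php", ".inc", ".module")):
                continue
            path = os.path.join(root, name)
            st = os.stat(path)
            # Flag world-writable files, or files owned by the web server
            # user with owner-write set (group checks omitted for brevity).
            world_writable = bool(st.st_mode & stat.S_IWOTH)
            owned_and_writable = (st.st_uid == web_uid
                                  and bool(st.st_mode & stat.S_IWUSR))
            if world_writable or owned_and_writable:
                print("web-server-writable: %s" % path)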

If you use FTP, well, that's just plain wrong to begin with, given how trivially easy it is to sniff FTP passwords when your web designer uploads new files from the local coffee shop. FTP = you don't care about the security of your web site.

And if you use SFTP but the admin pages where you enter your credentials aren't protected by SSL, you've got the same issue -- only now you're potentially exposing far more dangerous SSH credentials. (It also requires allowing password access to your servers, which we don't -- you need an approved SSH key to log into any of our production servers.)

The only way an auto-update can run within the site without exposing dangerous new exploit vectors is if you have to enter SFTP credentials every time (rather than having them stored in the site), and the admin pages you use to run the auto-updates are protected via SSL. And even then, it still requires password access to the server.

It's much easier and safer to use drush to update modules in Drupal, after you log in securely with a key -- and much better to test updates on a test copy of the site and deploy through version control.
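
That whole flow can be scripted without ever storing a password in the site. Here's a sketch of what I mean -- the test-site path and backup location are placeholders, and authentication happens over SSH with your key:

    #!/usr/bin/env python
    # Sketch of a drush + version-control update flow, run against a TEST
    # copy of the site. Paths are placeholders; no passwords are stored.
    import subprocess

    TEST_SITE = "/var/www/test.example.com"  # hypothetical test copy

    def run(*cmd):
        subprocess.check_call(cmd, cwd=TEST_SITE)

    # Snapshot the test database first, so the update can be rolled back.
    run("drush", "sql-dump", "--result-file=/var/backups/pre-update.sql")

    # Apply pending core/module updates.
    run("drush", "up", "--yes")

    # Commit the new code so it's tracked and revertible in version control.
    run("git", "add", "-A")
    run("git", "commit", "-m", "Apply module updates via drush")

    # A human still reviews the test site before anything gets deployed.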

Auto updates are for people who don't care if they entirely lose their site, and are willing to start over.

Not necessarily 100% true. Auto-updates are for those who, in some cases, would prefer breaking a site over having it compromised. The latest Drupal hack is a prime example: within hours, sites were being attacked. For many of those sites, the owners were asleep, with no idea what was going on. If automatic security updates had been in place, this could have been avoided.

1. Security updates are, in my opinion, the only thing that needs to be automatic.
2. For anything other than security, I agree waiting can be beneficial.
3.
4. This can be built into the process, and should be.
5. This could be done as well: symlinks and automated testing. Not perfect, but better than getting hacked.
6. This can easily be done in an automated process.
7. An email to the client could be triggered after the update so the client can review.
8. Same as 7.
9. Not sure how this is a problem. It's the same as if it were done manually.

For me it's not a matter of set-it-and-forget-it. It's more that security updates get done when I can't do them myself; then I can review and do what's needed. For me, fixing a broken site is much better than having (and fixing) a hacked site.

Also, there is no need to store SFTP creds to auto-update if you have Drush installed.

I do believe that, with some work, automatic security updates could be done with minimal risk.
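
Roughly what I have in mind -- just a sketch, assuming Drush is installed, the site path is adjusted, and your Drush version supports the --security-only flag:

    #!/usr/bin/env python
    # Cron-triggered, security-only update sketch. Assumes Drush is
    # installed and supports --security-only; paths are placeholders.
    import subprocess

    SITE = "/var/www/example.com"

    def drush(*args):
        subprocess.check_call(("drush",) + args, cwd=SITE)

    # Back up the database before touching anything.
    drush("sql-dump", "--result-file=/var/backups/pre-security-update.sql")

    # Apply security updates only; everything else waits for manual review.
    drush("pm-update", "--security-only", "--yes")

    # An email to the webmaster could be triggered here for review.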

Hi John,
Thanks for sharing your opinion, but it is not so easy to say whether it is bad or good to have automatic updates available.
In real life there are many, many Drupal sites which run on a very old version, or sometimes have never been updated at all.
The reason? I don't care -- it is a fact... maybe they are lazy, or don't know how it works (e.g. on shared hosting).

So it would be really cool to have the possibility to enable automatic updates in Drupal.
Anyway, there is no possibility in Drupal core to enable automatic updates, but there is a contributed module / web service which will help to auto-update Drupal. Find out more about the module here: https://www.drupal.org/project/cms_updater

I think it comes down to the question of which is worse: the prospect that malware might be distributed in Drupal updates, infecting tons of websites, or the prospect of tons of unpatched Drupal installations getting infected because the update process is either too technical or too time-consuming for someone without a budget? Either way, websites will get infected. It's just a matter of how.

I think the main point here is that applications need maintenance; there's no silver bullet. Your choice is:

  • Pay somebody or spend the time to do updates yourself, along with testing them and fixing issues
  • Risk breaking your site frequently with automatic updates, with the opportunity cost of missed business (and of even realizing that you have a broken site)
  • Risk having your site broken into by attackers, if you neglect updates.

Pick your poison!

We're working towards doing automatic updates in a very structured, revision-managed way on dev and stage sites, with automated tests after updates, and then notifying a person to review test results and sanity-check the site before rolling those updates out to production. The goal is to deliver that first option at far lower cost and greater reliability than you can find elsewhere!


"Auto updates are for people who don't care if they entirely lose their site, and are willing to start over." - ah, you mean like so many people had to do after Drupageddon, which of course was totally caused by automatic updates - oh wait, no, it was caused by relying on manual updates and most people not being able to do them within the few hours required!

Really, there are risks with either automated or manual updates - they're just different risks. And the level of risk often depends on the site and the situation.

The risk of automated updates is, as you note, that there is a small but non-zero chance that someone with bad intentions could infiltrate an open-source project, contribute malicious code, fail to have said code detected by any of the other contributors, and have it make it into an official release in spite of whatever testing and safeguards the project in question might have.

The risk of requiring manual updates is that (a) less experienced site owners install updates rarely or not at all (which in my experience is so common, with nearly all web applications, that it's really more the rule than the exception, except when an outside agency is handling the updates instead of the owner), and also that (b) in the case of some security vulnerabilities, even if an outside agency IS handling the updates and doing them diligently, they may not get to them before the attackers do (Drupalgeddon being a standout example).

Overall - I suspect the number of security problems caused by updates being done too late or not at all is MASSIVELY higher than the number caused by compromised updates. So honestly - especially after the Drupalgeddon experience, I would LOVE to see Drupal offer the option of automated updates (with manual updates still being an option for those who prefer them, of course).

Some of the risks you cite with automated updates are pretty easily mitigated with things like automated backups, not using an insecure connection, checking the site after an update, etc. And no, not everyone does those things, but not everyone does manual backups at all, so people being slackers is an issue with both approaches. But again, the likelihood of problems being caused by not doing updates at all is probably a lot higher than problems being caused by doing updates without a recent backup (though neither of those is a good idea, obviously).

Personally, I've yet to see any site I created or maintain affected by a compromised update of any web app whatsoever, and I've been a web developer for a long time. I've seen a handful of instances of functionality being accidentally broken by an update, but that's usually fixable one way or another (even by restoring from a backup if necessary). I have, however, seen more instances than I can count of sites being compromised due to updates not being done in a timely manner, so to my mind that's the bigger of the two problems.

My ideal security update workflow would be:

1. Automated daily backups (a good idea no matter how you're handling updates).
2. Automated updates (as an option, not a requirement, but I'd opt for it).
3. Notices sent to the webmaster when an automated update is scheduled to be done, and again when it's completed, so that the update can be cancelled in the event that there's a reason not to do it at that time, and the site can be checked as soon as the update is complete.

Had that been the norm with Drupal, Drupalgeddon wouldn't have happened, or at least would have happened on a much, much smaller scale.

Hi, Lynna,

Thanks for your great comment! Yes, it is a question of risk, and really my point is that automatic updates are a dangerous thing IF YOU DON'T HAVE BACKUPS. Your points are on target, so I think we largely agree...

The thing is, we've had to use our backups multiple times, in all sorts of situations. Auto-updates without having a backup in place is crazy -- and yet I think that happens routinely all over the web.

We are working towards a service that we think provides more "responsible" auto-updates. We already have much of this in place, on our maintenance plan. Basically it looks like this:

  1. Automated daily backups (we do this now, everywhere, on every copy of every site).
  2. Automatic daily updates of a dev copy of the site, with an exception list (not yet done).
  3. Automatic Behat tests run on the dev site (we do this now).
  4. Automatic deployment of updates to Stage site after tests pass (we manually trigger this now).
  5. Automatic visual regression testing between stage and production (we do this now).
  6. Notice sent to the site release manager with test results (we do this now).
  7. Release manager can trigger a production deployment or auto-schedule for a particular time (manual works now, scheduling not yet).
  8. Production deployments take a database snapshot before getting deployed, and are tagged so they can be easily rolled back (we do this now).

So our approach is to apply all updates to a dev copy and run tests; then a human can review the results and trigger the deployment to production.
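
In outline, the automated portion looks something like this. It's a simplified sketch: the dev path, the "stage" git remote, and the notification address are placeholders standing in for our real infrastructure, and the visual regression step is omitted:

    #!/usr/bin/env python
    # Simplified sketch of the dev-to-stage portion of the pipeline above.
    # DEV, the "stage" remote, and NOTIFY are placeholder assumptions.
    import subprocess
    import smtplib
    from email.mime.text import MIMEText

    DEV = "/var/www/dev.example.com"
    NOTIFY = "release-manager@example.com"

    def run(*cmd):
        # Run a command in the dev checkout and return its exit code.
        return subprocess.call(cmd, cwd=DEV)

    # Step 2: apply updates to the dev copy (exception list omitted here).
    run("drush", "up", "--yes")

    # Step 3: run the Behat suite against the dev site.
    tests_passed = run("behat") == 0

    # Step 4: only promote to stage when the tests pass.
    if tests_passed:
        run("git", "push", "stage", "master")

    # Step 6: mail the results to the release manager (assumes a local MTA).
    msg = MIMEText("Dev updates applied. Tests passed: %s" % tests_passed)
    msg["Subject"] = "Automated update results"
    msg["From"] = "updates@example.com"
    msg["To"] = NOTIFY
    smtplib.SMTP("localhost").sendmail(msg["From"], [NOTIFY], msg.as_string())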

This involves quite a bit more infrastructure than a typical site owner has, but we think it strikes a great balance: prompt application of critical patches, while mitigating the risks associated with updates.

We have recovered 7-8 sites from Drupalgeddon specifically, and a couple dozen in all from various hacked states. None were compromised on our watch -- these all ended up becoming our customers!

Cheers, John
