Dangers of Self-Managed Development Environments

Wednesday, November 03, 2010

Jamie Adams


Like most of my peers in this industry, I started out as a programmer.

Many remained as programmers for several years until morphing into managers. Others became system administrators of development or production systems.

Over my twenty-two-plus-year career, I've worked as a developer, system administrator, and architect, and in various leadership and managerial roles.

Today I've come full circle, back to my roots as a principal engineer.

Over the years I have seen the discipline of software development grow in many directions to include new programming languages, associated processes, software deployment methodologies, and software development environments.

The trend that concerns me is the ease with which developers can set up their own development environments to perform unit-level development.

Developers can easily stand up their own Linux desktop or virtualized environments and begin to develop software. Then they can submit finished components into a central repository to be integrated and fully tested in the subsequent phases of development.

Denying developers this capability would be counter-productive. I agree with the two arguments I hear most often:

(1) “Don't worry, dude! The software is portable”; and

(2) it takes too long for system administrators to configure the development environments we need; more specifically, it takes too long for them to adjust security settings and install the tools we use to develop software.

Both are valid points. However, the non-standardization of development tools and runtime environments, together with inconsistent security settings, raises concerns.

Too often I have seen developers relax security controls during unit development, only to be bewildered when full integration testing fails. Many database administrators enforce strict table access controls to which developers must adhere.

Why shouldn't the same discipline apply to base operating system resources?

Another scenario that concerns me is when an operations group has little to no influence on the configuration guidelines that a development group must follow.

When development is completed, operations is expected to install and configure the newly developed application.

In my opinion, if the required configuration deviates from established standards or a developer is required to install it, operations should not be expected to accept the product!

Why should operations be expected to maintain the current level of documented security or maintain the product in general if the product is not developed using the same processes?

This dilemma is exactly what one of our Security Blanket® customers described to me. My first response to them was that good leadership, sound configuration management, and supporting processes must be in place.

Second, I suggested establishing consistent configurations throughout the testing environments to mimic production as closely as possible.
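
To make that concrete, here is a minimal sketch of the kind of configuration drift check I have in mind, written in Python. The file names and the simple key = value format are illustrative assumptions, not anything specific to Security Blanket.

    #!/usr/bin/env python
    # Minimal sketch: flag configuration drift between a host's actual
    # settings and a managed production baseline. File names are hypothetical.

    def load_settings(path):
        """Parse simple 'key = value' lines, skipping comments and blanks."""
        settings = {}
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                settings[key.strip()] = value.strip()
        return settings

    def report_drift(baseline_path, candidate_path):
        baseline = load_settings(baseline_path)
        candidate = load_settings(candidate_path)
        for key in sorted(set(baseline) | set(candidate)):
            if baseline.get(key) != candidate.get(key):
                print("DRIFT %s: baseline=%r actual=%r"
                      % (key, baseline.get(key), candidate.get(key)))

    if __name__ == "__main__":
        # Hypothetical paths -- substitute your managed baseline and target.
        report_drift("prod-baseline.conf", "/etc/myapp/app.conf")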

And if possible, tighten the development environments down, too, or at least establish development guidelines like “stop running Tomcat as root”!
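
To give that last guideline some teeth, here is a minimal sketch, assuming a Linux host, of a check that walks /proc and flags Java/Tomcat processes running as root. The process names are assumptions you would adjust for your own stack.

    #!/usr/bin/env python
    # Minimal sketch: flag java/Tomcat processes running as root (UID 0)
    # by reading /proc on a Linux host. Process names are assumptions.
    import os

    SUSPECT_NAMES = set(["java", "tomcat", "catalina"])

    def processes_running_as_root():
        hits = []
        for pid in os.listdir("/proc"):
            if not pid.isdigit():
                continue  # not a process directory
            fields = {}
            try:
                with open("/proc/%s/status" % pid) as fh:
                    for line in fh:
                        if ":" in line:
                            key, value = line.split(":", 1)
                            fields[key] = value.strip()
            except (IOError, OSError):
                continue  # process exited while we were scanning
            name = fields.get("Name", "")
            uids = fields.get("Uid", "").split()  # real, effective, saved, fs
            if name in SUSPECT_NAMES and uids and uids[0] == "0":
                hits.append((pid, name))
        return hits

    if __name__ == "__main__":
        for pid, name in processes_running_as_root():
            print("WARNING: %s (pid %s) is running as root" % (name, pid))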

If you or your organization have encountered this situation in a professional services capacity, what was your solution? What other concerns should organizations have?

Jamie Adams is a Principal Secure Systems Engineer at Trusted Computer Solutions, Inc.

Comments:

Robert Gezelter: Jamie,

Nicely said. When developing software, or counseling clients on developing software, I very consciously use the principle of "minimum necessary privilege".
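
As a minimal illustration of that principle, assuming a Unix-like host and a hypothetical "tomcat" service account: perform the one step that genuinely needs root (here, binding a port below 1024), then shed the privileges before any application logic runs.

    #!/usr/bin/env python
    # Minimal sketch of minimum necessary privilege: bind a privileged
    # port as root, then drop to an unprivileged service account.
    # The "tomcat" account name is a hypothetical example.
    import os
    import pwd
    import socket

    def bind_privileged_port(port):
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("0.0.0.0", port))  # ports below 1024 require root
        sock.listen(5)
        return sock

    def drop_privileges(username):
        entry = pwd.getpwnam(username)
        os.setgroups([])          # shed supplementary groups
        os.setgid(entry.pw_gid)   # group first: after setuid we could
        os.setuid(entry.pw_uid)   # no longer change it -- root is gone

    if __name__ == "__main__":
        listener = bind_privileged_port(80)  # the only step that needs root
        drop_privileges("tomcat")            # everything after runs unprivileged
        print("serving as UID %d" % os.getuid())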

I too have seen far too many environments where privileges and file protections have been circumvented. At some point, when an audit is done, the purely technical issue becomes a corporate governance issue.

I have seen people warned by auditors that the problem (elevated privileges) must be corrected, or the auditor will issue a "Sarbanes-Oxley" letter, something no one wants.

Additionally, I see very (emphasis, VERY) few cases where the privileges are actually necessary.

Jamie Adams: Robert, thank you for your feedback. Forgive me, but I've never worked in an environment focused on SOX. What kind of penalties is an organization looking at if a letter is issued and then not followed?

It seems that different industries are penalized in different ways; the common denominator is the threat of shutting down operations.

I agree with you 100% that eventually "...the purely technical issue becomes a corporate issue". Too many managers will NOT delay the deployment/delivery of the latest and greatest application because of a few "minor" inconsistent configuration elements. :-(

Robert Gezelter: SOX is a complex area with many issues. To keep it short, I will note that SOX requires accountability for financial results. This in turn implies that data processed and produced by information systems can be trusted. Thus, information systems must enforce limits on authority that are part of the internal controls.

At one client, problems with the application had been circumvented by granting every user in the organization system-level privileges. The users were captive, and were supposed to be limited by the application itself, but this is actually an illusion. The auditors found out about the privileges, and warned senior management that the privileges needed to be removed before the audit report was finalized.

I was able to resolve the problem by revoking privileges, and substituting a series of graded access controls administered using access control lists and roles. This segregated duties and enforced the limitations.
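
Purely as an illustration of the idea, since the actual controls were site-specific: a sketch of graded, role-based checks in which each role grants only the operations its duties require. The role and operation names are invented for the example.

    #!/usr/bin/env python
    # Illustrative sketch of graded, role-based access in place of
    # blanket privileges. Role and operation names are invented.

    ROLE_PERMISSIONS = {
        "clerk":    set(["read_ledger"]),
        "reviewer": set(["read_ledger", "approve_entry"]),
        "admin":    set(["read_ledger", "approve_entry", "manage_accounts"]),
    }

    def is_allowed(role, operation):
        """Deny by default: allow only what the role explicitly grants."""
        return operation in ROLE_PERMISSIONS.get(role, set())

    if __name__ == "__main__":
        # Duties stay segregated: a clerk can read but cannot approve.
        assert is_allowed("clerk", "read_ledger")
        assert not is_allowed("clerk", "approve_entry")
        print("access checks enforced")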

The directive to correct this issue came directly from the CFO's office. Up to that point the applications team had been completely focused on features and functions, not security and integrity.

The potential penalty was substantial. SOX applies to public companies, and as I understand it, the penalty was an exception paragraph on the subject of internal controls included in the audit letter that accompanies the annual report.