By Richard Williams
April 19, 2004 12:00 AM EDT
As a decision maker in your IT organization, you're aware that Linux's share of your systems is growing (if your enterprise follows today's business trend). Linux installations are now available on every major hardware platform, new development projects include Linux systems in increasing numbers, and you're challenged with incorporating these Linux systems seamlessly into your operations and business processing.
These Linux systems must also now be included as part of your IT audit. IT audits are increasingly performed by cross-functional teams rather than by operations, networks, applications, or database management teams. The cross-functional audit teams have the scope and purview to examine each area of operations. Since your skilled operations teams aren't responsible for policing their own house, they can remain focused on their core skill sets.
The audit teams make scheduled passes, with strategic focus on physical security, network security, applications security, systems security, and whatever else is part of your enterprise security plan. The report is digested and parsed by the audit team leader or information security manager, who tactfully disseminates the information to the appropriate team leaders.
The first challenge emerging from this vision of corporate information systems unity is that the operations teams will potentially mistrust, hate, fear, or otherwise loathe the audit teams. This humanistic certainty is based on the perception that someone is trying to find something wrong so that blame can be assigned. Overcoming this challenge, while not a typical strategic audit goal, is important since you want the audit teams to have unfettered access, and you want their work to be supported and adopted by the operations teams. The audit teams' reports must become meaningful input for operations teams, who will review a report and mitigate the threats instead of putting out fires later because important audit information was not heeded.
Using your vision, sensibility, and other executive powers, you've attained respectful buy-in from the teams - you can now move forward to meet other challenges.
The Audit
One problem identified during Linux audits is that too many people know the root password and other elevated-privilege account passwords. These passwords are the electronic keys to the kingdom in Linux, and taking back control of these accounts is a top audit priority. Typically, everyone who has the root password knows why they shouldn't pass it out or overuse it.
There's limited accountability in most native Linux operating systems, including the lack of a cogent audit trail. The native auditability is primarily centered on the syslog and sulog facilities, which cannot describe the root user's interactive actions with the system at the level required by the HIPAA, Sarbanes-Oxley, and NISPOM Chapter 8 requirements, to mention only a few. For example, Figure 1 shows a sample sulog, revealing a not very detailed snapshot of users using su on a system.
While they're better than nothing, the sample log entries don't describe what actions were taken after the su command was run. (For the uninitiated, the + or - tells you whether the su request was successful.)
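Since Figure 1 isn't reproduced here, the following sketch shows the kind of entry a sulog typically contains and how an auditor might tally successful escalations. The file path, entry format, and usernames are illustrative assumptions, not output from any particular distribution:

```shell
# Simulated sulog entries (illustrative format: SU date time +/- tty from-to).
# On a real system these would live in a file such as /var/log/sulog.
cat > /tmp/sulog.sample <<'EOF'
SU 04/19 08:01 + pts/0 jdoe-root
SU 04/19 08:07 - pts/1 asmith-root
SU 04/19 09:12 + pts/2 root-oracle
EOF
# Count successful su attempts: field 4 holds the +/- status flag.
awk '$4 == "+"' /tmp/sulog.sample | wc -l
```

Even this tally only tells you that privilege changed hands, not what was done afterward.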
The syslog example may be roughly equivalent (see Figure 2).
The example in Figure 2 also indicates privilege being elevated, but does not describe (or require) a reason. Additionally, the file(s) produced by the syslog daemon may contain information not germane to your audit, but again, some information is certainly better than nothing. You can significantly improve the auditability in your enterprise by adding third-party software that captures all standard input, output, and errors, including everything the user does with the elevated privilege.
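One widely available illustration of this capture approach is sudo's session I/O logging, which records everything typed and displayed during an elevated session (a sketch; availability depends on your sudo version being built with I/O-logging support, and the log directory shown is an assumption):

```
# /etc/sudoers fragment (edit with visudo)
Defaults log_input, log_output
Defaults iolog_dir=/var/log/sudo-io
```

Captured sessions can then be replayed with the sudoreplay command, giving auditors a keystroke-level record of what was done with the elevated privilege.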
The example below is from a policy created on a Linux system (salmon.mydomain.com), using a Symark product called PowerBroker (version 3.2.1). It provides a root shell for any user authorized to run the command pbrun GIMMIEROOT. The policy creates an audit file, akin to those available in other third-party products, to give you more auditability when users gain or use elevated privilege. This particular product will log all standard input, output, and errors, as well as a complete report regarding the secured task:
$ pbrun GIMMIEROOT
Enter your reason for accessing this policy:
I need to edit the /etc/passwd file
Figure 3 shows what the resultant logfile includes. Note that the "who, what, when, where, and why" are evident in the log output.
I truncated the log file, but you can see that your audit team has the ability to see it, and to tell the who, what, when, where, and why for any elevated-privilege or vital-asset access. In addition to third-party products, Linux vendors are working hard to provide this functionality. This functionality significantly improves your teams' ability to take back the root and other elevated-privilege accounts by granting elevated privilege only when the user accesses certain commands or assets (within their normal job descriptions, for example). When access is complete, normal privilege resumes, and the user never knows the elevated password.
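As an example of the per-command grant described above, sudo can give a user one elevated command without ever revealing the root password; the username and command path in this minimal sudoers rule are hypothetical:

```
# sudoers fragment: jdoe may run exactly this one command as root
jdoe ALL = (root) /usr/sbin/useradd
```

When the command completes, jdoe's normal privilege resumes, and the root password itself is never disclosed.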
So you're familiar with elevated-access audit control; is your audit team as well? Basic audit tenets include reading the documentation to determine what to audit, but what documentation do you have that describes who can access what, when, where, and why?
Your systems, applications, and networks team can collaborate to create a document like Table 1.
Your teams may have used any visualization method, but the output is a matrix of your systems (vertical axis), and your user community (horizontal axis). Notice that each login/access method is described, as well as which system each user can access, from which system, by which method. Once users are on the systems, executable commands are listed, as well as any elevated privilege required. With this documentation, your audit team now knows which systems to go to, which accounts to scrutinize, which commands should normally be allowed as the user, and which commands require elevated privilege. This documentation is simple but effective in meeting the requirement to report upward and manage outward.
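A sketch of the kind of spot-check an audit team might script against that matrix, here flagging accounts that hold UID 0 outside the documented list; the passwd excerpt below is fabricated for illustration (a real check would read /etc/passwd):

```shell
# Fabricated passwd excerpt; a real check would read /etc/passwd.
cat > /tmp/passwd.sample <<'EOF'
root:x:0:0:root:/root:/bin/bash
jdoe:x:1001:1001::/home/jdoe:/bin/bash
backup:x:0:34:leftover install account:/var/backups:/bin/sh
EOF
# Any UID-0 account other than root is an audit finding.
awk -F: '$3 == 0 && $1 != "root" {print $1}' /tmp/passwd.sample
```

Findings like this go straight back to the systems team via the audit report, alongside the matrix entry they violate.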
Another important problem that surfaces in a Linux audit is the publication of passwords, which often happens inadvertently via application scripts that should be secured (Web server startup or shutdown, middleware startup or shutdown, database startup or shutdown, etc.).
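An audit pass for this exposure can be sketched as a scan for world-readable scripts that appear to embed credentials. The directory, script, credential string, and grep pattern below are all fabricated assumptions for the demonstration:

```shell
# Fabricated startup script with an embedded credential.
mkdir -p /tmp/appscripts
cat > /tmp/appscripts/dbstart.sh <<'EOF'
#!/bin/sh
sqlplus system/Secret123 @startup.sql
EOF
chmod 644 /tmp/appscripts/dbstart.sh
# Flag world-readable files that appear to embed a user/password pair.
find /tmp/appscripts -type f -perm -004 -exec grep -lE 'sqlplus +[^ ]+/[^ @]+' {} \;
```

Any file this scan flags is both a finding in itself and a trigger to rotate the exposed password.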
Information synchronization routines (such as NIS or LDAP v2) also place assets at risk, as they pass account, system, and other enterprise information around the LAN or WAN in clear text. (In the case of passwords specifically, the encrypted value is sent, but agile information bandits know the difference between a crypt, bigcrypt, or MD5 hash. When the rest of the information is in clear text, encrypting only the password may provide little safety.)
Once passwords are obtained by a nontrusted source (someone leaves a file containing a password world-readable, for example), valuable assets are at risk on numerous fronts, including easy access to critical files/data. When an asset can be accessed by a user in masquerade, the asset is at risk. The insertion of a Trojan program, the destruction of an application, and the alteration of data are all undesirable options. Whether compromised by the pad of paper in the machine room, the e-mail to the group alias with a defunct (but still receptive) recipient, the generic account password used by consultants nationwide when installing the new software on your enterprise server, or some other method, the untrusted source now has the ability to log in to one or more systems as someone other than themselves. No audit could save you at this point, as activity performed under the guise of a trusted user is now suspect.
Fortunately, your systems audit includes the regular checking of ownership, permissions, checksums, and other embedded safety mechanisms to keep data and applications in a known good state. Program files, executables, even operating system and patch levels are being recorded and compared from audit to audit, and maintained at the most current secure levels. The LDAP directory is scrutinized for the dysfunction that occurs between Human Resources and Information Systems, causing transferred or even terminated employees to be removed from systems but allowed to remain in the LDAP directory. This step eliminates the ability of a transferred or terminated employee to gain access to assets via an LDAP-credentialed application. You have delegated and empowered effectively, your audit team is passing back the appropriate report to the systems managers, and the integrity of the systems and programs is secure.
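The checksum comparison described above can be sketched as follows; the paths are illustrative, and the choice of sha256sum stands in for whatever hashing tool your audit standard mandates:

```shell
# Record known-good hashes at one audit, verify at the next (illustrative paths).
mkdir -p /tmp/auditdemo
cd /tmp/auditdemo
echo 'original binary' > app
sha256sum app > baseline.sha256        # baseline taken at this audit
echo 'tampered binary' > app           # simulated unauthorized change
sha256sum -c baseline.sha256 || echo 'integrity check FAILED: investigate'
```

A mismatch here is exactly the deviation from the "known good state" that the audit is designed to surface.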
Conclusion
As a quick summary, your internal teams periodically perform these audits:
- Physical security
- Operating system
- Network security
- Others as you require
Your charter to your auditors is multifold, as they assess each aspect of today's increasingly complex information systems nervous system. The audits should be periodic, focused on a specific aspect of the larger picture, and as unobtrusive as possible. They should yield a systematic and repeatable report, which is then passed back into the system for assessment and mitigation. Your audit teams use a documentation tool to determine who, what, and how to audit your assets, and the result is that the external audit becomes a quality checkpoint rather than an item causing worry, fear, or loathing.
John Legg 05/13/04 08:26:22 PM EDT
An impressive solution to Linux (as well as Unix) audits is at www.mase.com. Many standard policies, as well as customized ones, can be monitored quickly and painlessly.
Mark Post 04/22/04 04:42:51 PM EDT
The author apparently isn't familiar with SSL/TLS support in OpenLDAP. Nothing has to pass in clear text when using that feature.