|By Richard Williams||
|April 19, 2004 12:00 AM EDT||
As a decision maker in your IT organization, you're aware that your share of Linux systems is growing (if your enterprise follows today's business trend). Linux installations are now available on every major hardware platform, new development projects include Linux systems in increasing numbers, and you're challenged with incorporating these systems seamlessly into your operations and business processing.
These Linux systems must also now be included as part of your IT audit. IT audits are increasingly performed by cross-functional teams rather than by operations, networks, applications, or database management teams. The cross-functional audit teams have the scope and purview to examine each area of operations. Since your skilled operations teams aren't responsible for policing their own house, they can remain focused on their core skill sets.
The audit teams make scheduled passes, with strategic focus on physical security, network security, applications security, systems security, and whatever else is part of your enterprise security plan. The report is digested and parsed by the audit team leader or information security manager, who tactfully disseminates the information to the appropriate team leaders.
The first challenge emerging from this vision of corporate information systems unity is that the operations teams will potentially mistrust, hate, fear, or otherwise loathe the audit teams. This humanistic certainty is based on the perception that someone is trying to find something wrong so that blame can be assigned. Overcoming this challenge, while not a typical strategic audit goal, is important since you want the audit teams to have unfettered access, and you want their work to be supported and adopted by the operations teams. The audit teams' reports must become meaningful input for operations teams, who will review a report and mitigate the threats instead of putting out fires later because important audit information was not heeded.
Using your vision, sensibility, and other executive powers, you've attained respectful buy-in from the teams - you can now move forward to meet other challenges.
The Audit
One problem identified during Linux audits is that too many people know the root password and other elevated-privilege account passwords. These passwords are the electronic keys to the kingdom in Linux, and taking back control of these accounts is a top audit priority. Typically, everyone who has the root password knows why they shouldn't pass it out or overuse it.
There's limited accountability in a stock Linux installation, including the lack of a cogent audit trail. Native auditability centers on the syslog and sulog facilities, which cannot describe the root user's interactive actions at the level required by HIPAA, Sarbanes-Oxley, and NISPOM Chapter 8, to name only a few. For example, Figure 1 shows a sample sulog, a not-very-detailed snapshot of users running su on a system.
While they're better than nothing, the sample log entries don't describe what actions were taken after the su command occurred. (For the uninitiated, the + or - tells you whether the su request succeeded.)
The syslog example may be roughly equivalent (see Figure 2).
The example in Figure 2 also indicates privilege being elevated, but does not describe (or require) a reason. Additionally, the file(s) produced by the syslog daemon may contain information not germane to your audit, but again, some information is certainly better than nothing. You can significantly improve the auditability in your enterprise by adding third-party software that captures all standard input, output, and errors, including everything the user does with the elevated privilege.
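As a rough illustration (not the exact Figure 2 output), su events can be filtered out of a combined syslog-style file with standard tools. The host and user names below are invented, and the sample is written to a scratch file rather than read from a live log:

```shell
# Create a small sample of syslog-style entries (hypothetical host/users).
cat > /tmp/sample_authlog <<'EOF'
Apr 19 09:12:44 salmon su: (to root) jdoe on pts/1
Apr 19 09:13:02 salmon sshd[2214]: Accepted password for jdoe from 10.0.0.5
Apr 19 11:47:19 salmon su: (to oracle) mlee on pts/3
EOF

# Pull only the privilege-elevation (su) events for the audit report.
grep ' su: ' /tmp/sample_authlog
```

This kind of filter tells you *that* privilege was elevated, but nothing about what was done afterward, which is exactly the gap the article describes.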
The example below is from a policy created on a Linux system (salmon.mydomain.com), using a Symark product called PowerBroker (version 3.2.1). It provides a root shell for any user authorized to run the command pbrun GIMMIEROOT. The policy creates an audit file akin to those available in other third-party products, giving you more auditability when users gain or use elevated privilege. This particular product will log all standard input, output, and errors, as well as a complete report regarding the secured task:
$ pbrun GIMMIEROOT
Enter your reason for accessing this policy:
I need to edit the /etc/passwd file
Figure 3 shows what the resultant logfile includes. Note that the "who, what, when, where, and why" are evident in the log output.
I truncated the log file, but you can see that your audit team can tell the who, what, when, where, and why for any elevated-privilege or vital-asset access. In addition to third-party products, Linux vendors are working hard to provide this functionality. It significantly improves your teams' ability to take back the root and other elevated-privilege accounts by granting elevated privilege only when the user accesses certain commands or assets (within their normal job descriptions, for example). When access is complete, normal privilege resumes, and the user never learns the elevated password.
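The same take-back-root principle appears in the stock sudo tool: a user can be granted specific commands that run with root privilege while never learning the root password, and every use is logged. A minimal sketch (the user name and command list are hypothetical):

```
# /etc/sudoers fragment (edit with visudo)
# jdoe may run only these two commands as root; sudo logs each invocation.
jdoe    ALL=(root)    /usr/sbin/useradd, /usr/sbin/userdel
```

Unlike a session-capture product, stock sudo logs the command invoked but not the full standard input/output of the session, which is why the article points to third-party tools for deeper audit trails.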
So you're familiar with elevated-access audit control; is your audit team as well? Basic audit tenets include reading the documentation to determine what to audit, but what documentation do you have that describes who can access what, when, where, and why?
Your systems, applications, and networks team can collaborate to create a document like Table 1.
Your teams may have used any visualization method, but the output is a matrix of your systems (vertical axis), and your user community (horizontal axis). Notice that each login/access method is described, as well as which system each user can access, from which system, by which method. Once users are on the systems, executable commands are listed, as well as any elevated privilege required. With this documentation, your audit team now knows which systems to go to, which accounts to scrutinize, which commands should normally be allowed as the user, and which commands require elevated privilege. This documentation is simple but effective in meeting the requirement to report upward and manage outward.
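A starting point for the accounts axis of such a matrix can be pulled directly from each host's account database, for example by listing every UID-0 (root-equivalent) account the audit team should scrutinize. The sketch below runs against a sample passwd-style file rather than the live /etc/passwd, and the account names are invented:

```shell
# Sample /etc/passwd-style data (hypothetical accounts).
cat > /tmp/sample_passwd <<'EOF'
root:x:0:0:root:/root:/bin/bash
toor:x:0:0:second uid-0 account:/root:/bin/bash
jdoe:x:1001:1001:Jane Doe:/home/jdoe:/bin/bash
EOF

# Any account with UID 0 is root-equivalent and belongs on the audit matrix.
awk -F: '$3 == 0 { print $1 }' /tmp/sample_passwd
```

Finding more than one UID-0 account is itself a finding worth feeding back to the systems team.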
Another important problem that surfaces in a Linux audit is the publication of passwords, which often happens inadvertently via secure applications scripts (Web startup or shutdown, middleware startup or shutdown, database startup or shutdown, etc.).
Information synchronization services (such as NIS or LDAP v2) also place assets at risk, as they pass account, system, and other enterprise information around the LAN or WAN in clear text. (In the case of passwords specifically, the encrypted value is sent, but agile information bandits know the difference between a crypt, bigcrypt, or MD5 hash. When the rest of the information travels in clear text, encrypting only the password may provide little safety.)
Once passwords are obtained by a nontrusted source (someone leaves a file containing a password world-readable, for example), valuable assets are at risk on numerous fronts, including easy access to critical files and data. When an asset can be accessed by a user in masquerade, the asset is at risk: the insertion of a Trojan program, the destruction of an application, and the alteration of data are all undesirable outcomes. Whether the password is compromised by the pad of paper in the machine room, the e-mail to a group alias with a defunct (but still receptive) recipient, the generic account password used by consultants nationwide when installing new software on your enterprise server, or some other method, the untrusted source can now log in to one or more systems as someone other than themselves. No audit could save you at this point, as activity performed under the guise of a trusted user is now suspect.
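One mechanical check the audit pass can automate is locating world-readable files that mention passwords, the "file left readable" scenario above. The sketch below works in a throwaway directory with invented file names so it can be run safely anywhere:

```shell
# Build a throwaway tree with one offending file.
mkdir -p /tmp/audit_demo
echo 'export DB_PASSWORD=hunter2' > /tmp/audit_demo/start_db.sh
chmod 644 /tmp/audit_demo/start_db.sh   # world-readable: a finding
echo 'echo starting web tier' > /tmp/audit_demo/start_web.sh
chmod 600 /tmp/audit_demo/start_web.sh  # owner-only: excluded by the scan

# Flag world-readable files that contain the string "password".
find /tmp/audit_demo -type f -perm -o+r -exec grep -li 'password' {} +
```

In practice the scan would start from application and script directories rather than /tmp, and the matched files would go straight into the audit report for remediation.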
Fortunately, your systems audit includes the regular checking of ownership, permissions, checksums, and other embedded safety mechanisms to keep data and applications in a known good state. Program files, executables, even operating system and patch levels are recorded and compared from audit to audit, and maintained at the most current secure levels. The LDAP directory is scrutinized for the dysfunction that occurs between Human Resources and Information Systems, which causes transferred or even terminated employees to be removed from systems but allowed to remain in the LDAP directory. This step eliminates the ability of a transferred or terminated employee to gain access to assets via an LDAP-credentialed application. You have delegated and empowered effectively, your audit team is passing the appropriate report back to the systems managers, and the integrity of the systems and programs is secure.
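The checksum comparison from audit to audit can be done with stock tools: record a baseline of hashes on one pass, then verify it on the next. A minimal sketch over a scratch file (the paths and manifest name are assumptions):

```shell
# Record a baseline checksum for a "program file" during the first audit.
mkdir -p /tmp/audit_demo2
echo 'original binary contents' > /tmp/audit_demo2/app
sha256sum /tmp/audit_demo2/app > /tmp/audit_demo2/manifest.sha256

# On the next audit pass, verify the file still matches the baseline;
# any tampering since the baseline makes this report FAILED instead of OK.
sha256sum -c /tmp/audit_demo2/manifest.sha256
```

The manifest itself must be stored where the audited systems' administrators cannot rewrite it, or a masquerading user could simply re-baseline after tampering.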
Conclusion
As a quick summary, your internal teams periodically perform these audits:
- Physical security
- Operating system
- Network security
- Others as you require
Your charter to your auditors is multifold, as they assess each aspect of today's increasingly complex information systems nervous system. The audits should be periodic, focused on a specific aspect of the larger picture, and as unintrusive as possible. They should yield a systematic and repeatable report, which is then passed back into the system for assessment and mitigation. Your audit teams use a documentation tool to determine who, what, and how to audit your assets, and the result is that the external audit becomes a quality checkpoint rather than an item causing worry, fear, or loathing.
|John Legg 05/13/04 08:26:22 PM EDT|
An impressive solution to Linux (as well as Unix) audits is at www.mase.com Many standard policies as well as customized ones can be monitored very quick and painlessly.
|Mark Post 04/22/04 04:42:51 PM EDT|
The author apparently isn't familiar with SSL/TLS support in OpenLDAP. Nothing has to pass in clear text when using that feature.