Create Linux User Login Monitor on Monitis

Monitis provides the ability to monitor almost any operation on your server

Monitis provides the ability to monitor almost any operation on your server. Using simple Linux tools and scripts, you can record each time a user logs into the server and capture information such as the username, host address, and login service. With pam_script and a few bash scripts, you can transmit this information to a Monitis Custom Monitor.

API Access

The first thing you will need in order to create this monitor is your Monitis API Key and Secret Key.  The API Key is an alphanumeric code that allows you to access the Monitis API URLs and transmit or receive data about your Monitis services.  The Secret Key is an alphanumeric code used to digitally sign your requests so that only you can transmit data to your Monitis account.  Your API Key may be disclosed to anyone, but your Secret Key must be kept private and should never be shared or transmitted.  To obtain both keys, log into your account and, from the top menu bar, go to Tools, then API, then API Key; the page displays both your API Key and your Secret Key.

Now let’s test your API access.  You should be able to connect and get an Auth Token:

curl '[Auth Token API URL]?apikey=[API Key]&secretkey=[Secret Key]&version=2'

In the above command, [Auth Token API URL] stands for the Monitis API endpoint, and [API Key] and [Secret Key] should be replaced with your own keys.  We are using curl to connect to the API and request an Auth Token.  The return value is JSON and looks similar to:

{ "authToken" : "[Auth Token]" }
The alphanumeric code is your Auth Token, which you can use to validate against the API later.  However, sending your Secret Key in the URL this way is not very secure, since others could potentially obtain it.  The more secure method of authenticating is to send your data using POST instead of GET and to sign the POST data with a Base64-encoded, RFC 2104-compliant HMAC signature.  The signature is sent in the checksum parameter of the POST data.  To calculate the checksum you must follow these rules:
  1. sort all parameters alphabetically by name (excluding the checksum parameter)
  2. concatenate all parameter names and values like this: name1value1name2value2…
  3. create a Base64-encoded RFC 2104-compliant HMAC signature using your Secret Key

The signature in the final rule can be computed with openssl:

echo -en "name1value1name2value2" | openssl dgst -sha1 -hmac [Secret Key] -binary | openssl enc -base64
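
Captured in a variable, that same signature can then be sent as the checksum parameter.  Here is a minimal sketch of the three rules together, using placeholder parameter names and the bracketed Secret Key placeholder from above:

# hypothetical parameters, already sorted alphabetically by name and concatenated
CHECKSUM_STR="name1value1name2value2"
SECRETKEY="[Secret Key]"

# Base64-encoded RFC 2104-compliant HMAC-SHA1 signature, keyed with the Secret Key
CHECKSUM=$(echo -en "$CHECKSUM_STR" | openssl dgst -sha1 -hmac "$SECRETKEY" -binary | openssl enc -base64)
echo "checksum=$CHECKSUM"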

Creating a Custom Monitor

In order to create a custom monitor, you must send a POST request to the API.  The request must contain several parameters: action, name, resultParams, and tag (refer to the Monitis API documentation for the full specification).  We will use the following values for the parameters (the resultParams format is sketched just after the list):

  • action=addMonitor
  • name=Login Monitor
  • resultParams=user_login:Login Name:logins:3;host:Host Address:hostaddress:3;srv:Service:service:3
  • tag=loginMonitor
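
Each entry in resultParams appears to follow the pattern name:displayName:uom:dataType, where the trailing 3 is assumed here to mean a string value:

# Assumed field layout for each result parameter: name:displayName:uom:dataType
# e.g. user_login (internal name), "Login Name" (display name), logins (unit), 3 (string)
RESULTPARAMS="user_login:Login Name:logins:3;host:Host Address:hostaddress:3;srv:Service:service:3"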

Several other parameters are also required in order to communicate with the API:

  • apikey=[API Key]
  • timestamp=[Current UTC time]
  • version=2

In order to create our new monitor, called Login Monitor, we POST this data plus a checksum to the Custom Monitor API URL.  Here is a simple script that accomplishes this:

#!/bin/bash
# create a Custom Monitor for Monitis
# Be sure to modify the API Key and Secret Key
APIKEY="[API Key]"
SECRETKEY="[Secret Key]"
URL="[Custom Monitor API URL]"
ACTION="addMonitor"
NAME="Login Monitor"
RESULTPARAMS="user_login:Login Name:logins:3;host:Host Address:hostaddress:3;srv:Service:service:3"
TAG="loginMonitor"
VERSION="2"
TIMESTAMP=`date -u +"%F %T"`

# Create Checksum (parameter names must stay in alphabetical order)
CHECKSUM_STR="action"$ACTION"apikey"$APIKEY"name"$NAME"resultParams"$RESULTPARAMS"tag"$TAG"timestamp"$TIMESTAMP"version"$VERSION
CHECKSUM=$(echo -en "$CHECKSUM_STR" | openssl dgst -sha1 -hmac "$SECRETKEY" -binary | openssl enc -base64)

# Post Data to API
POSTDATA="--data-urlencode \"action="$ACTION"\" --data-urlencode \"apikey="$APIKEY"\" --data-urlencode \"name="$NAME"\" --data-urlencode \"resultParams="$RESULTPARAMS"\" --data-urlencode \"tag="$TAG"\" --data-urlencode \"timestamp=$TIMESTAMP\" --data-urlencode \"version="$VERSION"\" --data-urlencode \"checksum="$CHECKSUM"\""

eval "curl ${POSTDATA} $URL"

Save the above script to a file, being careful not to change the order of the variables in the checksum calculation; they must remain in alphabetical order.  Make the file executable:

chmod 755 [script file]

Now run it:

./[script file]

The output should look similar to this:


This shows that the monitor was successfully created and that the id of the resulting monitor is 305.  If you go to your Monitis account now, you will be able to access this monitor.  From the top-level menu, go to Monitors, then Manage Monitors, then Custom Monitors.  Here you should find the Login Monitor.  Click the check box next to its title and then click Add to Window.  A window will pop up below the Custom Monitors dialog box.  Close the Custom Monitors dialog box and you will see your new monitor there.  No data has been sent to it yet, so it is not very interesting.

Sending Data to Custom Monitor

In order to send data to your Custom Monitor, you must provide the action, monitorId, checktime, and results parameters (refer to the Monitis API documentation for the full specification).  The action is addResult; the monitorId is the id that was returned in the previous example (if you forgot the id, don't worry, we will retrieve it again); the checktime is the timestamp of the results data; and results is a string of parameter names and values in this format: name1:value1;name2:value2
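
For example, the checktime and a results string could be put together like this (the login values below are made up purely for illustration):

# checktime: current UTC time in milliseconds since the epoch
# (date prints seconds, so three zeros are appended, as the script below also does)
CHECKTIME=`date -u +"%s"000`

# results: name:value pairs separated by semicolons; these values are examples only
RESULTS="user_login:jdoe;host:203.0.113.7;srv:sshd"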

The following script will send data to your Custom Monitor:

#!/bin/bash
# add result to Custom Monitor for Monitis
# Fill in the API Key and Secret Key below, or pass them with -a and -s

usage()
{
cat << EOF
usage: $0 options

This script will add results to a Custom Monitis Monitor.

-h Show this message
-a api key
-s secret key
-m monitor tag
-i monitor id
-t timestamp (defaults to utc now)
-r results name:value[;name2:value2...]
EOF
}

APIKEY=""
SECRETKEY=""
URL="[Custom Monitor API URL]"
ACTION="addResult"
VERSION="2"
OUTPUT="xml"
MONITOR=""
ID=""
RESULTS=""
CHECKTIME=`date -u +"%s"000`
TIMESTAMP=`date -u +"%F %T"`

while getopts "ha:s:m:i:t:r:" OPTION
do
  case $OPTION in
    h) usage; exit 1 ;;
    a) APIKEY=$OPTARG ;;
    s) SECRETKEY=$OPTARG ;;
    m) MONITOR=$OPTARG ;;
    i) ID=$OPTARG ;;
    t) CHECKTIME=$OPTARG ;;
    r) RESULTS=$OPTARG ;;
    ?) usage; exit 1 ;;
  esac
done

if [[ -z $APIKEY ]] || [[ -z $SECRETKEY ]] || [[ -z $MONITOR$ID ]] || [[ -z $RESULTS ]] || [[ -z $CHECKTIME ]]
then
  usage
  exit 1
fi

# Get id of monitor if not provided (requires the monitor tag)
if [[ -z $ID ]]
then
  XMLID=$(curl -s "$URL?apikey=$APIKEY&output=$OUTPUT&version=$VERSION&action=getMonitors&tag=$MONITOR" | xpath -q -e /monitors/monitor/id)
  # strip the surrounding <id></id> tags to get the numeric id
  ID=$(echo $XMLID | sed -e 's/<[^>]*>//g')
fi

# Add monitor result
# Create Checksum (parameter names must stay in alphabetical order)
CHECKSUM_STR="action"$ACTION"apikey"$APIKEY"checktime"$CHECKTIME"monitorId"$ID"results"$RESULTS"timestamp"$TIMESTAMP"version"$VERSION
CHECKSUM=$(echo -en "$CHECKSUM_STR" | openssl dgst -sha1 -hmac "$SECRETKEY" -binary | openssl enc -base64)

# Post Data to API
POSTDATA="--data-urlencode \"action="$ACTION"\" --data-urlencode \"apikey="$APIKEY"\" --data-urlencode \"checktime="$CHECKTIME"\" --data-urlencode \"monitorId="$ID"\" --data-urlencode \"results="$RESULTS"\" --data-urlencode \"timestamp=$TIMESTAMP\" --data-urlencode \"version="$VERSION"\" --data-urlencode \"checksum="$CHECKSUM"\""

eval "curl ${POSTDATA} $URL"

Save this file and make it executable.  You can run it with no parameters to get a help menu, which should be self-explanatory.  You can either provide the API Key and Secret Key on the command line or edit the script to contain them.  The script will look up the monitorId for you if you forget yours, but you will have to know the tag name you gave your Custom Monitor when you created it.  Therefore, either your tag or your monitorId is required to run this script.
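
As a usage sketch, assuming you saved the script as monitis_addresult.sh (a file name chosen here just for illustration) and using the same example login values as above, a manual test run might look like:

# hypothetical invocation; substitute your own keys and values
./monitis_addresult.sh -a "[API Key]" -s "[Secret Key]" -m loginMonitor -r "user_login:jdoe;host:203.0.113.7;srv:sshd"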

Capturing Information on Login

Now that we have a script to send data to the Custom Monitor, we need data to send.  This script could easily be run from .bashrc or /etc/bashrc, and that would work fine if we knew that no user would ever delete their .bashrc.  Since we cannot guarantee that, we will use PAM (Pluggable Authentication Modules) to control how and when we send information to the Custom Monitor.  Because no user without root access can alter PAM, this is a reliable way to capture login information.  And since sshd, sftp, ftp, and most other services use PAM for authentication, this will monitor all logins to the server, not just shell logins.

PAM offers many options and modules; we will be using a module called pam_script.  pam_script allows you to execute a script on session open, on session close, and/or on auth.  You must download and install pam_script first:

wget '[pam_script download URL]' -O libpam-script.tar.gz
tar -xzvf libpam-script.tar.gz
cd libpam-script-x.x.x # x.x.x is the version you just downloaded, apparent from the tar output
sudo cp pam_script.so /lib/security/
sudo chown root:root /lib/security/pam_script.so
sudo chmod 755 /lib/security/pam_script.so

pam_script is now installed but not yet configured.  There are three files associated with pam_script: /etc/security/onsessionopen, /etc/security/onsessionclose, and /etc/security/onauth.  The first two run when a session is opened or closed, and the last runs on a successful authentication.  Since we want to monitor successful authentications, we will create the onauth file:

# onauth for Monitis Custom Login Monitor
# pam_script exposes the login details as environment variables
USER=$PAM_USER
HOST=$PAM_RHOST
SERVICE=$PAM_SERVICE

/etc/security/[add-result script] -m loginMonitor -r "user_login:$USER;host:$HOST;srv:$SERVICE"

You will need to move the add-result script into /etc/security and make both it and the onauth script executable by root and owned by root:

sudo mv [add-result script] /etc/security
sudo chmod 700 /etc/security/[add-result script]
sudo chown root:root /etc/security/[add-result script]
sudo chmod 700 /etc/security/onauth
sudo chown root:root /etc/security/onauth

Now we need to tell PAM to use the pam_script module.  The exact file varies by system, but you need to edit /etc/pam.d/common-auth or its equivalent on your system and add the following line:

# require the scripts to run at auth
auth required pam_script.so runas=root expose=rhost

Here we are telling the pam_script module to run as root and to expose the rhost variable, which contains the remote host information we use in the onauth script via the $PAM_RHOST variable.
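
On systems that use /etc/pam.d/common-auth, one quick way to append the line is shown below; treat it as a sketch and back up the file first, since a broken PAM configuration can lock you out of the machine:

# append the pam_script auth line to common-auth
echo "auth required pam_script.so runas=root expose=rhost" | sudo tee -a /etc/pam.d/common-auth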

Testing the Monitor

Now we have a setup that will log all usernames, remote hosts, and the services they logged in from to our Custom Monitor.  Give it a try: ssh to your machine several times and you will see the values appear in your account's Custom Monitor.
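
If you would rather not wait for a real login, you can also exercise the onauth hook by hand, supplying the PAM variables that pam_script normally sets (the values here are examples only):

# simulate the environment pam_script provides on a successful authentication
sudo env PAM_USER=jdoe PAM_RHOST=203.0.113.7 PAM_SERVICE=sshd /etc/security/onauth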



More Stories By Hovhannes Avoyan

Hovhannes Avoyan is the CEO of Monitis, Inc., a provider of on-demand systems management and monitoring software to 50,000 users spanning small businesses and Fortune 500 companies.

Prior to Monitis, he served as General Manager and Director of Development at prominent web portal Lycos Europe, where he grew the Lycos Armenia group from 30 people to over 200, making it the company's largest development center. Prior to Lycos, Avoyan was VP of Technology at Brience, Inc. (based in San Francisco and acquired by Syniverse), which delivered mobile internet content solutions to companies like Cisco, Ingram Micro, Washington Mutual, Wyndham Hotels, T-Mobile, and CNN. Prior to that, he served as the founder and CEO of CEDIT ltd., which was acquired by Brience. A 24-year veteran of the software industry, he also runs Sourcio cjsc, an IT consulting company and startup incubator specializing in web 2.0 products and open-source technologies.

Hovhannes is a senior lecturer at the American University of Armenia and has been a visiting lecturer at San Francisco State University. He is a graduate of Bertelsmann University.
