
Metasploit Nessus Bridge on Ubuntu

Nessus is a vulnerability scanner program

Nessus is a vulnerability scanner; it is free for personal use with the Nessus HomeFeed, while the feed for business use requires a fee. I will be discussing the HomeFeed and using it with the popular Metasploit Framework. Acquire the latest HomeFeed release, Nessus-4.4.1-ubuntu1010_i386.deb, and register for an activation code. Follow the instructions listed in the documentation for installing on Ubuntu and start to configure. The Nessus daemon can't be started until Nessus has been registered and the plugin download has occurred.
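
On Ubuntu the .deb package itself is installed with dpkg before registration; a minimal sketch, assuming the package was downloaded to the current directory:

$ sudo dpkg -i Nessus-4.4.1-ubuntu1010_i386.deb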


$ sudo /opt/nessus/bin/nessus-fetch --register 'registration code from nessus'
Add user
$ sudo /opt/nessus/sbin/nessus-adduser
Make cert
$ sudo /opt/nessus/sbin/nessus-mkcert
Start the nessus Daemon
$ sudo /etc/init.d/nessusd start

Open up a web browser to https://localhost:8834, log in, and create a policy for your scans. I would create a number of policies based on the different systems that you will be scanning; if you're scanning a Windows environment, then having the Linux and BSD plugins enabled is pointless. Also make sure that you have safe checks enabled, select a port scanner to use, select credentials, select plugins (remember not to enable ones that will bounce the box), and select preferences. When finished you should have a number of policies, numbered 1 through however many you created, and you can give them names; for example, a policy for scanning a Windows environment can be labeled windows. Now you can log out of Nessus and close the web browser.

Now open up a terminal and browse to where metasploit is installed and run an update.

$ cd /opt/framework-3.6.0/msf3
$ sudo svn update

Before we start the msfconsole, let's get our database in proper order. I have used sqlite3 in the past and even did a tutorial on my website using sqlite3 (http://pbnetworks.net/?cmd=bbs&id=35), which worked fine, but sometimes it may not work and will give the warning 'Note that sqlite is not supported due to numerous issues. It may work, but don't count on it.' Postgres is the recommended database for Metasploit, so let's install the postgres database and libraries.

$ sudo apt-get install postgresql-8.4
$ sudo apt-get install rubygems libpq-dev
$ sudo gem install pg
$ sudo apt-get install libreadline-dev
$ sudo apt-get install libssl-dev
$ sudo apt-get install libpq5
$ sudo apt-get install ruby-dev

You will need to become the system postgres user:

$ sudo -s
# su postgres

Now you will need to create a database user:

$ createuser <user account name> -P
Enter password for new role:
Enter it again:
Shall the new role be a superuser? (y/n) n
Shall the new role be allowed to create databases? (y/n) n
Shall the new role be allowed to create more new roles? (y/n) n

Next we need to create a database:

$ createdb --owner=<user account name> msf_database

Now we can start up metasploit:

:/opt/framework-3.6.0/msf3$ sudo ./msfconsole

Enter in the following commands:

msf> db_driver postgresql
msf> db_connect <user account name>:<password>@127.0.0.1:5432/msf_database
msf> db_hosts

Now before, when using sqlite3, creating and connecting to the database was easy. I would start up metasploit and issue the following commands:

msf> db_driver sqlite3
msf> db_connect

To verify if the database was connected I would issue the following command:

msf> db_hosts

If everything looked good I would have no errors and I could use the db_nmap command, but sometimes I would encounter errors and it would crash. Using postgres is more reliable than sqlite3, though sqlite3 is still useful, as I will describe later. Finally, go ahead and enable the database on startup by issuing the following commands:

$ cat > ~/.msf3/msfconsole.rc
db_driver postgresql
db_connect <user account name>:<password>@127.0.0.1:5432/msf_database
db_workspace -a MyProject
^D

Now the next time you fire up Metasploit your database will automatically be up and you will be connected to it. Just make sure that you have postgres running; I start postgres manually before I start Metasploit (see Figure 1).

Figure 1 Notice that postgresql loads when first starting the msfconsole

$ sudo /etc/init.d/postgresql-8.4 start
$ su postgres

Now just change directory over to /opt/framework-3.6.0/msf3 and start the msfconsole. Now that we have postgres as the database for Metasploit, let's start using Nessus from within Metasploit. Open up a second terminal and make sure Nessus is running; if it is not, start the daemon.
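
A quick way to verify is to look for the nessusd process and, if necessary, start it with the init script from the installation step above (a sketch; note the grep will also match itself):

$ ps aux | grep nessusd
$ sudo /etc/init.d/nessusd start

Now from the msfconsole load nessus (see Figure 2):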

msf > load nessus

Now let's see what kind of commands the Nessus Bridge for Metasploit 1.1 has given us; type nessus_help (see Figure 3):


Figure 2 loading nessus from the msfconsole


Figure 3 nessus_help

msf > nessus_help

The commands are divided up into different sections labeled Generic, Reports, Scan, Plugin, User, and Policy commands. Before we can run a scan we need to connect to the nessus server by using the nessus_connect command:

msf > nessus_connect <nessus username>:<password>@localhost:8834 ok

This should connect and authenticate you. From here you can run scans, review the results, load the scan results into the database, and use the autopwn feature. Or you can view the results, find a vulnerability in a system you scanned, throw a single exploit, and get a Meterpreter shell. Depending on the environment, you may want to review the results of your Nessus output and find the appropriate exploit to use instead of generating the noise of running autopwn. Now let's start our scan by issuing the nessus_scan_new command, which takes the form nessus_scan_new <policy id> (set in your Nessus policy settings) <scan name> (anything descriptive) <target> (IP address):

msf > nessus_scan_new 1 winXP_home 192.168.1.124

To check on the status of our scan, use the nessus_scan_status command (see Figure 4):


Figure 4 nessus_scan_status

msf > nessus_scan_status

When the scan has completed you can view the results using the following commands:

msf > nessus_report_list

We can view a list of hosts from the report with the following command:

msf > nessus_report_hosts UID

To view further information issue the following command:

msf > nessus_report_host_ports <ip address> UID (see Figure 5)

Figure 5 nessus_report_host_ports 192.168.1.124 UID
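
To pull the completed Nessus results into the Metasploit database (so autopwn and the db_* commands can see them), the bridge provides a report import command; a hedged sketch, assuming nessus_report_get is available in Nessus Bridge 1.1 and using the report UID shown by nessus_report_list:

msf > nessus_report_get UID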

To see a list of hosts, issue the db_hosts command. If you want to remove hosts from the database, issue the db_del_host command (see Figure 6):
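
A minimal example; the address here is just the target scanned earlier and would be replaced with whatever host you want to drop:

msf > db_hosts
msf > db_del_host 192.168.1.124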

Now, with the scan complete and the host listed in the database, you can run the autopwn tool or find an exploit that will work against the box. More on this in another article next month.
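
For reference, a minimal autopwn invocation looks roughly like the following; this is a sketch using the classic Metasploit 3.x db_autopwn flags (-p matches exploits to open ports, -t lists the matching modules, -e launches them), not something covered by the figures in this article:

msf > db_autopwn -p -t -e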

Now let's take a look at using nmap within the Metasploit Framework.

To use nmap from within the Metasploit Framework, use the 'db_nmap' command to run nmap scans against targets and have the scan results stored in the database. When running on Back|Track I can issue many different nmap commands, such as db_nmap -sS -sV -T 3 -P0 -O <ip address> -D RND --packet-trace, where -sS is a TCP SYN stealth scan, -sV is a version scan, -T 3 is normal timing, -O fingerprints the operating system, -D RND uses a decoy with a random, non-reserved IP address, and --packet-trace traces the packets and data sent and received. I like to use the packet-trace feature on large scans because if the scan fails you can see why. This is a great feature to use in the msfconsole, but I can't do this on Ubuntu while connected to the postgres database as the postgres user. Why? Because I get an error saying that only the root user has the ability to use this nmap option (see Figure 7). I can, however, use db_nmap -v -sV 192.168.15.0/24 --packet-trace, and the scan runs and produces output. I can view the results with the following commands (see Figure 8):

msf > db_hosts
msf > db_services -c port,state

Figure 6 db_del_host command


Figure 7 nmap error with postgres

Now if I want to issue complex nmap scans, I can exit the msf prompt, exit the postgres user session, stop the database, start msfconsole with sudo, and use the sqlite3 database.
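
One possible sequence, sketched using the paths and init script from earlier in this article:

msf > exit
$ exit                                   (drop out of the postgres user shell)
$ sudo /etc/init.d/postgresql-8.4 stop
$ cd /opt/framework-3.6.0/msf3
$ sudo ./msfconsole
msf > db_driver sqlite3
msf > db_connect

The same command that the OS didn't allow before can now be used with no problem (see Figure 9):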

msf > db_nmap -sS -sV -T 4 -P0 -O 192.168.15.0/24 -D RND --packet-trace

Look at the difference in the results we now have when viewing information with the db_hosts and db_services -c port,state commands; compare Figure 10 with Figure 8 below.

Figure 8 db_nmap using postgres database

Figure 9 db_nmap using sqlite3


Figure 10 nmap results using sqlite3

Conclusion
This information can be useful in checking the integrity and strength of your network if you are the network security engineer for your workplace and have permission to do so. Doing this on networks that you have no authorization to be on is against the law in many, if not all, countries. For more information and some video tutorials, please visit my website at http://pbnetworks.net

On the 'Net
Link to postgres setup: http://dev.metasploit.com/redmine/projects/framework/wiki/Postgres_setup
Link to video tutorials: http://pbnetworks.net/?cmd=bbs


More Stories By David Dodd

David J. Dodd is currently in the United States, holds a current 'Top Secret' DoD clearance, and is available for consulting on various Information Assurance projects. He is a former U.S. Marine with an avionics background in electronic countermeasures systems. David has given talks at the San Diego Regional Security Conference and SDISSA, is a member of InfraGard, and contributes to Secure our eCity (http://securingourecity.org). He works for Xerox as Information Security Officer for the City of San Diego and for pbnetworks Inc. (http://pbnetworks.net), a Service Disabled Veteran Owned Small Business (SDVOSB) located in San Diego, CA, and can be contacted by emailing: dave at pbnetworks.net.
