
Metasploit Nessus Bridge on Ubuntu

Nessus is a vulnerability scanner program

Nessus is a vulnerability scanner; it is free for personal use with the Nessus HomeFeed, while the feed for business use requires a fee. I will be discussing the HomeFeed version and using it with the popular Metasploit Framework. Acquire the latest HomeFeed release, Nessus-4.4.1-ubuntu1010_i386.deb, and register for the activation code. Follow the instructions listed in the documentation for installing on Ubuntu and start configuring. The Nessus daemon can't be started until Nessus has been registered and the plugin download has occurred.
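Assuming the package was downloaded to the current directory, installing it is a single dpkg command (adjust the filename to the release you actually downloaded):

$ sudo dpkg -i Nessus-4.4.1-ubuntu1010_i386.deb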


$ sudo /opt/nessus/bin/nessus-fetch --register 'registration code from nessus'
Add user
$ sudo /opt/nessus/sbin/nessus-adduser
Make cert
$ sudo /opt/nessus/sbin/nessus-mkcert
Start the Nessus daemon
$ sudo /etc/init.d/nessusd start
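Optionally, before opening the web interface you can confirm that the daemon is listening on its default port (a generic check, not part of the original instructions):

$ sudo netstat -tlnp | grep 8834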

Open a web browser to https://localhost:8834, log in, and create a policy for your scans. I would create a number of policies based on the different systems that you will be scanning; if you're scanning a Windows environment, then having the plugins for Linux and BSD enabled is pointless. Also make sure that you have safe checks enabled, select a port scanner to use, select credentials, select plugins (remember not to enable ones that will bounce the box), and select preferences. When finished you should have a number of different policies, numbered 1 through however many you created, and you can give them names; for example, a policy for scanning a Windows environment could be labeled "windows". Now you can log out of Nessus and close the web browser.

Now open a terminal, change to the directory where Metasploit is installed, and run an update.

$ cd /opt/framework-3.6.0/msf3
$ sudo svn update

Before we start the msfconsole, let's get our database in proper order. I have used sqlite3 in the past and even did a tutorial on my website using sqlite3 (http://pbnetworks.net/?cmd=bbs&id=35). It worked fine, but sometimes it fails, and the framework itself warns: 'Note that sqlite is not supported due to numerous issues. It may work, but don't count on it.' Postgres is the recommended database for Metasploit, so let's install the Postgres database and libraries.

$ sudo apt-get install postgresql-8.4
$ sudo apt-get install rubygems libpq-dev
$ sudo gem install pg
$ sudo apt-get install libreadline-dev
$ sudo apt-get install libssl-dev
$ sudo apt-get install libpq5
$ sudo apt-get install ruby-dev

You will need to become the system postgres user:

$ sudo -s
# su postgres

Now you will need to create a database user:

$ createuser <user account name> -P
Enter password for new role:
Enter it again:
Shall the new role be a superuser? (y/n) n
Shall the new role be allowed to create databases? (y/n) n
Shall the new role be allowed to create more new roles? (y/n) n

Next we need to create a database:

$ createdb --owner=<user account name> msf_database

Now we can start up Metasploit:

:/opt/framework-3.6.0/msf3$ sudo ./msfconsole

Enter the following commands:

msf> db_driver postgresql
msf> db_connect <user account name>:<password>@127.0.0.1:5432/msf_database
msf> db_hosts

Previously, when using sqlite3, creating and connecting to the database was easy. I would start up Metasploit and issue the following commands:

msf> db_driver sqlite3
msf> db_connect

To verify if the database was connected I would issue the following command:

msf> db_hosts

If everything looked good I would have no errors and I could use the db_nmap command. But sometimes I would encounter errors and it would crash. Using Postgres is more reliable than sqlite3, though sqlite3 is still useful in certain situations, as I will describe later. Finally, go ahead and enable the database on startup by issuing the following commands:

$ cat > ~/.msf3/msfconsole.rc
db_driver postgresql
db_connect <user name account>:<password>@127.0.0.1:5432/msf_database
db_workspace -a MyProject
^D

Now the next time you fire up Metasploit your database will automatically be up and you will be connected to it. Just make sure that you have Postgres running; I start Postgres manually before I start Metasploit (see Figure 1).

Figure 1 Notice that postgresql loads when first starting the msfconsole

$ sudo /etc/init.d/postgresql-8.4 start
$ su postgres

Now just change directory over to /opt/framework-3.6.0/msf3 and start the msfconsole. Now that we have Postgres as the database for Metasploit, let's start using Nessus from within Metasploit. Open up a second terminal and make sure Nessus is running; if not, start the daemon. Now from the msfconsole, load Nessus (see Figure 2):

msf > load nessus

Now let's see what kind of commands the Nessus Bridge for Metasploit 1.1 has given us; type nessus_help (see Figure 3):


Figure 2 loading nessus from the msfconsole

 

Figure 3 nessus_help

msf > nessus_help

The commands are divided up into different sections labeled Generic, Reports, Scan, Plugin, User, and Policy commands. Before we can run a scan we need to connect to the Nessus server by using the nessus_connect command:

msf > nessus_connect <nessus username>:<password>@localhost:8834 ok

This should connect and authenticate you. From here you can run scans, review the results, load the scan results into the database, and use the autopwn feature. Or you can view the results, find a vulnerability on a system you scanned, throw a single exploit, and get a Meterpreter shell. Depending on the environment, you may want to review your Nessus output and find the appropriate exploit to use instead of generating the noise of running autopwn. Now let's start our scan by issuing the nessus_scan_new command, which takes the form nessus_scan_new <policy id> (set in your Nessus policy settings) <scan name> (anything generic) <target> (IP address):

msf > nessus_scan_new 1 winXP_home 192.168.1.124

To check on the status of our scan, use the nessus_scan_status command (see Figure 4):


Figure 4 nessus_scan_status

msf > nessus_scan_status

When the scan has completed you can view the available reports using the following command:

msf > nessus_report_list

We can view a list of hosts from the report with the following command:

msf > nessus_report_hosts <report UID>

To view further information issue the following command:

msf > nessus_report_host_ports <ip address> <report UID> (see Figure 5)

Figure 5 nessus_report_host_ports 192.168.1.124 UID

To see a list of hosts, issue the db_hosts command. If you want to remove hosts from the database, issue the db_del_host command (see Figure 6).
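Using the scan target from earlier as the address to drop (db_del_host takes the address of the host to remove), the sequence is simply:

msf > db_hosts
msf > db_del_host 192.168.1.124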

Now with the scan complete and the host stored in the database, you can run the autopwn tool or find an exploit that will work against the box. More on this in another article next month.
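As a quick preview, the autopwn pass typically came down to a single command in this generation of the framework; the options shown here (match modules by open ports, show the matches, and launch the exploits) are an illustrative sketch and may differ between framework releases:

msf > db_autopwn -p -t -e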

Now let's take a look at using nmap within the Metasploit Framework.

To run nmap from within the Metasploit Framework, use the db_nmap command, which runs nmap scans against targets and stores the results in the database. When running on Back|Track I can issue many different nmap commands, such as db_nmap -sS -sV -T 3 -P0 -O <ip address> -D RND --packet-trace. The options break down as follows: -sS TCP SYN stealth scan, -sV version scan, -T 3 normal timing, -P0 skip host discovery, -O detect the operating system, -D RND use a decoy with a randomly generated, non-reserved IP address, and finally --packet-trace traces the packets and data sent and received. I like to use the packet-trace feature on large scans because if something fails you can see where. This is a great feature to use from the msfconsole, but I can't do it when running on Ubuntu and connected to the Postgres database as the postgres user. Why? Because I get an error saying that only the root user has the ability to use this nmap option (see Figure 7). I can use 'db_nmap -v -sV 192.168.15.0/24 --packet-trace', and that scan runs and produces output. I then view the results with the following commands (see Figure 8):

msf > db_hosts
msf > db_services -c port,state

Figure 6 db_del_host command

 

Figure 7 nmap error with postgres

Now if I want to issue complex nmap scans, I can exit the msf prompt, drop out of the postgres user, stop the database, log back in with sudo, and use the sqlite3 database. The same command that the OS didn't allow me to use before now runs with no problem (see Figure 9):

msf > db_nmap -sS -sV -T 4 -P0 -O 192.168.15.0/24 -D RND --packet-trace
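For reference, the switch-over itself (leaving the msfconsole, dropping out of the postgres user, stopping Postgres, and reconnecting with sqlite3) can be sketched as follows, assuming the same paths and init script names used earlier in this article:

msf > exit
$ exit                                   (leave the postgres user shell)
$ sudo /etc/init.d/postgresql-8.4 stop
$ cd /opt/framework-3.6.0/msf3
$ sudo ./msfconsole
msf > db_driver sqlite3
msf > db_connect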

Look at the difference in the results we now have after viewing the information from the db_hosts and db_services -c port,state commands; compare Figure 10 with Figure 8 below.

Figure 8 db_nmap using postgres database

Figure 9 db_nmap using sqlite3

 

Figure 10 nmap results using sqlite3

Conclusion
This information can be useful in checking the integrity and strength of your network if you are the network security engineer for your workplace and have permission to do so. Doing this to networks that you have no authorization to be on is against the law in many, if not all, countries. For more information and some video tutorials, please visit my website at http://pbnetworks.net

On the 'Net
Link to postgres setup: http://dev.metasploit.com/redmine/projects/framework/wiki/Postgres_setup
Link to video tutorials: http://pbnetworks.net/?cmd=bbs


More Stories By David Dodd

David J. Dodd is currently in the United States, holds an active 'Top Secret' DoD clearance, and is available for consulting on various information assurance projects. He is a former U.S. Marine with an avionics background in electronic countermeasures systems. David has given talks at the San Diego Regional Security Conference and SDISSA, is a member of InfraGard, and contributes to Secure our eCity (http://securingourecity.org). He works for Xerox as Information Security Officer for the City of San Diego and for pbnetworks Inc. (http://pbnetworks.net), a Service-Disabled Veteran-Owned Small Business (SDVOSB) located in San Diego, CA. He can be contacted by emailing dave at pbnetworks.net.
