
Metasploit Nessus Bridge on Ubuntu

Nessus is a vulnerability scanner program

Nessus is a vulnerability scanner; it is free for personal use with the Nessus HomeFeed, while the ProfessionalFeed for business use requires a fee. I will be discussing the HomeFeed version and using it with the popular Metasploit Framework. Acquire the latest HomeFeed release, Nessus-4.4.1-ubuntu1010_i386.deb, and register for an activation code. Follow the instructions listed in the documentation for installing on Ubuntu and start to configure. The Nessus daemon can't be started until Nessus has been registered and the plugin download has occurred.
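Installing the downloaded package on Ubuntu is a single dpkg step (a sketch; substitute the exact filename of the release you downloaded):

$ sudo dpkg -i Nessus-4.4.1-ubuntu1010_i386.deb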


Register the scanner with your activation code:
$ sudo /opt/nessus/bin/nessus-fetch --register 'registration code from nessus'
Add a user:
$ sudo /opt/nessus/sbin/nessus-adduser
Make a certificate:
$ sudo /opt/nessus/sbin/nessus-mkcert
Start the Nessus daemon:
$ sudo /etc/init.d/nessusd start

Open a web browser to https://localhost:8834, log in, and create a policy for your scans. I would create a number of policies based on the different systems you will be scanning; if you're scanning a Windows environment, for instance, having the Linux and BSD plugins enabled is pointless. Also make sure that you have safe checks enabled, select a port scanner to use, select credentials, select plugins (remember not to enable ones that will bounce the box), and select preferences. When finished you should have a number of policies, numbered 1 through however many you created, and you can give them names; a policy for scanning a Windows environment, for example, can be labeled windows. Now you can log out of Nessus and close the web browser.

Now open a terminal, change to the directory where Metasploit is installed, and run an update.

$ cd /opt/framework-3.6.0/msf3
$ sudo svn update

Before we start the msfconsole, let's get our database in proper order. I have used sqlite3 in the past and even did a tutorial on my website using it (http://pbnetworks.net/?cmd=bbs&id=35), and it worked fine, but it may not always work, and Metasploit now warns: 'Note that sqlite is not supported due to numerous issues. It may work, but don't count on it.' Postgres is the recommended database for Metasploit, so let's install the postgres database and libraries.

$ sudo apt-get install postgresql-8.4
$ sudo apt-get install rubygems libpq-dev
$ sudo gem install pg
$ sudo apt-get install libreadline-dev
$ sudo apt-get install libssl-dev
$ sudo apt-get install libpq5
$ sudo apt-get install ruby-dev

You will need to become the system postgres user:

$ sudo -s
# su postgres

Now you will need to create a database user:

$ createuser <user account name> -P
Enter password for new role:
Enter it again:
Shall the new role be a superuser? (y/n) n
Shall the new role be allowed to create databases? (y/n) n
Shall the new role be allowed to create more new roles? (y/n) n

Next we need to create a database:

$ createdb --owner=<user account name> msf_database
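To verify the database was created, you can list databases with the standard psql client (a quick sanity check, run while still the postgres user):

$ psql -l | grep msf_database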

Now we can start up metasploit:

:/opt/framework-3.6.0/msf3$ sudo ./msfconsole

Enter in the following commands:

msf> db_driver postgresql
msf> db_connect <user account name>:<password>@127.0.0.1:5432/msf_database
msf> db_hosts

Now before, when using sqlite3, creating and connecting to the database was easy. I would start up metasploit and issue the following commands:

msf> db_driver sqlite3
msf> db_connect

To verify if the database was connected I would issue the following command:

msf> db_hosts

If everything looked good I would have no errors and could use the db_nmap command. But sometimes I would encounter errors and it would crash. Postgres is more reliable than sqlite3, though sqlite3 is still useful, as I will describe later. Finally, go ahead and enable the database on startup by issuing the following commands:

$ cat > ~/.msf3/msfconsole.rc
db_driver postgresql
db_connect <user name account>:<password>@127.0.0.1:5432/msf_database
db_workspace -a MyProject
^D

Now the next time you fire up Metasploit your database will automatically be up and you will be connected to it. Just make sure that postgres is running; I start postgres manually before I start Metasploit (see Figure 1).

Figure 1 Notice that postgresql loads when first starting the msfconsole

$ sudo /etc/init.d/postgresql-8.4 start
$ su postgres

Now just change directory to /opt/framework-3.6.0/msf3 and start the msfconsole. Now that we have postgres as the database for Metasploit, let's start using Nessus from within Metasploit. Open up a second terminal and make sure Nessus is running; if not, load the daemon as shown earlier.
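A quick way to confirm the daemon is up is to check that something is listening on the default web port, 8834 (a simple check; the netstat flags shown are standard on Ubuntu):

$ sudo netstat -tlnp | grep 8834

With the daemon confirmed, load nessus from the msfconsole (see Figure 2):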

msf > load nessus

Now let's see what kind of commands the Nessus Bridge for Metasploit 1.1 has given us; type nessus_help (see Figure 3).


Figure 2 loading nessus from the msfconsole


Figure 3 nessus_help

msf > nessus_help

The commands are divided up into different sections labeled Generic, Reports, Scan, Plugin, User, and Policy commands. Before we can run a scan we need to connect to the nessus server by using the nessus_connect command:

msf > nessus_connect <nessus username>:<password>@localhost:8834 ok

This should connect and authenticate you. From here you can run scans, review the results, load the scan results into the database, and use the autopwn feature. Or you can view the results, find a vulnerability on a system you scanned, throw a single exploit, and get a Meterpreter shell. Depending on the environment, you may want to review your Nessus output and find the appropriate exploit to use rather than generating the noise of running autopwn. Now let's start our scan by issuing the nessus_scan_new command, which takes a policy ID (set in your Nessus policy settings), a scan name (anything descriptive), and a target IP address, as shown below.
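If you don't remember which policy ID corresponds to which policy, the bridge's Policy commands can display them (a quick check; nessus_policy_list is among the commands listed by nessus_help):

msf > nessus_policy_list

With the policy ID in hand, kick off the scan: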

msf > nessus_scan_new 1 winXP_home 192.168.1.124

To check up on the status of our scan, use the nessus_scan_status command (see Figure 4):


Figure 4 nessus_scan_status

msf > nessus_scan_status

When the scan has completed you can view the results using the following command:

msf > nessus_report_list

We can view a list of hosts from the report with the following command:

msf > nessus_report_hosts UID

To view further information issue the following command:

msf > nessus_report_host_ports <ip address> UID (see Figure 5)

Figure 5 nessus_report_host_ports 192.168.1.124 UID

To see a list of hosts in the database, issue the db_hosts command. If you want to remove hosts from the database, issue the db_del_host command (see Figure 6).
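For example, to drop a single host from the database (a sketch using this article's scan target; db_del_host takes the address of the host to remove):

msf > db_del_host 192.168.1.124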

Now, with the scan complete and the host stored in the database, you can run the autopwn tool or find an exploit that will work against the box. More on this in another article next month.
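For reference, an autopwn invocation from this era looks like the following (a sketch; -t shows the matching modules, -p selects exploits by open port, and -e launches them; note that db_autopwn was deprecated and removed in later Metasploit releases):

msf > db_autopwn -t -p -e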

Now let's take a look at using nmap within the Metasploit Framework.

To run nmap from within the Metasploit Framework, use the db_nmap command; it runs nmap scans against targets and stores the results in the database. When running on Back|Track I can issue many different nmap commands, such as db_nmap -sS -sV -T 3 -P0 -O <ip address> -D RND --packet-trace. The options break down as follows: -sS runs a TCP SYN stealth scan, -sV a version scan, and -T 3 a normal-speed scan; -O fingerprints the operating system; -D RND adds a decoy with a randomly generated, non-reserved IP address; and --packet-trace traces all packets and data sent and received. I like to use the packet-trace feature on large scans because if the scan fails you can see where. This is a great feature to use from the msfconsole, but I can't do it on Ubuntu while connected to the postgres database as the postgres user. Why? Because I get an error saying that only the root user has the ability to use these nmap options (see Figure 7). I can still use db_nmap -v -sV 192.168.15.0/24 --packet-trace, and that scan runs and produces output. I then view the results with the following commands (see Figure 8):

msf > db_hosts
msf > db_services -c port,state
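One way around the root-only nmap options (a sketch, assuming your Metasploit build supports the db_import command) is to run the privileged scan with nmap directly as root, write XML output, and import the results into the database:

$ sudo nmap -sS -sV -T 3 -P0 -O 192.168.15.0/24 -D RND -oX scan.xml
msf > db_import scan.xml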

Figure 6 db_del_host command


Figure 7 nmap error with postgres

Now if I want to issue complex nmap scans I can exit the msf prompt, exit the postgres user, stop the database, log back in with sudo, and use the sqlite3 database. The same command that the OS didn't allow before now runs with no problem (see Figure 9):

msf > db_nmap -sS -sV -T 4 -P0 -O 192.168.15.0/24 -D RND --packet-trace

Look at the difference in the results we now have when viewing the output of the db_hosts and db_services -c port,state commands; compare Figure 10 with Figure 8 below.

Figure 8 db_nmap using postgres database

Figure 9 db_nmap using sqlite3


Figure 10 nmap results using sqlite3

Conclusion
This information can be useful in checking the integrity and strength of your network if you are the network security engineer for your workplace and have permission to do so. Doing this to networks that you have no authorization to be on is against the law in many if not all countries. For more information and some video tutorials, please visit my website at http://pbnetworks.net

On the 'Net
Link to postgres setup: http://dev.metasploit.com/redmine/projects/framework/wiki/Postgres_setup
Link to video tutorials: http://pbnetworks.net/?cmd=bbs


About the Author

David J. Dodd is currently in the United States, holds a current 'Top Secret' DoD Clearance, and is available for consulting on various Information Assurance projects. He is a former U.S. Marine with an avionics background in electronic countermeasures systems. David has given talks at the San Diego Regional Security Conference and SDISSA, is a member of InfraGard, and contributes to Secure our eCity (http://securingourecity.org). He works for Xerox as Information Security Officer, City of San Diego, and for pbnetworks Inc. (http://pbnetworks.net), a Service-Disabled Veteran-Owned Small Business (SDVOSB) located in San Diego, CA. He can be contacted by emailing: dave at pbnetworks.net.
