When I heard about a UPI report that the US carrier George Washington had been attacked by a Chinese warship and damaged, it smelled fishy, so I started poking around the interwebs.
United Press International's website and Twitter account were hacked Friday afternoon, with someone attempting to publish false stories.
It started on Twitter, where six fake headlines were posted in about 10 minutes, starting about 1:20 p.m. Some of them were about the Federal Reserve; others contained a false report that the USS George Washington had been attacked.
The aircraft carrier George Washington has not been attacked, and World War III has not begun, despite what tweets from United Press International say, the Navy has confirmed.
The carrier is in port, not in the South China Sea, the Navy told Military Times on Friday.
While everyone waits for Apple to release a patch for the ShellShock bug, one of the maintainers of Bash helped detail how to patch bash (and sh) on OS X to close the vulnerability. This comes from the helpful Apple section of Stack Exchange.
NOTE: To perform this patch you MUST have sudo privileges on your machine — if not, you won't be able to move the new files into the required location.
Testing to see if you are vulnerable
First things first: see if you are vulnerable by checking your version of bash. The desired version is GNU bash, version 3.2.54.
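A quick way to check is to ask bash itself (the exact build suffix will vary from machine to machine):

bash --version
# looking for: GNU bash, version 3.2.54(1)-release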
If you are not seeing that, then you should check to see if you have the vuln. When I checked my updated version of OS X Mavericks, I was on bash 3.2.52 and it was vulnerable to the exploit.
If you see the word 'vulnerable' when you run this, you're at risk!

env x='() { :;}; echo vulnerable' bash -c 'echo hello'

This is a PASS (OK):

env x='() { :;}; echo vulnerable' bash -c 'echo hello'
hello

This is a FAIL:

env x='() { :;}; echo vulnerable' bash -c 'echo hello'
vulnerable
hello
Time to get down to patching
This process is going to require you to do some command line work, namely compiling bash and replacing the bad versions with the good ones. If you are NOT comfortable doing that, it's best to wait for Apple to create the installable patch. If your geek level is above basic, continue forward:
First, agree to using xcodebuild
If you have not run xcodebuild before, you are going to need to run it and agree to the terms before you'll be able to finish this build. I recommend you run it NOW and get that out of the way:

xcodebuild
Set environment to NOT auto-include
This capability is part of the reason the exploit exists. It's highly recommended you turn this on before starting the build. Ignore at your own peril. This parameter is used in the patch stage below to select between the two patches:
export ADD_IMPORT_FUNCTIONS_PATCH=YES
Make a place to build the new objects
I dropped everything into the directory 'new-bash'… and did it thus. NOTE: I am not using sudo (yet):
mkdir new-bash
Download bash-92 source
Move to that directory, download the bash-92 source using good old curl, and extract the compressed tarball:
cd new-bash
curl https://opensource.apple.com/tarballs/bash/bash-92.tar.gz | tar zxf -
Get the patch packages next
CD to the source directory for bash, and then download 2 patch packages:
cd bash-92/bash-3.2
curl https://ftp.gnu.org/pub/gnu/bash/bash-3.2-patches/bash32-052 | patch -p0
curl https://ftp.gnu.org/pub/gnu/bash/bash-3.2-patches/bash32-053 | patch -p0
Start creating the patches
Execute these two commands, in order, to download and apply the two patches:
[ "$ADD_IMPORT_FUNCTIONS_PATCH" == "YES" ] && curl http://alblue.bandlem.com/import_functions.patch | patch -p0
[ "$ADD_IMPORT_FUNCTIONS_PATCH" == "YES" ] || curl https://ftp.gnu.org/pub/gnu/bash/bash-3.2-patches/bash32-054 | patch -p0
Start building!
Traverse back up the tree (back to the top of the extracted bash-92 directory) and start the build. It is recommended that you NOT run xcodebuild with sudo at this point; doing so could enable root powers in the shell, and that is something that you certainly do not want!
xcodebuild
OK.. PATCH MADE!
At this point you have new bash and sh binaries built to replace the exploitable ones. Back up your old versions and move the new ones into place (a sketch of that swap follows the version check below), and you are now safe.
# Test your versions:
build/Release/bash --version # you should see "version 3.2.54(1)-release"
build/Release/sh --version # you should see "version 3.2.54(1)-release"
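Here is one way to do that swap (a minimal sketch: run it from the same directory where xcodebuild left its build/Release output, and it assumes the stock /bin/bash and /bin/sh locations; adjust if yours differ):

sudo cp /bin/bash /bin/bash.bak            # keep the vulnerable originals around, just in case
sudo cp /bin/sh /bin/sh.bak
sudo cp build/Release/bash /bin/bash       # drop the freshly built binaries into place
sudo cp build/Release/sh /bin/sh
sudo chmod a-x /bin/bash.bak /bin/sh.bak   # optional: keep anything from executing the old copies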
Now clean up the local mess
Now the local directory where you built bash is no longer needed. I don't like to leave cruft around on my system that creates a confusing environment, so removing the source tree is my last task. You can leave it if you like, but if I need to do this again I'm going to perform a full fresh rebuild, so it will not be re-used.
cd
rm -rf new-bash
YOU ARE DONE!
BIG HUGE THANKS TO ALL THAT DID THE REAL WORK HERE.. the people maintaining bash, the people that post awesome solutions to StackExchange and all the other fantastic resources on the net!
Over the last couple of days, most (hopefully all) of the Web clients (browsers) have been updated to revoke the CA (Certificate Authority) for DigiNotar. It's important that you perform this update.
The reason is simple. They were hacked last week, and several bogus CERTs (SSL private/public key generated certificates used in secure HTTP communications) were issued for some very high profile websites.
You can read the gory geeky details on a recent Slashdot thread [ HERE ]. Additional information about the CA revocation can be read [ HERE ].
If you hadn’t already manually deleted the CA from your mail and web browsing applications, be sure to apply this update. If you have not been automatically notified of an update (SeaMonkey, Firefox and Thunderbird have all updated in the last 72 hours) I recommend you head to the home website of your favorite browser and see if a security update is available.
If you are still reading, you must be asking yourself, “Why is this important?”. It’s quite simple really (and actually rather complex, but I’ll try not to baffle with technobabble).
Hopefully, any time you communicate with a website that uses any type of password, you are ensuring you are communicating using SSL (Secure Sockets Layer), which applies a certain degree of security by encrypting your traffic. The mechanics of this require that the website you are communicating with has a valid SSL certificate issued for, and properly installed on, their website.
Now, anyone can create their own SSL certificate by running a couple of X509 / keygen commands and, with a few lines of configuration, get it installed. Sounds simple enough still, right? The problem is that unless there is a centralized repository of people trusted to make these certificates, *anyone* could create a certificate for, say, BankofAmerica.com, install it on *their* webserver, and apply some other social engineering techniques to fool you into thinking you are securely communicating with the bank, when in fact you are sending your data to, or even through (also known as a Man-in-the-Middle attack), some third party. With a few other hacks, they might even take over full DNS control of the BankofAmerica.com domain (this happened to UPS.COM just this past weekend, in case you wonder how that can happen). Bottom line: you want to know for CERTAIN that the site you are communicating with has a good, valid CERT issued by a reputable CERT-issuing authority, not just some no-name criminal somewhere in Eastern Europe.
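Just to illustrate how low the technical bar is, a single openssl command will mint a self-signed certificate claiming to be anyone at all. This is only a sketch (the CN is obviously bogus), and it is exactly why the trust list matters: no browser will accept this cert, because no CA on the list signed it.

openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
  -keyout key.pem -out cert.pem \
  -subj "/CN=www.bankofamerica.com"   # any name you like; getting it TRUSTED is the hard part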
This trust is based on vetted, trusted Certificate Authorities. If you want to look at the list of these trusted CAs in your browser, it's going to look a little mind-boggling. Anyone on that list who issues a CERT for a website is automatically trusted by your software (and everyone else's software too, unless you manually remove / revoke CAs yourself, like I've done), so if anyone on that list has a compromised SSL signing system, then any CERT generated by that authority can no longer be trusted. This is the case with DigiNotar.
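If you are curious what that list looks like on OS X, the system-wide roots (the ones Safari and friends consult; Mozilla apps ship their own list) can be dumped from the command line with something along these lines:

security find-certificate -a /System/Library/Keychains/SystemRootCertificates.keychain | grep '"labl"'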
People far better at writing than I explain this further here: DigiNotar certificate authority breach: Why it matters[ link ]. I recommend you read it and learn a little something about how the web really works. I also suggest that if you are in business and depend on your website, you get some PROTECTION for your DNS with a product like this!: ActiveTrust DNS
If you managed to catch my previous post on the iPhone Cellular Location Tracking Controversy, you saw that I did a little more research into the issue than most of the other articles. Or, at least, I showed my work. So why post today? I'm going to walk through a check to see if turning off 'Location Services', and *not* approving any of the services to use it for a week, stopped or at least reduced the amount of data recorded.
Here is what I did…. and at the end of the article, we’ll both know the results.
Syncing the iPhone to laptop
Connected my phone to the laptop at 08:49, and specifically told iTunes to sync. Once that was done, I changed to the iPhone backup directory:
Looking for updated files
:Backup me$ cd /Users/me/Library/Application\ Support/MobileSync/Backup
Next, I checked to see which directories had been most recently updated:
:Backup me$ ls -ltr
total 0
drwxr-xr-x 1929 me staff 65586 Mar 16 2010 21562bef54882a56a05f4047db0dd1ea95783af1
drwxr-xr-x 1323 me staff 44982 Apr 2 08:16 d56742670a5e045f4a76ebb7fd93c728054c0ebe-20110402-081551
drwxr-xr-x 719 me staff 24446 Apr 25 16:33 669ed5e78e2afe06caad469294edd80d4b3261b9
drwxr-xr-x 1371 me staff 46614 Apr 26 08:49 d56742670a5e045f4a76ebb7fd93c728054c0ebe
Next, I located the Manifest files that contain the filename for the consolidated.db data file. This netted 4 manifest files. The one I am most interested in is the one created during the sync at 08:49 this morning.
:Backup me$ find . -name 'Manifest.mbdb*' -exec ls -l {} \;
-rw-r--r-- 1 me staff 226086 Apr 2 08:16 ./d56742670a5e045f4a76ebb7fd93c728054c0ebe-20110402-081551/Manifest.mbdb
-rw-r--r-- 1 me staff 167152 Apr 2 12:21 ./669ed5e78e2afe06caad469294edd80d4b3261b9/Manifest.mbdb
-rw-r--r-- 1 me staff 167022 Apr 25 16:33 ./669ed5e78e2afe06caad469294edd80d4b3261b9/Snapshot/Manifest.mbdb
-rw-r--r-- 1 me staff 232090 Apr 26 08:49 ./d56742670a5e045f4a76ebb7fd93c728054c0ebe/Manifest.mbdb
Changing to that directory, I ran the Python script that lists the contents of the manifest db and its data files, looking for the true name of consolidated.db. There are over 1300 data files in that directory. It looks like the data file name remains unchanged (4096c9ec676f2847dc283405900e284a7c815836).
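For the record, the search boiled down to something like this (assuming the mbdb-listing Python script from my original post is saved as list_mbdb.py in my home directory and run from inside the backup directory):

python ~/list_mbdb.py | grep consolidated.db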
Now that I know where the file is, I'm going to copy it to my home directory and compare it to the other DB files I've saved off over the last week. Looking at the list, you can see that the size of the file has not changed since I started tracking this last week. Now, that does not necessarily mean there are no new records in the table, but it's a pretty decent indication that there aren't. An examination of the table, and comparison to the data from the last extract, will quickly tell the tale!
:d56742670a5e045f4a76ebb7fd93c728054c0ebe me$ cp 4096c9ec676f2847dc283405900e284a7c815836 ~/iPhoneTracking.3.db
:~ me$ ls -ltr iPhone*
-rw-r--r-- 1 me staff 19128320 Apr 21 10:19 iPhoneLocation.1.db
-rw-r--r-- 1 me staff 225280 Apr 21 10:21 iPhoneLocation.2.db
-rw-r--r-- 1 me staff 19128320 Apr 21 13:23 iPhoneLocation.4.db
-rw-r--r-- 1 me staff 19128320 Apr 21 15:13 iPhoneTracking.1.db
-rw-r--r-- 1 me staff 19128320 Apr 22 14:24 iPhoneTracking.2.db
-rw-r--r-- 1 me staff 19128320 Apr 26 09:16 iPhoneLocation.3.db
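A cheap sanity check, before digging into the records themselves, is to compare CellLocation row counts between the 22-APRIL snapshot (iPhoneTracking.2.db) and the copy just made (iPhoneTracking.3.db); if logging had continued, the newer copy should have more rows:

sqlite3 iPhoneTracking.2.db "select count(*) from CellLocation;"
sqlite3 iPhoneTracking.3.db "select count(*) from CellLocation;"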
Comparing the data. 22-APRIL vs. 26-APRIL.
I simply turned off Location Services on my phone last week, after storing the snapshot on 22-APRIL. During that time I used several apps that use some sort of location information, and in the cases where I was prompted to provide location services, I declined.
But first things first. I think it's important to show that there is more than just the controversial CellLocation data in this database file; consolidated.db holds a number of other tables as well.
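You can pull the full table list yourself with sqlite3's built-in .tables command, run against the copy made above:

sqlite3 iPhoneTracking.3.db ".tables"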
The table that is supposed to contain the data is CellLocation. Let's show what is in the last 5 records entered in that table. Here are the fields, then the last 5 records:
sqlite> .header ON
sqlite> select * from CellLocation limit 1;
MCC MNC LAC CI Timestamp Latitude Longitude HorizontalAccuracy Altitude VerticalAccuracy Speed Course Confidence
22-APRIL
sqlite> select Timestamp,Latitude,Longitude,HorizontalAccuracy,Speed,Confidence from CellLocation order by Timestamp desc limit 5;
Timestamp|Latitude|Longitude|HorizontalAccuracy|Speed|Confidence
325110108.640637|47.24654626|-122.43737727|2164.0|-1.0|70
325110108.640637|47.24717628|-122.43819308|500.0|-1.0|50
325110108.640637|47.24570667|-122.43808859|2138.0|-1.0|50
325110108.640637|47.2472279|-122.43625974|1550.0|-1.0|70
325110108.640637|47.24575543|-122.43641382|500.0|-1.0|50
26-APRIL
:~ me$ sqlite3 iPhoneTracking.3.db
SQLite version 3.6.12
sqlite> .header ON
sqlite> select Timestamp,Latitude,Longitude,HorizontalAccuracy,Speed,Confidence from CellLocation order by Timestamp desc limit 5;
Timestamp|Latitude|Longitude|HorizontalAccuracy|Speed|Confidence
325110108.640637|47.24654626|-122.43737727|2164.0|-1.0|70
325110108.640637|47.24717628|-122.43819308|500.0|-1.0|50
325110108.640637|47.24570667|-122.43808859|2138.0|-1.0|50
325110108.640637|47.2472279|-122.43625974|1550.0|-1.0|70
325110108.640637|47.24575543|-122.43641382|500.0|-1.0|50
CONCLUSION
Based on the evidence collected from my phone, turning LOCATION SERVICES OFF DOES STOP CellLocation LOGGING! So.. there you have it.. my research. I've shown my work. Explained my methodology. You can trust me or not, but if you have some evidence, beyond some huckster's article, that indicates I'm wrong, PLEASE let me know! If I missed something, I want to correct my research and conclusion.
The Great iPhone Location File Controversy – is it really a problem?
Unless you have been living under a rock, or use a Windows Mobile device (no difference), you no doubt have heard the reports floating around in the last day or so about the infamous consolidated.db location history file that is maintained on every 3G-enabled iOS 4 device, such as the iPhone and 3G iPad.
In fact, the number of articles discussing this issue (and now this one is also joining the fray) is extensive.
The article references this page [iPhoneTracker], where you can download an app, or the source code, to compile a program to read your consolidated.db file. However, first you must find it! The instructions for finding the file were not completely accurate; there are some references with path misspellings, etc. So, I'm going to re-do those pages, show you how I did it, include the Python script I ran to read the database file, and finally walk through the steps I took to move the file, compare snapshots of the file, and see if turning off Location Services, as is purported, solves this problem.
First order of business is to get the App. Since I’m smart and use a MAC (and LINUX, but I don’t sync my phone to LINUX, so that’s not going to be discussed any further) I grabbed this zip file, extracted the app inside, and dropped it in my Applications pane.
Running the program, I see this:
As you can see, I don't do much wandering around. Zoomed in, you can see some of the places along the West Coast I have wandered since I purchased the current device. The app will display ALL data, or you can have it display just a specific week's worth of data, from any time frame on the device. Here is a shot of some travel I did during Thanksgiving 2010.
To say the dataset is full of inaccuracies is an understatement. Just look at this map of the last two days of travel. I can assure you, I was not in Sumner, and I have not been in Olympia or any of those other places on the eastern side of the Sound in a long time, much less the last two days:
And yet another, from the middle of last year. You can see a lot of 'hits' on Vancouver Island. However, I have NEVER been there. Ever!
OK, so, I hope you can see that the use of the data has its limitations. It's not very accurate. In fact, it's pretty inaccurate in enough cases to make its utility dubious.
But I wanted to know more, so I delved deeper into the files and went in search of the nefarious SQLite file itself. First stop was this page, where I grabbed a script, applied some mentioned patches, located the backups directory, and found my consolidated location files (I found 4).
Here is the patched Python script:
#!/usr/bin/env python
import sys

def getint(data, offset, intsize):
    """Retrieve an integer (big-endian) and new offset from the current offset"""
    value = 0
    while intsize > 0:
        value = (value<<8) + ord(data[offset])
        offset = offset + 1
        intsize = intsize - 1
    return value, offset

def getstring(data, offset):
    """Retrieve a string and new offset from the current offset into the data"""
    if data[offset] == chr(0xFF) and data[offset+1] == chr(0xFF):
        return '', offset+2 # Blank string
    length, offset = getint(data, offset, 2) # 2-byte length
    value = data[offset:offset+length]
    return value, (offset + length)

def process_mbdb_file(filename):
    mbdb = {} # Map offset of info in this file => file info
    data = open(filename).read()
    if data[0:4] != "mbdb": raise Exception("This does not look like an MBDB file")
    offset = 4
    offset = offset + 2 # value x05 x00, not sure what this is
    while offset < len(data):
        fileinfo = {}
        fileinfo['start_offset'] = offset
        fileinfo['domain'], offset = getstring(data, offset)
        fileinfo['filename'], offset = getstring(data, offset)
        fileinfo['linktarget'], offset = getstring(data, offset)
        fileinfo['datahash'], offset = getstring(data, offset)
        fileinfo['unknown1'], offset = getstring(data, offset)
        fileinfo['mode'], offset = getint(data, offset, 2)
        fileinfo['unknown2'], offset = getint(data, offset, 4)
        fileinfo['unknown3'], offset = getint(data, offset, 4)
        fileinfo['userid'], offset = getint(data, offset, 4)
        fileinfo['groupid'], offset = getint(data, offset, 4)
        fileinfo['mtime'], offset = getint(data, offset, 4)
        fileinfo['atime'], offset = getint(data, offset, 4)
        fileinfo['ctime'], offset = getint(data, offset, 4)
        fileinfo['filelen'], offset = getint(data, offset, 8)
        fileinfo['flag'], offset = getint(data, offset, 1)
        fileinfo['numprops'], offset = getint(data, offset, 1)
        fileinfo['properties'] = {}
        for ii in range(fileinfo['numprops']):
            propname, offset = getstring(data, offset)
            propval, offset = getstring(data, offset)
            fileinfo['properties'][propname] = propval
        mbdb[fileinfo['start_offset']] = fileinfo
    return mbdb

def process_mbdx_file(filename):
    mbdx = {} # Map offset of info in the MBDB file => fileID string
    data = open(filename).read()
    if data[0:4] != "mbdx": raise Exception("This does not look like an MBDX file")
    offset = 4
    offset = offset + 2 # value 0x02 0x00, not sure what this is
    filecount, offset = getint(data, offset, 4) # 4-byte count of records
    while offset < len(data):
        # 26 byte record, made up of ...
        fileID = data[offset:offset+20] # 20 bytes of fileID
        fileID_string = ''.join(['%02x' % ord(b) for b in fileID])
        offset = offset + 20
        mbdb_offset, offset = getint(data, offset, 4) # 4-byte offset field
        mbdb_offset = mbdb_offset + 6 # Add 6 to get past prolog
        mode, offset = getint(data, offset, 2) # 2-byte mode field
        mbdx[mbdb_offset] = fileID_string
    return mbdx

def modestr(val):
    def mode(val):
        if (val & 0x4): r = 'r'
        else: r = '-'
        if (val & 0x2): w = 'w'
        else: w = '-'
        if (val & 0x1): x = 'x'
        else: x = '-'
        return r+w+x
    return mode(val>>6) + mode((val>>3)) + mode(val)

def fileinfo_str(f, verbose=False):
    if not verbose: return "(%s)%s::%s" % (f['fileID'], f['domain'], f['filename'])
    if (f['mode'] & 0xE000) == 0xA000: type = 'l' # symlink
    elif (f['mode'] & 0xE000) == 0x8000: type = '-' # file
    elif (f['mode'] & 0xE000) == 0x4000: type = 'd' # dir
    else:
        print >> sys.stderr, "Unknown file type %04x for %s" % (f['mode'], fileinfo_str(f, False))
        type = '?' # unknown
    info = ("%s%s %08x %08x %7d %10d %10d %10d (%s)%s::%s" %
            (type, modestr(f['mode']&0x0FFF), f['userid'], f['groupid'], f['filelen'],
             f['mtime'], f['atime'], f['ctime'], f['fileID'], f['domain'], f['filename']))
    if type == 'l': info = info + ' -> ' + f['linktarget'] # symlink destination
    for name, value in f['properties'].items(): # extra properties
        info = info + ' ' + name + '=' + repr(value)
    return info

verbose = True
if __name__ == '__main__':
    mbdb = process_mbdb_file("Manifest.mbdb")
    mbdx = process_mbdx_file("Manifest.mbdx")
    sizes = {}
    for offset, fileinfo in mbdb.items():
        if offset in mbdx:
            fileinfo['fileID'] = mbdx[offset]
        else:
            fileinfo['fileID'] = ""
            print >> sys.stderr, "No fileID found for %s" % fileinfo_str(fileinfo)
        print fileinfo_str(fileinfo, verbose)
        if (fileinfo['mode'] & 0xE000) == 0x8000:
            sizes[fileinfo['domain']] = sizes.get(fileinfo['domain'],0) + fileinfo['filelen']
    for domain in sorted(sizes, key=sizes.get):
        print "%-60s %11d (%dMB)" % (domain, sizes[domain], int(sizes[domain]/1024/1024))
I placed the script in my home directory for now. Later I’ll move it off to the Applications directory. Setting the script to executable, I then set out to locate the manifest db files. To do this, I leveraged the find utility (don’t worry Windoze users… you’re not missing this utility, you never had it in the first place). I thus located these 3 manifest db files. The one I’m most interested in is dated today:
Now it’s time to run the python script, and grep for the filename I need.
Filename in hand, I check and verify its existence. The important part of the data returned is this:
That number is the real filename. Checking that, I see that the file is there. I made a copy of that file available in my home directory (using cp to copy it… that means 'copy' for you Windoze users reading from your bunkers).
Word on the street is that this is a SQLite database file. I should be able to confirm that with a simple strings test, and the rumor is confirmed:
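Two quick ways to check, using the hashed filename the manifest points at (a real SQLite file begins with the literal string 'SQLite format 3'):

file 4096c9ec676f2847dc283405900e284a7c815836
strings 4096c9ec676f2847dc283405900e284a7c815836 | head -1   # should print: SQLite format 3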
I opened the database file and selected the first 5 and last 30 records from the CellLocation table (rumored to contain the data of interest).
Now, the time stamping gets a little tricky.. it's seconds since January 01, 2001 (I really don't want to do the GMT / epoch offsets right now), but I don't need to worry about that; all I really care about is the LAST location the phone recorded for me. If it did, in fact, honor my demand to turn OFF Location Services, my last point of origin should be in or north of Purdy.
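(For the record, if you ever do want to convert one of those timestamps, OS X's date command does it in one line. 978307200 is the Unix time of the January 01, 2001 epoch Apple counts from, and the sample value is one of the timestamps from the tables above.)

date -u -r $((978307200 + 325110108))
# Thu Apr 21 19:41:48 UTC 2011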
Checking this Latitude and Longitude with Lougle Maps (OK, so it's a corny movie reference, move on with your big bad selves)… I see….
Well, this does not bode well for the Apple 'researchers' who say turning off Location Services solves this problem. As far as I can tell, it DOES NOT. I'll be doing a little more research on this myself later, when I have time to verify the timestamps. But for now it looks like, whether you like it or not.. Big Brother Steve Jobs is WATCHING!!!!
iOS 4 devices quietly track, store users' locations – iLounge
Apr 20, 2011 … iLounge news discussing the iOS 4 devices quietly track, store users' locations. Find more iPad news from leading independent iPod, iPhone, …
www.ilounge.com/…/ios–4–devices-quietly-track-store-users-locations/

Apple tracking location of iOS4 device users, researchers say …
Apr 20, 2011 … A team of researchers have discovered that iOS4 is secretly obtaining your location and recording it to a hidden file, raising obvious …
www.betanews.com/article/Apple-tracking…iOS4–device…/1303319892

Got an iPhone or 3G iPad? Apple is recording your moves – O'Reilly …
Apr 20, 2011 … Ever since iOS 4 arrived, your device has been storing a long list of …
radar.oreilly.com/2011/04/apple-location-tracking.html
I recently received a link to this analysis of the crime-ware. Pretty sophisticated!!! The Conficker Cabal is busy trying to measure its function and effectiveness. If you have the time, and the stomach for a tech article, I suggest you read this!
Wow, I can't believe I can still access the web?!?! It's already April 1st in Australia (right now it's Wednesday, 4:50 AM in Melbourne) and the entire internet has not collapsed!
I’m trying my best to act shocked but… I’m not a trained actor. Maybe a few hours into April 1st is too early to call it but.. frankly.. I stand by my first post on this.. much ado about NOTHING!
For entertainment factor, here are some more alarmist articles on the ‘threat’
Researchers in Toronto released a report this weekend regarding the discovery of a massive cyber-espionage and data theft network that appears to have 3 of its 4 Command-and-Control (C&C) servers located in China.
Vast Spy System Loots Computers in 103 Countries
By JOHN MARKOFF
Published: March 28, 2009
TORONTO — A vast electronic spying operation has infiltrated computers and has stolen documents from hundreds of government and private offices around the world, including those of the Dalai Lama, Canadian researchers have concluded.
Details of the exploit vector are not exactly spelled out in the article, but it would appear that this software infects computers and is capable of monitoring email and other traffic. By the description, it sounds like the malware/trojan/crimeware employs a network sniffer to watch traffic I/O on the infected machine, sending interesting data back to one (or more) of the C&C systems. The researchers also indicated that they stumbled upon some of this by accident, and there could be other capabilities of the network not yet exposed.
I plan to look into this further to see what types of systems have been infected.