The Army, the Web, and the Case for Intentional Emergence


Lt. Gen. Sorenson gave a Higher Order Bit talk at the Web 2.0 Summit in San Francisco back in November. I didn't make it to the Summit this year but I'm glad I got to see the video. 

I'm glad General Sorenson is thinking about how the Army's systems and methods can be improved with Web 2.0 ideas and technologies, but I wish the Army would go after the Web's most fundamental benefit: the fact that it is a platform that supports emergence. It's not just about the specific technologies; it's about the ecosystem of technology, economics, policy, and culture that supports rapid innovation on a generative platform.

I think the Army can unleash a wave of innovation at the edge by replicating the web's generativity on the battlefield, and a couple of California National Guard guys I met have proven it. They managed to get a single Linux box authorized for the SIPRNET in theater and quickly used it to build the Combat Operations Information Network (COIN), a collection of web applications that scratched a bunch of itches for their unit.

As simple as it was (a single underpowered Linux machine), once COIN was on the network it became a generative node; people lined up to get other problems solved, and it is now widely used across the theater.

I tell the rest of the story about my reaction to General Sorenson's talk, and how the Army's Battle Command System can support innovations like COIN, here at Radar. I'll just link to it rather than cross-posting the rest of it.


• • •

Getting Owned Across the Air Gap

[Image: USB drive]

I attended a fascinating talk yesterday at Black Hat given by Sinan Eren from Immunity in which he described a recent for-hire Information Operation.

In the talk he took pains to differentiate between a standard penetration test and the kinds of things they were doing; the primary differences being time scale and scope. In this case the time scale was long (though undisclosed) and the goal was compromise of some particularly sensitive data. He didn’t say but it was probably product design or source code.

To maintain a stealthy ingress they decided to avoid easily exploited client-side weaknesses and instead found something much more difficult to detect: a poorly implemented antivirus scanner on the mail transfer agent. After fingerprinting, building an equivalent MTA in their lab, and coding a unique one-time exploit of the poorly implemented AV file parser, they were in. Consolidation and expansion were done at a leisurely pace, greatly aided by the social engineering benefits of the MTA’s access to all of the email traffic. Within a reasonable period of time they were able to relationship-map many of the target’s personnel, expand to the other side of the firewall, quietly exploit a number of client machines, and gain a good understanding of who was likely to have access to the information they were looking for.

Then interesting stuff happened.

They began to find file references to the stuff they were looking for on a user workstation. The references ultimately ended up pointing to a USB drive that had been accessed at some time prior. It turns out that the target company was running a separate air-gapped internal network where they segregated development or other sensitive activities. However, one of the developers had questions that needed answering over email and was using the USB drive to carry bits and pieces across the air gap so that he could email them as attachments with his questions. Unfortunately, it sounds like the USB drive was used for backup as well and had more than just the snippets on it. After finding, testing, and deploying an exploit that would suck the contents out of the USB drive the next time it was inserted, the attackers just needed to wait until the next time the developer had a question.

Once they had the contents from the USB drive they ended their IO and reported out to their client. However, had they been dedicated to ongoing operations against the target organization, it is not inconceivable that they could have gone further than just retrieving data from the USB drive. A patient, low-and-slow continuation would probably have kept the USB-copy-and-retrieve going until it had panned out and they were no longer retrieving significant new information. With that vein mined, they might have escalated the level of detection they were willing to risk and tried to deploy an exploit across the air gap by writing to the USB drive the next time it was inserted (that’s not my idea; it was quickly suggested by members of the audience who sounded like they had a good idea of how it would be done). With a long view, a cleverly designed set of USB-transported exploits, and those occasional sneaker-enabled transits, the effective measured impedance between those two networks approaches zero.

Obviously it is interesting that the air-gapped network was as vulnerable as it turned out to be. I’m sure a government-on-government exercise like this would have a lot of people thinking hard about existing assumptions. But perhaps more interesting is the fact that almost all of the exploits used were purpose-built for this operation and so were completely invisible to AV signature matching or rules-based detection schemes. Well-trained, motivated, and aggressive internal analysts with the right tools might have discovered what was going on, but no automated tool was likely to, as there would have been no pattern or signature for it to match against.

I am curious how many “typical” intrusion attempts were discovered and warded off during the same period by the target. The “usual stuff,” by creating a high-noise baseline that keeps defenders feeling like they are accomplishing something with their automated tools, would probably help hide the signals of a determined and unique operation such as this one.

• • •

Campaigning, Technology, and Supporter Generated Content

I’m accustomed to receiving invitations to donate money to political campaigns but today I received an invitation to participate more directly in a calling campaign as part of Obama’s virtualized campaign phonebank. Of course McCain and Hillary have them too. It’s not a groundbreaking use of technology per se, but it feels groundbreaking in the political sphere where we are generally expected to passively absorb.

They aren’t really doing much more than providing a number to call and a script. Perhaps if they provided an embedded VoIP client they wouldn’t have to include this statement about costs (which I guess are essentially contributions that are beyond contribution accounting).

[Screenshot: HillaryClinton.com "Make Calls" page]

Obama’s site is noteworthy for leveraging the power of game mechanics and creating an opportunity to be recognized as an uber supporter.

[Screenshot: Barack Obama "Change We Can Believe In" phonebank page]

Though these virtual phonebanks are simple uses of technology, I hope they are a hint of the power of networks to create a more participative future political environment.

• • •

New Open Source Project Supports Contextual Collaboration


After about a month of preparation I’m thrilled to announce a new open source project to build support for contextual collaboration. The project, called rVooz (a contraction of rendezvous), can be found at www.rvooz.org.

From the rVooz web site:

rVooz is a software suite designed to make contextual connections, or “contextions,” between people who may or may not have a priori knowledge of each other. It is designed to bring people together even if they don’t have each other in their buddy lists or know each other’s phone numbers.

The rVooz suite consists of software clients that post context, a Salient Server which finds context matches, and Voozers that coordinate the connections by distributing presence or starting sessions…

To understand what this means, imagine looking at a web page and seeing all of the other people looking at that web page added to your IM client buddy list in real time (and removed when you leave). Or, in a military context, imagine that you are reviewing an airspace, a target, an area of interest, or some other context and all of the other operators working the same context (in whatever system they are using) are dynamically added to your buddy list. This is just the beginning, in addition to “contextual dynamic presence”, rVooz may also be leveraged to dynamically establish VoIP sessions without any of the parties knowing each other’s phone numbers or SIP addresses in advance.
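The context-matching step described above can be sketched in a few lines. This is a toy in-memory matcher for illustration only; it is not the real Salient server or rVooz API, and all of the names here are hypothetical:

```python
from collections import defaultdict

class ContextMatcher:
    """Toy in-memory context matcher (hypothetical; not the real rVooz API).

    Clients post (user, context_id) pairs; the matcher tracks who shares
    each context so a "voozer" could push dynamic presence updates.
    """

    def __init__(self):
        self._contexts = defaultdict(set)  # context_id -> set of users

    def post_context(self, user, context_id):
        """Register a user in a context; return the other users already there."""
        peers = set(self._contexts[context_id])
        self._contexts[context_id].add(user)
        return peers

    def leave_context(self, user, context_id):
        """Remove a user from a context (e.g. they navigated away)."""
        self._contexts[context_id].discard(user)

# Usage: two operators open the same target context
matcher = ContextMatcher()
matcher.post_context("alice", "target:grid-1234")
peers = matcher.post_context("bob", "target:grid-1234")
print(peers)  # {'alice'} -- bob's buddy list gains alice dynamically
```

In the real suite this matching would happen server-side, with the voozers translating matches into presence updates or session setup for whatever IM or VoIP system each client is using.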

The project is interesting as it may be the first free and open source project funded from day one by the Department of Defense (or it might not be, hard to tell!). It is also interesting because it has been selected to be as appealing to a non-DoD audience as it is to the DoD. If this turns out to be true (I am sure hoping so) it will open up a really interesting chapter of collaboration between two seemingly completely different domains.

The project is brand new but has the beginnings of the “Salient” back-end service in place and has started a “voozer” that works with the OpenFire Jabber/XMPP server. We’re hoping that as we continue to build out Salient the community will help us develop voozers for a variety of collaboration environments.

If it sounds interesting stop by www.rvooz.org and check it out.

• • •

Search Terms are Expressions of Interest


One of the things I like about blogging is having the opportunity to see what search terms on Google get people to my page. I like going back to the referring address to see what search term they used and where on the search results page my blog ended up.

We tend to think of Google as an output mechanism. We enter a search term and get the output; the search results are the valuable thing. However, I think the input stream of search terms is equally interesting as it is a stream of real time expressions of interest. All of these people all over the world are providing this rich stream of data that collectively says what people are interested in at that moment.

Google seems to think this too as they are now making search terms available to us in the aggregate as “trends” and are correlating the search term usage to events that may have been either a cause or an effect. Take a look at the trends page and try a search term like “war with Iran” to see what it looks like.

Unfortunately Google’s algorithms for this service seem to require a fairly high threshold of activity to permit the kind of statistical sampling they do; so, search terms like SOSCOE, NCES, TBMCS, DoD Open Source, or other things I might be interested in generally don’t meet the minimum. NECC does, but that’s because there is another NECC outside of the DoD space.

What I wish I could do is subscribe to search terms and receive an event each time one is used (or aggregated events if it is used a lot). This would be interesting for two reasons.

First, it would remove the activity threshold that their current algorithms require and would let me look at the data any way I want to. Is the use of “DoD Open Source” as a search term growing over time, for example?

Second, and perhaps more interesting, would be the ability to use these near-real-time expressions of interest as causal signals in investment models based on complex event processing. Today many investment firms are using web crawlers to essentially automate the reading of the news. They then attempt, through complex models, to correlate the release of news stories to the effect on various investments. Search term “expression of interest” streams could be the more-real-time, event-driven equivalent. For example, a regional increase in the search term “hybrid car” might be a leading indicator for increased sales at Toyota (or, it might be a lagging indicator of increased sales last month… there would be a lot to test). Comparing the event stream to the equivalent crawled search terms, it might be possible to determine how much of the event stream is leading vs. lagging the news – which is the cause and which is the effect?
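The lead-vs-lag question above can be sketched with a simple lagged-correlation test. The weekly counts here are synthetic and the predictive value is assumed purely for illustration:

```python
# Sketch: does a search-term event stream lead or lag a sales series?
# We slide one series against the other and pick the lag with the
# highest Pearson correlation. (Synthetic data; illustrative only.)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def best_lag(searches, sales, max_lag=4):
    """Positive lag => searches lead sales by that many periods."""
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = searches[:len(searches) - lag], sales[lag:]
        else:
            a, b = searches[-lag:], sales[:lag]
        if len(a) > 2:
            scores[lag] = pearson(a, b)
    return max(scores, key=scores.get), scores

# "hybrid car" searches spike two weeks before sales do
searches = [10, 12, 50, 55, 20, 15, 14, 13]
sales = [100, 98, 102, 101, 160, 170, 110, 105]
lag, scores = best_lag(searches, sales)
print(lag)  # 2 -> the search stream leads sales, a leading indicator
```

A real model would need far more data, significance testing, and the crawled-news comparison described above, but the mechanics of the test are this simple.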

If it can be proven that expression-of-interest event streams have value as leading indicators, it seems like only a matter of time before Google and other search engines productize the event stream.

Returning to the DoD space for a moment, what got me thinking about this today was the number of recent referral searches I’ve gotten for “Cyber Command.” There has been a lot in the news about the new Cyber Command lately and that is probably driving much of the interest (lagging rather than leading indicator), but it would still be really interesting to see where the searches are originating; who is expressing the interest?

• • •

JBI, RSS, and Continuous Integration

About to clear out for the weekend but decided to quickly go through some blog posts I’ve been meaning to read. I have to send props to Jeff Black (who works for the same company as I do) for his blog post on aggregating RSS feeds from multiple continuous integration environments via an RSS binding component in JBI. Cool example of how “web”-oriented JBI binding components (e.g. RSS BC, XMPP BC, SIP BC, etc.) might be used to bridge the gap between “web” and “Enterprise.”
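The aggregation pattern itself can be sketched outside of JBI. This is a minimal Python analogue, not a JBI binding component: it merges a couple of inline RSS 2.0 build feeds into one stream, newest item first (a real version would fetch each feed's URL):

```python
# Merge RSS build feeds from several CI servers into one stream,
# newest item first. Feed content is inline for illustration.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def parse_items(rss_xml):
    """Yield (timestamp, title) for each <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        when = parsedate_to_datetime(item.findtext("pubDate"))
        yield when, item.findtext("title")

def aggregate(feeds):
    """Merge many build feeds into one list, newest build first."""
    items = [it for doc in feeds for it in parse_items(doc)]
    return sorted(items, reverse=True)

feed_a = """<rss><channel>
  <item><title>build #12 passed</title>
  <pubDate>Mon, 21 Jul 2008 10:00:00 +0000</pubDate></item>
</channel></rss>"""

feed_b = """<rss><channel>
  <item><title>build #7 FAILED</title>
  <pubDate>Mon, 21 Jul 2008 11:30:00 +0000</pubDate></item>
</channel></rss>"""

for when, title in aggregate([feed_a, feed_b]):
    print(when, title)  # the newer FAILED build prints first
```

The JBI version does the same merge declaratively at the ESB layer, which is what makes it a nice bridge between "web" feeds and "Enterprise" plumbing.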

• • •

Search is Broken?

Attended Jimmy Wales's talk on Wikia Search today at OSCON.

One of his slides stated:

“Search is part of the fundamental infrastructure of the Internet. And, it is currently broken.”

I think there is a lot to worry about with Google – concentration of data, hidden algorithms, etc., and I understand the free culture political incentives for building an open and transparent search engine; but I think Wales will be hard pressed at this point to get the average search user to agree with the assertion that search is broken. On the contrary, it seems to me that it is better than it has ever been. Will the audience care?

Btw, the grub project’s use of latent cycles and bandwidth to do crawling seems really cool.

• • •

Design School

I’ve always believed that design matters. My degrees are in engineering but my bookshelves are filled with books on architecture, design, information design, fonts, layout, etc. If you see your role in this world as gathering requirements and implementing them, enjoy your stay at the bottom of the commodity heap. The best systems happen at the intersection of understanding “requirements” and design with a point of view.

I ran across this post today on design with its great embedded links. Thought I’d pass it along.

• • •