NCW Conference – first thoughts

I got down to DC this week to attend two days of the Network Centric Warfare conference. There were about 700 people in the audience, so it is still a pretty well-subscribed event.

I only have time for a quick post tonight (I plan to post more later) so just a few highlights…

Terry Pudas, Deputy Assistant Sec Def for Transformation, focused on the fundamentals of transformation – information for mass. His discussion was notable because of the focus he placed on the 3000.05 “HADR Directive” that elevates stability operations to a core military mission. This was the only mention of this directive that I noticed despite the impact it seems to be having.

Mr. Pudas also made an interesting observation comparing the fight for information superiority to the fight for air superiority. I will try to come back to this in another post because I think it is a powerful analogy that probably doesn’t get enough attention.

Quote of the Conference came from Gen Barry McCaffrey (ret) on the day of the SOTU speech: “Tonight’s speech is the most dead on arrival speech given in this town in 25 years.”

Second-best quote from Gen McCaffrey: “Don’t threaten people in public,” in reference to the administration’s current round of sabre-rattling in the Persian Gulf.

And what the heck, one more good Gen McCaffrey quote: “Leaders have to be where the situation is most dangerous and confusing, not where communications are best.” He made that point in reference to the incredible situational awareness and communications power of the modern TOC, which, because of the static nature of our current conflict, has unproven mobility. The slow arrival of Mounted Battle Command on the Move is doing little to allay those concerns.

Congressman Saxton gave the room full of defense contractors a strongly delivered warning that could be summarized as “stop over-promising and under-delivering netcentricity”:
– “The hype of netcentric systems outweighs the capabilities of current systems.”
– “Combatant commanders are not active participants in defining these systems.”
– “Demonstrate real results in reasonable periods of time.”
– “…promises repeatedly fall short of reality.”
– “I’m continuously promised that perfect battlefield intelligence is just around the corner.”

He then went on to point out that the Army’s “reset” program, which will require $17B up front and then $13B per year for as long as this conflict goes on (plus an additional two years after it ends) to fix and replace all of the Army’s broken equipment, makes this a difficult environment for transformational programs such as FCS and LCS, especially given all of the broken promises.

I’ll post more later but, quickly, one of the highlights for me was a presentation by LTC James Buck on Stryker employment in Iraq in which he compared and contrasted his term “network enabled” with Network Centricity.

“Network Enabled” ~ “how the network helps me do my job”
“Network Centric” ~ “when other people make me feed the beast”

• • •

Communication Simplification

[Image: the FedBizOpps announcement, an unbroken mass of text]

A colleague forwarded me this FedBizOpps announcement recently and I couldn’t help but wonder (again) how much it costs the Department of Defense to rely on such opaque communications. If communication is intended to convey meaning from producer to audience, is this the best way to do it?

This massive glom of text is nearly impossible to decipher. The absence of white space and the lack of paragraph structure make readability so poor as to be funny.

Unfortunately, these announcements are often written in such broad language that it is very difficult to understand what the author is looking to buy even if you can get past the readability issues. At the end of the day you try to pick out key words and then make some phone calls if there was enough there to pique your interest; the obfuscation in these things makes it hard to do much more. So I wonder: are the people who write these things getting what they want from them (assuming they actually want their audience to understand what they are asking for)?

A few jobs ago I hired a company in NYC called Siegel+Gale that at the time had made a name for its “Simplification” practice. They redesigned complex documents like the 1040EZ form or Merrill Lynch statements to make them much more understandable and usable. It would be a great experiment to see what they could do with FedBizOpps announcements; not just with readability, but also with organization, distribution, etc.

I would love to see FedBizOpps and other DoD-related organizations take a page from the simplification book and orient this stuff for readability. These things aren’t being sent as USMTF and they aren’t in XML, so I really don’t see why they have to look like this. Make them readable and understandable for the intended audience.

[Image: example tag cloud]

In the meantime, I wonder how much indexing this stuff and generating tag clouds would help. If I’m going to have to rely on keyword analysis to understand them, we might as well make the keywords obvious, visible, and weighted.
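To make that concrete, here is a minimal sketch, in Python, of the kind of keyword weighting a tag cloud would be built on. The filename and the tiny stop-word list are placeholders I invented for illustration; a real indexer would use a proper tokenizer and a much fuller stop-word list.

```python
import re
from collections import Counter

# A tiny, illustrative stop-word list; a real indexer would use a fuller one.
STOP_WORDS = {
    "the", "a", "an", "and", "or", "of", "to", "in", "for", "shall",
    "be", "is", "are", "with", "this", "that", "will", "as", "on", "by",
}

def keyword_weights(text, top_n=25):
    """Count non-stop-word terms and scale counts to tag-cloud weights (1-5)."""
    words = re.findall(r"[a-z][a-z\-]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    top = counts.most_common(top_n)
    if not top:
        return []
    max_count = top[0][1]
    # Map each term's count onto a 1-5 display weight.
    return [(term, 1 + round(4 * count / max_count)) for term, count in top]

if __name__ == "__main__":
    # "announcement.txt" is a placeholder for a saved FedBizOpps announcement.
    with open("announcement.txt") as f:
        for term, weight in keyword_weights(f.read()):
            print(f"{term}: weight {weight}")
```

Even something this crude would surface the handful of terms the announcement actually hinges on, which is all most readers are hunting for anyway.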

• • •

Exhibition Opening

[Image: exhibition note card]

This blog really isn’t about photography but…

I have an exhibition opening at The Print Center in Philadelphia next month and can’t resist the urge to announce it here.

“Dream”
Feb 22 – May 5
The Print Center

Opening Reception
Feb 22, 2007 5:30-7:30 p.m.

• • •

A Market-Driven Vision for DISA

I attended the AFCEA DISA Luncheon Symposium in Washington, DC yesterday.

After the parading of the colors, a symbolic lunch that included black-eyed peas for good luck and collard greens for prosperity, an invocation that ended with a rather Old Testament call for our enemies’ destruction, and a series of odd one-clap introductions, the panel discussion began.

For the most part the discussion stayed close to existing public statements, so there were no real surprises. The familiar themes were all reemphasized: fast is better, adopt before build, lighter up-front requirements to speed delivery, and the security impact of the globalization of the software business.

There were a couple of moments that stimulated an audience reaction. 

First, Dave Mihelcic got some guffaws from the crowd when, while describing the purpose of the Federated Development and Certification Environment (FDCE), he stated that “ideally NECC will have a release per week.”

He didn’t use the term continuous beta, but he was clearly emphasizing the need to get feedback early from “sandbox” testing, with the idea that a sandbox on operational networks could readily evolve into an operational capability and would get real feedback from operators in the meantime.

Clearly at least some of the audience didn’t see weekly releases as even within the realm of possibility. I applaud him for putting that goal out there.

The second reaction came from Brig Gen Warner’s statement that “The day of the big system integrator is over” to a room full of, well, big system integrators.

It was a provocative statement, but in concert with Mihelcic’s statement about release cycles, Brig Gen Warner’s later comments about the time spent on requirements are probably more interesting.  He made the point that when the typical two years is consumed defining requirements, commanders in the field will go off-reservation and “hobby shop” a solution. Innovation always shorts to ground when a big enough need builds up.

I think that they should have gone on to make the point that less time on requirements only works in concert with very rapid releases, and the discipline and competencies to do them.

The problem today is that very infrequent system-wide integration, combined with the long tail of accreditation, certification, and testing, completely decouples the development of a capability from its real-world use. In this environment, reducing the time spent on requirements will only result in poorly defined software reaching the sandbox, without the relief of rapid follow-on releases to correct misunderstandings of those broadly defined requirements.

What is required, in my view, is an environment more like Amazon.com’s, where rapid evolution of requirements is combined with as-frequently-as-daily releases to in-situ, user-instrumented A/B tests and to production. Measurement-based product management combined with super-disciplined build, test, and release management is the combination that would permit NECC capabilities to be fielded every single day.
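To make the A/B piece concrete, here is a minimal sketch, in Python, of the deterministic bucketing that in-situ, user-instrumented tests typically rest on. The experiment name, variant labels, and operator IDs are all hypothetical, invented for illustration; nothing here reflects an actual DISA or Amazon implementation.

```python
import hashlib

def assign_variant(user_id, experiment, variants=("control", "treatment")):
    """Deterministically assign a user to an experiment variant.

    Hashing (experiment, user_id) means the same user always sees the same
    variant of a given experiment, with no server-side state to maintain,
    and each new experiment reshuffles the population independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

if __name__ == "__main__":
    # Hypothetical usage: instrument each release so an operator's session
    # records which variant of a capability they actually exercised.
    for uid in ("operator-17", "operator-42", "operator-99"):
        print(uid, assign_variant(uid, "map-overlay-redesign"))
```

The point is less the hashing trick than the discipline around it: every release carries its own instrumentation, so the measurements needed for product decisions arrive as a by-product of fielding.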

But I want to back up a bit… What was missing for me from the entire conversation was an exposition of DISA’s value proposition in market-defined language. Moving beyond “existence as mandate,” I was hoping for a discussion of “what business we plan to be in” and “what will make us great at it.” The discussion often revolved around what DISA is going to do and provide (e.g. NCES product areas), but not why its customers need it or want it.

If DISA were a commercial concern and the DoD were its market, how would it talk about its mission?

In November I attended Jeff Bezos’s talk “Web Scale Computing” at the Web 2.0 Summit in San Francisco. In it he talked about Amazon’s entrée into the infrastructure business. His discussion was completely focused on the customer need being solved and barely mentioned technology, other than to say that they were offering web services customers the same technology they use internally.

We know, for example, that EC2 is based on Xen virtualization, but he didn’t mention it once.

What he did say was very simple and powerful: historically, 70% of a startup company’s effort and money is spent on the “undifferentiated muck” of computing, networking, licensing, disaster recovery, and so on; and Amazon’s strategy will be “We make muck so you don’t have to.” He went on to say that he thought they would be good at running muck for customers because they run a lot of muck for themselves.

Walking out of Bezos’s presentation I knew I understood the customer pain that needed salve, what Amazon’s value was in healing it, and why Bezos thought Amazon would be good at it.

What is DISA’s equivalent pain / value / competency triplet?  It probably shouldn’t be all that different; after all, every DoD program deals with similar muck, plus the muck that goes with government processes for testing and accreditation. 

Let’s take a stab at it for the NECC/NCES domain space…

Programs that provide capability to NECC spend too much time and money defining and provisioning hardware, infrastructure software, and networks; re-creating base command and control capabilities such as mapping, imagery handling, and collaboration; and dealing with the complexity of testing and certification.  DoD programs, unlike most of their commercial counterparts, remain “vertically integrated” and often end up building the same base capabilities in incompatible ways.  The result is a multi-year gap between a recognized need and the belated delivery of obsolescent capability.

DISA will help the services rapidly field new capabilities by providing remote hosting and management of those capabilities while also providing the “base platforms” of mapping, imagery, collaboration, etc. Essentially we intend to be “the Google Maps-like mashup engine for C2, plus the data center for hosting the higher-order capabilities built on top of that platform.” In addition to developing that base platform we will leverage foundational enterprise integration capabilities to tie in commonly reusable data from Blue Force Tracker, ADSI / Link 16, NIMA, MIDB, etc. for ready re-use by all joint service capabilities.
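As a rough illustration of what “building on the base platform” might feel like to a service-specific program, here is a minimal sketch in Python. The endpoint URLs, the idea of a simple JSON track feed, and the field names are all hypothetical, invented for this post; they are not real DISA or NECC interfaces.

```python
import json
from urllib.request import urlopen

# Hypothetical platform endpoints; a real platform would define its own.
TRACK_FEED_URL = "https://platform.example.mil/tracks/blue-force.json"
MAP_TILE_URL = "https://platform.example.mil/tiles/{z}/{x}/{y}.png"

def fetch_tracks(url=TRACK_FEED_URL):
    """Pull the shared track feed once; callers overlay it on the base map."""
    with urlopen(url) as resp:
        return json.load(resp)

def tracks_in_box(tracks, min_lat, max_lat, min_lon, max_lon):
    """Filter the shared tracks down to a commander's area of interest."""
    return [
        t for t in tracks
        if min_lat <= t["lat"] <= max_lat and min_lon <= t["lon"] <= max_lon
    ]
```

The value proposition is in what is absent: the program writing this code never provisions a data center, negotiates a Link 16 interface, or builds its own map server; it consumes those as platform services and spends its effort on the mission-specific layer.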

Additionally, DISA will take on the role of community process owner and testing and certification facilitator to streamline the ability to build and deploy both these core capabilities AND the service-specific capabilities built on top of them, on a daily release schedule when required. Furthermore, we will develop those base capabilities as a DoD open source / open technology project to improve quality and to enhance the likelihood of adoption.

Finally, we will structure the processes and capabilities that we provide such that capabilities can be delivered early, often, and incrementally, to avoid the problem of a huge upfront requirements specification effort.

Maybe?

• • •

Congress in Second Life

[Image: Congress in Second Life]

Congressman George Miller held an in-world press conference in a virtual replica of the congressional chambers in Second Life yesterday. Afterwards he was interviewed on Rocketboom.

Unfortunately this first interaction was limited to an invite-only audience of reporters (many of whom must have picked up their newb avs at the door, given what is visible in the Rocketboom video). Keeping it invite-only is probably smart though; no politician is going to want to end up in the mainstream media having been assaulted by any number of possible creative and distasteful griefer appendages.

To make an experiment like this useful it ultimately will have to be opened up to permit forums between Congress and citizens, as a way for members of Congress to interact directly with remote constituents. However, working it out so that there is some hope of a civil dialogue will be difficult in a world where interactions are anonymous and often beyond social repercussion.

It’s fun to imagine having a conversation with your representative’s av though. “Hi what is your position on Intellectual Property rights?”… “No, I’m not talking about software patent law, I mean do you think they should kill the LibSL project after that whole copybot debacle?”

I have to say, though, I’m a bit skeptical about whether this will actually be useful. Typing via chat isn’t the easiest or best way to communicate when, more than just the message, you are evaluating the person. Also, it’s just difficult to have any kind of nuanced exchange. I can see messages getting dumbed down even further as politicos or their designated av operators pre-load their platform soundbites as gestures that can be called up on demand. Joystick campaigning.

Despite my doubts, Representative Miller should probably be applauded for being willing to try something new and maybe so should Clearlink and Sun who built/sponsored the replica congressional chambers. Although, it seems a bit sketchy to have the Congressional Chambers ACTUALLY owned by a private enterprise.

I had to laugh at this minor irony. One of the Clearlink team members listed as an owner of the land where the new congressional hall is situated is Atta Turk. Coincidence? Or can we too look forward to the day when we separate church, state, and ideology and become a secular nation? Hide that fez… 😉

• • •