Gary Reback: US Government Must Enforce Antitrust Laws to Encourage Innovation

On the need for government intervention in the free market to protect competition and encourage innovation

Gary Reback is one of the nation’s most prominent antitrust attorneys, best known for spearheading the efforts that led to the federal lawsuit against Microsoft. Gary spoke to an attentive and eager audience on May 14th in Santa Clara, CA. The Commonwealth Club and Yale Club of Silicon Valley sponsored his enlightening and provocative talk. Reback’s main message was that the government’s laissez-faire policies, so strongly promoted by University of Chicago economists, have gone way too far. As a counterweight, he says we need more government oversight of the private sector along with more vigilant antitrust enforcement.
To set the stage for the current recessionary economy, Gary began by chronicling the history of the U.S. antitrust movement. From its beginnings in the 1870s (a time when big business controlled the railroads), through Teddy and Franklin D. Roosevelt, Thurman Arnold, and others, there has been an ebb and flow of power and control between the federal government in Washington and big business (e.g. monopolies) or Wall Street investment firms. Starting about thirty years ago, conservatives forced an overhaul of competition policy that loosened business rules for everything from selling products to buying competitors. In the free-market era of the 1990s, big business and investment banks certainly had the upper hand. To a large extent, that is what caused the global financial meltdown and enabled companies like Microsoft, Intel, Cisco, Oracle, and Google to become so powerful. We were surprised to learn that expensive medical care and non-generic prescription drugs are the result of a lack of antitrust enforcement.
Mr. Reback firmly believes that in a high-tech world, U.S. government “hands off” policies actually slow innovation, hurt consumers, and entrench big companies at the expense of entrepreneurs. In particular, Gary calls for increased government scrutiny of high-tech firms’ monopolistic practices. He argues that monopolies have the power to raise prices by restricting output, supply, and competition. As a result, the economy weakens, unemployment increases, and innovation suffers.
We take it for granted now, but many of the advances in semiconductors and software were the result of a few landmark antitrust actions against big companies.
In the late 1950s, AT&T was forced by the U.S. government to license the transistor. William Shockley, one of the co-inventors of the transistor, licensed it from AT&T to form Shockley Labs, which later begat Fairchild Semiconductor, which in turn begat Intel, AMD, and National Semiconductor. The early years of the semiconductor industry in Santa Clara Valley (it was not called Silicon Valley until the mid-1970s) were therefore a direct result of the antitrust action against AT&T. For more on those early years, please see the article by this author:
In the early 1970s, under the threat of antitrust litigation, IBM was forced to unbundle software from hardware. This created a whole new independent software industry, which had not existed before. Software had previously been bundled with the mainframes and minicomputers made by the same computer manufacturer.
Merger enforcement is perhaps the biggest business issue of our time, according to Reback. What do we do with companies too big to fail? How about Citigroup, for example? If it had not been for the repeal of the Glass-Steagall Act in 1999, Citi would not have been able to acquire Smith Barney, Salomon Brothers, and other investment firms. Hence it would not have gotten too big to fail. Better to have the government carefully scrutinize mergers and acquisitions and/or break up large companies before they become too big to fail! One has to wonder if certain tech companies, like Oracle and Cisco, have become too dominant in their industries or even too big to fail because of acquisitions that occurred without antitrust scrutiny. For example, Oracle has done over 40 acquisitions since it acquired PeopleSoft and has now proposed to acquire Sun Microsystems.
Is Google the next Microsoft? Yes, in terms of its dominance over web search software, comparable to Microsoft’s control over desktop and notebook PC software. No, in at least two other important ways:
  1. Google created technology that people liked and it worked well. 
  2. The company was also more customer friendly with a more congenial corporate culture and image.
Author’s Note: the next battle between these software titans will be in the mobile OS market: the Android platform from Google vs. Windows Mobile from Microsoft.
Gary believes that Google’s big search competition will come from social networking sites (e.g. Twitter and Facebook), rather than from traditional search engines such as Yahoo’s or Microsoft’s. He also noted that the threat of antitrust action was enough for Google to call off its plans to put adverts on Yahoo’s search result pages.
Was the European Union’s (EU) huge fine against Intel Corp justified? Just one day before this talk, on May 13th, EU regulators slapped a record 1.06 billion euro ($1.45 billion) fine on Intel for antitrust violations and ordered it to halt illegal efforts to squeeze out arch-foe AMD. The fine was levied after an eight-year EU investigation of the company. "Intel has harmed millions of European consumers by deliberately acting to keep competitors out of the market for computer chips for many years," EU Competition Commissioner Neelie Kroes told a news conference.
Should Intel have known better than to engage in unfair trade practices? Most definitely yes, according to Gary. Intel was actually a U.S. government witness in the huge antitrust suit against Microsoft in 1998. The plaintiffs (the U.S. Department of Justice and 20 states) alleged that Microsoft abused monopoly power on Intel-based PCs in its handling of operating system and web browser sales. The issue central to the case was whether Microsoft was allowed to bundle its flagship Internet Explorer web browser software with its Microsoft Windows operating system. Bundling them together was alleged to have been responsible for Microsoft’s victory in the browser wars, especially over arch-rival Netscape (whose browser this author considered superior). Didn’t Intel learn anything from the trial and the verdict against Microsoft regarding unfair competitive practices? Gary’s response: "Of course, Intel says it is innocent of the charges and never broke the law, so perhaps the company will be exonerated after the EU Commission decision is reviewed by the European courts."
Opinion: We suggest the reader pose that question to an Intel executive or lawyer.
In summing up, Reback opined that “antitrust action failures” in the health care and banking industries have contributed to unreasonably high medical costs and the financial meltdown. Meanwhile, heightened scrutiny over acquisitions (e.g. Oracle’s) would result in a stronger U.S. economy by encouraging more competition and invigorating innovation and the start-up culture.
Bio: Gary Reback is one of the nation’s most prominent antitrust attorneys. He has been named one of the “100 Most Influential Lawyers” in America by the National Law Journal and is quoted regularly by major media. His book Free the Market! is a memoir of Reback’s titanic legal battles—involving top companies such as Apple, Microsoft, IBM, Oracle, and AT&T—and a persuasive argument for measured government intervention in the free market to foster competition. Gary is currently of counsel with Carr & Ferrell LLP. In this author’s opinion, he is a very friendly person and easy to get along with.

3G and LTE squeeze WiMAX- is the market window still open?


Many pundits have declared that the window of opportunity for WiMAX has closed. Squeezed between the enhanced capability of 3G technologies (e.g. HSPA/HSPA+ for GSM) and accelerated LTE roll-outs (notably Verizon Wireless), the claim is that WiMAX is DOA. We disagree! In particular, we believe there is a reasonable market for WiMAX fixed and nomadic/portable service in developing countries. We also see possibilities for mobile WiMAX in Korea, Taiwan, Japan, Malaysia, Russia, and other markets (but not necessarily in the U.S. or Western Europe).

For more details on WiMAX in emerging markets, please refer to:

WiMAX Continues to Make Progress in Developing Countries

When used for either nomadic/portable or true mobile service, both LTE and WiMAX devices will need roaming and handoff with either HSPA/HSPA+ or EVDO rev xyz. That is because those 3G networks will still be the predominant way users access the Internet, especially during the early days of LTE/WiMAX deployment.

While LTE is thought of as a mobile technology, it will also be used for BWA (e.g. CenturyTel and VZW plan to use LTE for rural BWA). Similarly, IEEE 802.16e compliant WiMAX can be used to deliver both fixed/nomadic and mobile services, if permitted by the regulator in the country where service is offered.

The LTE Express- has it really accelerated?

In the U.S., VZW’s aggressive LTE roll out plans have put pressure on Clearwire, which asserts that its mobile WiMAX network is superior to the LTE service Verizon Wireless will soon launch. Once projected to reach 100 million subscribers by the end of 2008, the new Clearwire joint venture is commercially available in just two metropolitan areas – Baltimore, MD and Portland, OR. What about the 7 other cities that were to be operational by end of 2009?

Clearwire plans to provide more details about its WiMAX deployment strategy on March 5, when it announces its financial results for the fourth quarter of 2008. Those details may include dates for commercial availability of mobile WiMAX service in Chicago, Washington, Boston and Dallas-Fort Worth, Texas, possibly very soon. Sprint Nextel’s WiMAX division was already building networks in those cities before the joint venture with the original Clearwire was completed in December. Clearwire is also working on converting its more than 40 pre-WiMAX networks to true, standardized WiMAX over time. No plans have been announced yet for VoIP over WiMAX, which works against the WiMAX smartphones Sprint has announced for its mobile WiMAX MVNO unit. For more information see:

Clearwire readying WiMax game plan as rival LTE gains steam

Robert Syputa of Maravedis disagrees with all the hype about VZW stepping up its LTE deployment.

"Verizon has not recently accelerated the roll-out of LTE in their 700 MHz spectrum. If anything, recent announcements, including those at Barcelona (MWC), amount to a 2-3 month push-out from previous statements that they would launch commercial networks by the end of 2009.

We have held that Verizon was posturing in their earlier announcements because suppliers could not be ready for commercial-scale deployment. What’s more, there has been no chance that many devices would be available, and too little time to do conformance and compatibility testing among vendors.

Verizon is pushing their own requirements, which presage official LTE standard conformance and compatibility. This can be seen as similar to the way Sprint pushed the supply ecosystem, including running their own test labs outside of those established by the WiMAX Forum. But this is jumping the gun; the LTE standard has yet to be published, and chips, devices, and network equipment are at an early stage of commercial maturity.

This jousting of PR about availability should be evaluated in the context of what Verizon and other firms are attempting to achieve: Verizon has achieved the market position and PR image of being among the world’s leading networks. That contributes to their ability to hold onto and gain subscribers. Meanwhile, Sprint has succumbed to problems stemming from the conversion of iDEN, upgrades to their 3G network, and service problems. Even though they can now claim high third-party service reliability ratings, their image continues to suffer. Combined with the pull of new phone and service offerings from Verizon and AT&T, including the iPhone, Google Android, and expanded push-to-talk, Sprint has continued to lose market share.

3G operators like Verizon can continue to build out higher-density 3.5G HSPA and EVDO networks, but added capacity comes at an escalating price tag. Both WiMAX and LTE next-generation networks are, according to AT&T’s network director (from the competing camp), 1/4 to 1/2 the cost of delivering similar capacity on an advanced 3G network. Despite the higher cost, they say they will continue to put most of their capital behind 3G over the next 2-3 years. The reason they don’t switch to LTE or WiMAX for the bulk of deployments is that LTE is at least 2-3 years away from being a mature ecosystem, and it will require multi-mode devices, or a transition to new devices, in order to migrate the customer base. The issue is hardly as simple as which technology works best.

What works best for Verizon is holding onto the image of being the leading network and not cannibalizing their fat 3G revenues more than is necessary, until the market pressures them to do so. Eventually the market will press on for ever-higher bandwidths and combined services, driving operators to adopt 4G. WiMAX and LTE are at a ‘pre-4G’ stage of evolution: the systems and device evolution, the needed (disruptive) re-farming of spectrum, and marketplace demands are building toward widespread adoption, but it is still years away.

Verizon’s pursuit of LTE is pressing on, but for a few years it will be more about holding onto its image and 3G customers than about carving out revenue on a comparable scale.

A major advantage of 700 MHz is that they can deploy thinly to achieve broad coverage. They will leverage that, but it is not a panacea."

Opinion: One thing I’ve learned in over 38 years in the telecom industry is that a new network infrastructure, especially a new high-speed wireless network, takes much more time to become fully operational than anyone thinks. Once the infrastructure is in place, several levels of interoperability testing are required, along with provisioning systems, monitoring, OSS, and back-end billing/accounting.

Will 3G Improvements Kill WiMAX?

The Ericsson view of comparisons between 3G/HSPA and mobile WiMAX was outlined in a white paper released last month (January 2009):

"While the peak data rates, spectral efficiency and network architecture of HSPA Evolution and Mobile WiMAX are similar, HSPA offers better coverage. In short, Mobile WiMAX does not offer any technology advantage over HSPA. What is more, HSPA is a proven mobile broadband technology deployed in more than 100 commercial networks… [and] can be built out using existing GSM radio network sites and is a software upgrade of installed W-CDMA networks. Compared with other alternatives, HSPA is the clear and undisputed choice for mobile broadband services."…

But there’s a contrary point of view that favors WiMAX performance over 3G. Many think that HSPA/GSM 3G will be overloaded as more mobile users access the Internet, upload photos and videos, and watch streaming video on their devices. Essentially, 3G is a TDM voice network with a data overlay, while WiMAX is a flat (non-hierarchical) IP-only network.

Compared with EVDO/CDMA 3G, WiMAX avoids expensive royalty payments to Qualcomm, which owns most of the CDMA intellectual property. Still, building a ubiquitous WiMAX network would be far more expensive than buying wholesale access to 3G with a Mobile Virtual Network Operator (MVNO) agreement. But there is also the possibility of being a WiMAX-based MVNO. That is exactly what Sprint plans to do: using Clearwire’s mobile WiMAX network and supplying its own multi-mode (CDMA/WiMAX/WiFi) mobile phones that will operate on the CLEAR network.

For more details, please refer to:

Sprint may sell tri-mode phone in 2010 that will include VoIP over WiMAX


We do not believe the market window is closed for WiMAX. The technology works, is available now, and can offer download speeds of 2-4 Mb/s per user (depending on cell size and the number of users per base station). However, we continue to believe the most lucrative market for WiMAX will be fixed/nomadic services in developing countries. While most WiMAX (IEEE 802.16e compliant) deployments will actually be used for fixed BWA, the same network can also support mobile BWA in 2.3, 2.5, or 3.5 GHz spectrum. That’s a key advantage for network operators that want to deploy a combination of fixed/nomadic and mobile services to subscribers.
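The dependence of per-user speed on loading is simple to illustrate. As a back-of-the-envelope sketch (the 20 Mb/s sector capacity figure below is a hypothetical assumption for illustration, not a vendor specification), evenly splitting a sector's usable downlink among active users reproduces the 2-4 Mb/s range cited above:

```python
# Rough per-user downlink estimate for a shared WiMAX sector.
# The sector capacity figure is an illustrative assumption only.

def per_user_mbps(sector_capacity_mbps, active_users):
    """Evenly split a sector's usable downlink among simultaneously active users."""
    return sector_capacity_mbps / active_users

# Assume ~20 Mb/s of usable downlink per sector (hypothetical).
sector_capacity = 20.0

for users in (5, 10):
    print(users, "users ->", per_user_mbps(sector_capacity, users), "Mb/s each")
# 5 users -> 4.0 Mb/s each; 10 users -> 2.0 Mb/s each
```

Real networks schedule capacity unevenly (modulation varies with distance from the base station), so this even split is only a first-order approximation.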

You need to segment the market for WiMAX. It is definitely the answer for fixed/nomadic broadband wireless access in developing countries and rural areas. It is also a success in South Korea (WiBro) as a fixed/mobile technology.

It may or may not succeed as a globally ubiquitous mobile wireless technology. The places to evaluate that are Japan, Taiwan, India, and other Asian countries. I do not think people should be so hung up on whether or not Clearwire succeeds in the U.S. Think global, especially Asia, for mobile WiMAX. Once WiMAX netbooks, MIDs, other CPE, and smartphones are available, the outlook for mobile WiMAX will improve. But that will happen if and only if the regulator in the country permits mobile service at the licensed frequencies, the operator builds out the mobile network, and roaming agreements are implemented with both mobile WiMAX and 3G carriers. That remains to be seen.

There is another market segment where WiMAX has huge potential: backhaul (for WiFi hot spots, video surveillance cameras, etc.) and wireless backbone for campus/private networks. This is a dark-horse growth area, in my opinion!

Addendum: In-Stat: 30% of subscribers will be 3G or 4G by 2013

In-Stat says that 30 percent of subscribers worldwide will be using some form of 3G or 4G cellular technology by the end of 2013. With mobile WiMAX needing to prove itself in the market as LTE deployments expand, In-Stat predicts that WiMAX networks will find favor in developing countries. In addition to expected LTE deployments in the United States and other developed markets, the research firm predicts that numerous vendors will pick up on the mobile WiMAX trend in emerging markets.

ACLU Northern CA: Cloud Computing- Storm Warning for Privacy?

Summary: In a February 10th presentation at SCU, ACLU-Northern CA Technology and Civil Liberties Policy Director Nicole Ozer warned that cloud computing could compromise the privacy rights of its users. The problem is that since information stored in "the cloud" is not in your office or data center, it may not be considered your private property or an extension of your filing cabinet. Once this information is located in one or more databases "in the cloud," it may be accessed and used in ways that individuals never envisioned or intended, and with little oversight. Governments can dip into this treasure trove with a subpoena; companies can mine this information to build profiles, deliver targeted advertising, and share it with others. And with the lengthy data retention periods and ineffective deletion procedures of many companies, individuals may find it very difficult to remove their data once it is uploaded.

In particular, the state or federal government could issue a subpoena that would force the cloud computing provider to turn over its records on your computer usage. Such subpoenas have no judicial oversight, meaning that your privacy rights would be compromised and you would be denied due process!

Background: Many companies are interested in cloud computing as a potential solution to computer and storage capacity constraints. The idea is an extension of a virtualized data center, where the cloud could potentially be an "overflow data center." In other words, computing capacity would expand during periods of high demand by using the virtual compute servers in the cloud. The major advantage here is that if the cloud can extend your data center, then you don’t need to build another one, or increase the capacity of the one you have, just to handle intermittent spikes in computing demand.
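The "overflow data center" idea amounts to a simple placement rule: run work on in-house servers until local capacity is exhausted, then burst the remainder to the cloud. A minimal sketch (the function, job sizes, and capacity figure are all hypothetical illustrations, not any vendor's API):

```python
# Toy "cloud bursting" placement rule: prefer local capacity, overflow to cloud.
# Job demands and the local capacity figure are illustrative assumptions.

def place_jobs(jobs, local_capacity):
    """Assign each job's compute demand locally while capacity remains;
    jobs that would exceed local capacity are sent to the cloud."""
    placements = []
    used = 0
    for demand in jobs:
        if used + demand <= local_capacity:
            placements.append("local")
            used += demand
        else:
            placements.append("cloud")
    return placements

# During a demand spike, only the overflow leaves the building.
print(place_jobs([30, 40, 50, 20], local_capacity=100))
# -> ['local', 'local', 'cloud', 'local']
```

Note that under this rule the 50-unit job bursts to the cloud while the later 20-unit job still fits locally: which data leaves your premises depends on scheduling details, which is exactly why the privacy questions below matter.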

We have previously written about cloud computing at Viodi View:

Cloud Computing Issues: State of the Net West Conference – August 6, 2008, Santa Clara, CA

The Privacy Problem:  The legal precedents being set around the U.S. are potentially devastating for enterprise adoption of cloud computing. The executive branch is repeatedly taking the position that data stored in the cloud does not have the same assumptions of privacy and due process as does data stored in your own infrastructure. The very fact that you put the data "out there" somehow strips any "expectation of privacy" which is a key criterion for the level of due process protection (based on my limited understanding of law).

A recent decision by the Sixth Circuit Court of Appeals (Warshak v. United States) wrestled with exactly this question of the "expectation of privacy" in remotely stored data.

For more on this reference case, please refer to:

Key Question: Can the state or federal government issue a subpoena to access information you have stored with an on-line backup storage facility? Is privacy for on line storage covered under any law?

It turns out, that it is not an easy question to answer:

Looking at online data, the first question is whether the Fourth Amendment (4A) requires a search warrant to access that data. This depends on whether the record is treated as "in storage" (in which case 4A does apply and a warrant is needed) or as a "business record" (in which case 4A doesn’t apply and no warrant is constitutionally required).

There have been too few decisions on the topic of cloud computing to answer with any certainty at all.  However, the more a site/service looks like a "storage facility" – a site designed solely for online storage – the better the argument for constitutional protection.   Conversely, if the site uses your content for various purposes (e.g., advertisements or recommendations) and asserts some ownership over data about or generated by users, the constitutional argument is weaker.

Regarding statutory law, the primary federal law is ECPA (the Electronic Communications Privacy Act), which applies (with different standards) to both communications in transit and stored communications. ECPA’s application to cloud computing is equally murky, but the same rough spectrum likely applies: the more user control, the greater the protection; the more the site controls or uses the information, the weaker the argument for protection. It’s worth noting that ECPA as written assumes that 4A does not cover cloud computing in many forms, as it prescribes much weaker protections than the Constitution would demand for web email in particular. The courts haven’t really addressed that assumption much, but one recent court held that 4A does apply to online email, before the decision was vacated on unrelated grounds.



Back to the Future – Software as a Service & Managed Services

Software as a Service (SaaS) is a popular Web 2.0 buzzword. It struck me at NTCA’s panel, Software as a Service: Creating New Revenue Channels, that Independent Telcos have been providing a sort of Software as a Service, really a Managed Service, since their inception. Independent Telcos are starting to add SaaS and managed applications, such as network computing, disaster recovery services, and telemedicine, to their first SaaS application: POTS.

panelists at NTCA 2009 Annual Meeting

Warren Lee, President and CEO of NeoNova Network Services, distinguished between SaaS and Managed Services. He echoed his talk at last year’s IP Possibilities regarding the importance of understanding what your goals are before you commit to a plan. Understanding your goals will help answer whether you should provide SaaS or Managed Services. The reason he said it matters is that if you provide SaaS, then you open yourself to competition from anywhere. If you offer managed services, you can differentiate on the local service you provide (e.g. hard to do a truck roll from China).

He compared the differences between traditional compute solutions versus one that lives entirely in the cloud. Warren called this type of computing, “Living in the Cloud,” as opposed to Cloud Computing. NeoNova is using this NTCA Annual Meeting & Convention to introduce its Network Computer and associated applications that live on a Telco’s servers and utilize its broadband; more on that in another article.

Susan DiFlorio, COO of FiberCloud, focused on small business applications that independent Telcos can provide in order to increase market size, increase broadband usage and effectively lower staff costs. She called outsourcing an opportunity for Independent Telcos, as businesses are looking to cut costs and simplify the complex. FiberCloud provides data center services to Independent Telcos.

Along these lines, FiberCloud has been beta testing, with five rural telcos, cloud computing applications from Microsoft (see this video interview with George Henny for additional background). They found that they were trivializing the markets and didn’t understand the nuances of the differences in locales and customers. They engaged an external marketing firm to help them understand the needs of the customers and, as a result, they found some commonality between rural and suburban markets; for instance, most of the Telcos’ potential business customers have fewer than 99 employees.

Donald Fendrick, CTO, National Wireline Accounts, Alcatel-Lucent, spoke of Telemedicine applications where monitoring devices transmit real-time medical information via broadband. The interesting thing about this approach is the potential detection or correlation of trends much sooner than the traditional occasional visit to the doctor. Fendrick suggested that some insurance companies will give co-payments for the Telemedicine Devices.

John Granger, President of Mapcom, suggested that Telcos could add visual locating services as features in hosted dispatch, network management, and hosted security applications. Granger said they have extended their mapping platform to allow independent Telcos to offer this sort of value-add.

Jeff Wick of NexTech, who was in the audience, put an exclamation point on the panel by explaining how they are replacing complete computing infrastructures for small businesses. They provide all of the workstations and manage them for a monthly fee. He said it was important to have a back-end system to make sure it all works. Some of the server infrastructure is hosted and some is on the businesses’ sites. The important point he made is that this service is easy to sell, because it has a low cost of entry for small businesses.

A Broadband Stimulus – A Chance for Something to Pass in Short Order?

Yesterday’s Wall Street Journal had a good article on the prospects for a near-term broadband stimulus plan. The article referenced a Free Press proposal for a 3 year, $44 Billion broadband stimulus plan. This plan just might have a chance, as it has elements that equipment vendors could embrace and incumbent carriers would like. For instance, the plan ties closely to OPASTCO’s point that there is an intrinsic link between video and broadband deployment.

Like the rest of the proposed economic stimulus, it is something that the advocates are trying to push through in short order. This might actually happen, because one of the incentives they are advocating is the creation of a Broadband Infrastructure Fund, as recommended to the FCC by the Federal-State Joint Board on Universal Service, to provide direct grants of $15 Billion (broadband fund) and $5 Billion (mobility fund).  These funds would be administered by NTIA/USAC, so, potentially, a new bureaucracy would not have to be created (just expanded).

Free Press is also advocating other programs to encourage broadband deployment, which include accelerated depreciation and tax credits. They are also suggesting the “lifeline” programs be extended to broadband, which could mean things like $10 per month subsidies for low-income homes and subsidies for devices that access the Internet.

Their plan is heavily oriented to solving the digital divide between rural and urban areas. They suggest a minimum of 5 Mb/s downstream/upstream speeds (specifically to ensure transmission of HD video) with incentives to ensure the network can evolve to 100 Mb/s. In general, they are advocating FTTH, as it drives more short-term job creation while providing a better infrastructure for driving long-term economic growth and creating “desired social outcomes.”

It will be an interesting first quarter of 2009.   

TIE Wireless SIG Panel: Mobility 2009: Setting the Stage for a New Order?


The TIE Wireless Special Interest Group (SIG) held its annual review and outlook meeting on December 11th in Santa Clara. This article highlights key points, observations, and takeaways from the panel session.

2008 marked a milestone year for the mobile industry worldwide, with the launch of the Apple App Store, Google’s Android, the mainstreaming of consumer smartphones, and a global recession. The key trends and shifts observed in 2008 were reviewed, and several predictions were made for what might be hot in 2009.

Session Moderator: Tim Chang, Norwest Venture Partners


  • Peter Barry, Head of Venture Capital and Start-ups, Vodafone
  • Jay Boddu, Kauffman Venture Fellow, Sofinnova Ventures
  • Nagesh Challa, Founder, Director, President & CEO, Ecrio
  • Rob Trice, Senior Managing Director, SK Telecom Ventures


According to Chetan Sharma, Technology & Strategy Consultant, “the US wireless data market shrugged off the economic doldrums in Q3 2008 and grew 7.3% Q/Q and 37.5% from Q307 to reach $8.8B in data services revenues. The total for the year (for first 9 months) stands at $24.5B which is equal to the revenues generated in 2007 (full year)." But things don’t look so rosy for the 4th quarter and for most or all of 2009, according to the panelists.

Mobile Industry Trends

  • Success of iPhone, touch screens, and App stores. Apple now going after enterprise customers for the iPhone.
  • Global Financial Crisis had negative impact on the sector in 4th quarter of 2008.
  • Open source movement: Android/Open Handset Alliance, Symbian, Taiwan Ministry of Economic Affairs Moblin initiative, etc
  • Wireless industry progress and momentum outside of Japan and Korea, e.g. U.S. movement to LBS’s and 3G with promise of LTE in the future.
  • Data centric mobile platforms and ecosystems are being built for 3G broadband mobile technologies.
  • Mobile broadband growth has shifted from developed to developing markets. This will be an enabler for open platforms and smart devices.
  • Energy savings as a cost driver, e.g. Nokia’s new innovations to reduce energy cost and carbon footprint.
  • Enterprise sales of mobile devices grew 40 to 50% in 2008. Data plans for corporate cell phones, downloaded apps, management and control are attributes of this market segment.
  • Location is starting to be combined with other mobile applications and services.

What didn’t happen in 2008

  • IMS based services for multi-media streaming
  • Mobile advertising didn’t grow as fast as expected
  • Network operator monetization of new content has proved difficult
  • WiMAX did not take off as expected
  • Mobile web 2.0 did not happen as expected

Wireless environment for VCs and Wireless Start-ups

Tim Chang of Norwest Venture Partners opined that "wireless has become a graveyard for venture capital investors," probably because VCs have been unable to cash out of their investments, with no exit strategy forthcoming. One panelist cited a $45M average exit price for wireless start-ups, a far cry from the dot-com boom years. So VCs need to be very cautious. They must carefully consider which wireless start-ups can generate a reasonable ROI and be capital-efficient enough to survive on a $3M or $4M burn rate for the next 12 or 18 months.
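The survival arithmetic behind that caution is simple: runway is cash on hand divided by burn. A quick sketch (treating the cited $3-4M burn rate as an annual figure, which is an assumption on my part; the cash balances are hypothetical):

```python
# Runway estimate: months a start-up's cash lasts at a given annual burn rate.
# Burn figures follow the panel's cited range ($3-4M); cash balances are hypothetical.

def runway_months(cash_millions, annual_burn_millions):
    """Months of operation before cash runs out, at a constant burn rate."""
    return 12 * cash_millions / annual_burn_millions

# $4.5M in the bank at a $3M/year burn gives 18 months of runway.
print(round(runway_months(4.5, 3.0)))  # -> 18
# $4M in the bank at a $4M/year burn gives only 12 months.
print(round(runway_months(4.0, 4.0)))  # -> 12
```

Against a $45M average exit, those 12-18 month windows explain why the panelists favored capital-efficient companies over capital-hungry network plays.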

An opportunity and challenge for VCs is to choose the right technology/apps for a given country or geographic region. This is because the wireless market is geographically fragmented in terms of how people use data plans and which applications/services are in demand (e.g. on-line gaming is popular in Asia but not in the U.S.). Further, it’s very difficult to get an application to work over the variety of wireless network technologies deployed in different countries.

Tim stated that "channel is everything for a mobile company." In other words, a successful wireless start-up must line up sales and distribution channels for its mobile products while they are still in development.

2009 Focus Areas for start-ups and VCs

Start-ups should address users’ pain points, do more with less, get better efficiencies, and avoid bleeding-edge (new or uncertain) technologies. For example, SK Telecom is avoiding technology companies pursuing white spaces, as they think there are too many uncertainties and unknowns there.

  • VCs will sacrifice top line growth for companies that can reduce costs and fix pain points, e.g. billing.
  • Vodafone is looking for companies with "differentiated capabilities for value added Internet services." They are also interested in mobile device innovation, especially screen usability and next-gen human interfaces.
  • Sofinnova Ventures sees opportunities in LTE for 4G "all IP" networks. Incumbent providers chose LTE (over mobile WiMAX or other BWA technology) to gain continuity and a graceful evolution from their 3G networks.
  • Norwest Ventures thinks WiMAX missed its market window which is now closed. While LTE is quite promising, they think that the Chinese companies may ultimately dominate the device and equipment markets.
  • Android in China will be a huge success. China Mobile is planning to give away Android phones sometime in 2009, according to Tim Chang of Norwest.

Postscript: I was not at all surprised to hear nothing positive about WiMAX and only a little about LTE at this VC panel session. But not even unlicensed broadband wireless Internet access for white spaces was thought to be a good investment.

Why? First, recognize that in today’s very harsh economic environment, with no VC-backed tech IPOs in over a year, VCs must avoid uncertain or unproven technologies, especially for new network infrastructure. They are emphasizing companies that can deliver cost improvements and problem resolution for their customers over the existing mobile networks.

Do you agree or disagree with this approach?

Convergence – More Business than Technical Challenge for Smaller Operators

The challenges that small telcos face regarding the convergence of services onto a single platform were the subject of a panel featuring industry experts at the 2008 TelcoTV Conference. The overarching theme was that the challenges surrounding convergence are more about business issues than questions of technology.

Steve Pastorkovich of OPASTCO suggested that in order for independent telcos to have a powerful three-screen strategy, they need better and more affordable access to video content.

Warren Lee of NeoNova said, “Convergence should be the single greatest focus driving all of your business planning and decisions.” He went on to say that it is important for a telco to know what it is: a pipe provider, a full-service provider, or some sort of hybrid between the two. He talked about the importance of telcos being able to easily add, and generate revenue from, multiple broadband services, such as Rhapsody, home security services, etc.

Lee stressed the importance of business-management software to track things such as service profitability, customer support and return on investment, and recommended that telcos adopt new revenue-modeling tools. He pointed out that planning has to transcend functions within a telco, so that there is cross-pollination between budgets.

Yue Chen, Director of Systems Engineering for Juniper Networks, echoed Lee by suggesting that it is important for telcos to align their network with their business.

Donovan Prostrollo of Calix said that it is necessary for telcos to think like the consumer in order to adapt to changes in consumer behavior, competition, technology and regulation. Prostrollo suggested that convergence is about a connected life.

IEEE ComSoc-SCV Workshop: Location Based Technologies and Services

Summary of Location Based Technologies and Services Workshop

[June 19, 2008, Crowne Plaza Hotel, San Francisco International Airport] 

Alan J. Weissberger
IEEE ComSoc- SCV Secretary and Program Chair
See also: the Yankee Group tele-briefing report on Location Based Services and Technologies.

Speaker Remarks
1. Dave Reid, Director of Business Development, SiRF Technology Inc.
The world is on the go (which implies that mobile telecom services and devices will grow rapidly). SiRF believes that location awareness brings convenience to our lives. SiRF is predominantly a fabless semiconductor company, with the largest market share in discrete GPS chips and related intellectual property. SiRF-powered mobile devices include personal navigation devices (PNDs), handheld GPS receivers, smart phones, feature phones, personal media players (PMPs), and in-dash car navigation systems.
There are many types of Location Based Services (LBS’s) being deployed and being considered by network operators: navigation, social networking, location based advertising, mobile commerce, transportation, child locator, pet tracker, etc. New mobile broadband networks, like WiMAX, will be location enabled; so will new devices, including Mobile Internet Devices (MIDs) and even location aware watches. Applications and content are intersecting and this will lead to innovative new mobile services with location awareness. Enterprise customers have led applications in location for a long time, but the consumer market for LBS could now be poised for faster growth.
Verizon Navigator (offered by VZ Wireless) is the most popular LBS and most successful navigation service in the world (5M subs). VZ Navigator offers audible turn-by-turn directions for $10 per month.
LBS’s (mostly navigation) will continue to command a pricing premium over other wireless add-on services, e.g. music, ring tone, games.   In the future, LBS will be a key revenue generator for network operators. Nokia announced they would have location awareness in all their devices (Nokia uses TI processors). 
Location Based Technologies: While GPS is only one of several location-based technologies (others include cell site location, broadcast TV signals, WiFi AP locations, RF signatures- see graphic below), its accuracy is better than the others. Assisted GPS may be used to enhance performance when signal propagation conditions are poor (e.g. when surrounded by tall buildings or when the satellite signals are weakened by being indoors or under trees). In pure GPS location tracking, it typically takes 30 or 40 seconds for a GPS device to compute a location if it does not have recent ephemeris data for the GPS satellite network. Otherwise, locations are computed once a second or faster. 
Skyhook Wireless creates a database of WiFi Access Points (APs) as the basis of its WiFi Positioning System. It uses the native IEEE 802.11 radio (already on mobile devices) to deliver accurate positioning worldwide.
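Conceptually, an AP-database positioning system of this kind can be sketched as a weighted centroid over the surveyed locations of the APs a device can hear, with stronger-signal (presumably nearer) APs pulling the estimate harder. The sketch below is purely illustrative, not any vendor's actual algorithm; the AP coordinates, the scan data, and the RSSI-to-weight mapping are all invented for the example.

```python
# Illustrative WiFi positioning via an RSSI-weighted centroid.
# Not a real vendor algorithm; AP coordinates and the RSSI-to-weight
# mapping below are assumptions made for demonstration only.

def wifi_position(scan, ap_db):
    """Estimate (lat, lon) from a WiFi scan.

    scan:  list of (bssid, rssi_dbm) pairs seen by the 802.11 radio
    ap_db: dict mapping bssid -> (lat, lon) of surveyed access points
    """
    total_w = 0.0
    lat = lon = 0.0
    for bssid, rssi in scan:
        if bssid not in ap_db:
            continue                    # AP not in the survey database: ignore it
        w = 10 ** (rssi / 10.0)         # crude weight: dBm converted to linear power
        ap_lat, ap_lon = ap_db[bssid]
        lat += w * ap_lat
        lon += w * ap_lon
        total_w += w
    if total_w == 0:
        return None                     # no surveyed APs in range
    return (lat / total_w, lon / total_w)

# Hypothetical survey database and scan:
ap_db = {"aa:bb": (37.615, -122.390), "cc:dd": (37.617, -122.388)}
scan = [("aa:bb", -40), ("cc:dd", -70), ("ee:ff", -50)]
print(wifi_position(scan, ap_db))
```

In this toy scan, the much stronger first AP dominates the weighting, so the estimate lands essentially at its surveyed coordinates; a production system would also filter stale survey data and moving APs.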
Dave Reid was kind enough to provide this chart of Location Tracking Technologies:
 Location Tracking Technologies
RSSI = Received Signal Strength Indicator
TDOA = Time Difference of Arrival
Cell ID assumes the location is at the midpoint of the cell; this can be quite inaccurate if the person is at the cell edge or on the border of an adjacent cell.
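To make the chart's TDOA entry concrete, the sketch below recovers a 2-D position from time differences of arrival at four towers. The tower coordinates, the handset position, and the brute-force grid search are all invented for illustration; real systems solve this with least-squares estimators, not a grid search.

```python
import math

# Toy 2-D TDOA (Time Difference of Arrival) position solver.
# Tower coordinates and the "true" handset position are invented;
# the grid search stands in for a proper least-squares solver.

C = 299_792_458.0  # speed of light, m/s

towers = [(0.0, 0.0), (10_000.0, 0.0), (0.0, 10_000.0), (10_000.0, 10_000.0)]
true_pos = (3_000.0, 4_000.0)  # unknown in practice; used here to fake measurements

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# TDOAs measured relative to tower 0 (seconds); here derived from the truth.
tdoas = [(dist(true_pos, t) - dist(true_pos, towers[0])) / C for t in towers[1:]]

def solve(tdoas, towers, step=100.0, extent=10_000.0):
    """Return the grid point whose predicted TDOAs best match the measurements."""
    best, best_err = None, float("inf")
    n = int(extent / step) + 1
    for i in range(n):
        for j in range(n):
            p = (j * step, i * step)
            err = sum(
                ((dist(p, t) - dist(p, towers[0])) / C - m) ** 2
                for t, m in zip(towers[1:], tdoas)
            )
            if err < best_err:
                best, best_err = p, err
    return best

print(solve(tdoas, towers))
```

Each TDOA measurement constrains the handset to a hyperbola between two towers; with enough towers the hyperbolas intersect at a unique point, which is why the grid search recovers the assumed handset position here.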
SiRF has proposed an LBS systems architecture. They have an ecosystem in place to develop, test and market location based applications. SiRF provides end-to-end solutions and has engaged in partnerships with various companies.

2. Jon Metzler, Director of Strategic Initiatives, Rosum Corp.
Location determination capability is becoming a "table stakes" requirement for device makers and semiconductor companies. LBS’s should be considered as a utility – like electricity that can be turned on and off. 
Rosum is the first and only company to harness over the air, broadcast TV signals for position location. The key advantage of this approach is that TV frequencies were designed to penetrate walls, ceilings and trees, in order to deliver a good video signal indoors. The company was founded by original GPS architects to deliver always-on location awareness where GPS fails – indoors and in urban canyons. Rosum is a provider of location, timing and frequency calibration solutions for Mobile TV Device and Home Telecommunications markets. In particular:
  • Mobile TV Devices: cell phones, notebook PCs, and PND/PMPs equipped with TV tuners
  • Home Telecommunications: femto cells for the home, and E911 (E112) for Wireless and VoIP subscribers
  • Among recent milestones for the company:
    • Rosum Announces Successful DVB-H Positioning Trial with UK’s National Grid Wireless (6/25/08)
    • 2Wire Selects Rosum TV+GPS Location and Timing Solution for E911/ Home Telecom products using femtocells (3/31/08)
    • Rosum Signs Collaboration Agreement with Intel – Will Enable TV-Location on Mobile Devices (10/07)
But why use Broadcast TV signals for position location? 
The TV signals offer high power (1 MW ERP typical), low frequencies (50-750 MHz), frequency diversity (wide 6 to 8 MHz channels, multiple channels per tower), and horizontal signals (less attenuation from roofs and walls). Moreover, the terrestrial TV infrastructure is highly correlated with population density and broadband penetration in the U.S. In a head-to-head test of TV positioning vs. GPS-based location tracking, GPS failed at three of six indoor locations in the SF Bay Area.
Editor’s Note: GPS vendors, such as SiRF and others, would likely question those test results. However, Rosum uses third-party testing in order to address the concerns of competing technology vendors.
The best of both worlds might be a hybrid approach – where GPS and TV based positioning are combined in one device. In that case, GPS would be used outdoors, while TV positioning would be used indoors and in canyons (where GPS often fails).
The location technology and device market is consolidating, with many mergers and acquisition of key players, e.g. Nokia acquiring mapmaker Navteq. Other market themes of note:
  • Online mapping arms race between Google, Microsoft, Yahoo
  • Combination Personal Navigation Device / Portable Media Players (PND / PMPs)
  • Convergence of PNDs and Communications devices (i.e., cell phones) 
Two popular hand held devices with LBS and positioning technology:
  • Blackberry with Google Maps and GPS positioning
  • Apple iPod Touch with Google Maps and 802.11 (WiFi) based positioning
What Comes Next for LBS’s?
  • Connected (not silo’d) use of location information with two categories foreseen:
    • Groups: self-chosen affiliations, such as Social Networks
    • Swarms: (anonymous) use of location for ITS enhancements
  • Resolution of privacy issues (TBD)
  • Growth in new LBS’s such as: Social Networks, Intelligent Transportation Systems (ITS), Connected Navigation, and Local Search/ Advertising (Google and Yahoo)
 Panel Session
The author chaired a panel session with the two speakers. It consisted of a few pre-planned questions for discussion, audience Q&A, and a wrap-up question about the nature of future devices for LBS’s (cell phones, iPods, other gadgets, or Mobile Internet Devices = MIDs). The panelists agreed that the big software companies (including Microsoft, Yahoo, Google, Oracle) all had LBS initiatives underway. They also believed that the smart phone (cell phone + Internet + LB technology) would dominate the LBS market, especially over non-voice-capable MIDs.
Jon later amended his panel session remarks regarding MIDs: "If you define MIDs as including devices with integrated WiFi, such as the mylo or iPod Touch, then yes, I believe that market will develop. With that said the overall cell phone market will still remain much larger."
The author thanked the panelists and the audience (35 attendees) for their participation in this very enlightening and informative workshop. We also thanked IEEE SECON for sponsoring the workshop in conjunction with their annual conference.

Addendum: Critical issues for mobile network operators
At a VoiceCon- Spring 2008 panel on LBS’s, the critical issues for mobile network operators were identified:
  • Security and privacy: authentication, authorization, encryption, etc.
  • Application integrity – to prevent apps from harming network or users
  • Power dissipation and utilization
  • Flexibility and customizability
  • Integration of new value added services (e.g. location)
  • Billing: What to charge for a new service? Flat rate vs. Usage based (metered)
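The flat-rate vs. metered billing question in the last bullet comes down to a breakeven point in usage. The toy comparison below illustrates the trade-off; the prices are invented for the example and are not drawn from any operator's tariff.

```python
# Toy flat-rate vs. usage-based (metered) billing comparison for a new
# mobile service. All prices below are invented for illustration.

flat_monthly = 10.00   # hypothetical $10/month unlimited plan
per_use = 0.25         # hypothetical metered price per use

def cheaper_plan(uses_per_month):
    """Which plan costs the subscriber less at a given usage level?"""
    metered = per_use * uses_per_month
    return "flat" if flat_monthly < metered else "metered"

# Below this usage level, the metered plan is the better deal.
breakeven = flat_monthly / per_use
print(f"Breakeven at {breakeven:.0f} uses/month")
for n in (10, 40, 100):
    print(n, cheaper_plan(n))
```

The operator's dilemma is that flat rates are simple to market but leave heavy users under-charged, while metering captures usage but can deter adoption of an unproven service.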
Postscript: Location Based Social Networking from Verizon Wireless
On June 26, 2008, Verizon Wireless announced that its location based social networking service- known as loopt – is now available to its subscribers. The original announcement this past March anticipated an April launch for the service, but according to Verizon Wireless spokesman Jeffrey Nelson, “technical issues, pricing issues and running the application through some traps before launch,” caused the delay. Regarding security and privacy, Nelson said: "We’ve strengthened the privacy capabilities even further. We will be pinging customers on a regular basis to let them know their loopt account is active and that they can be tracked."
Loopt’s CEO Sam Altman had previously stated that privacy had been one of the biggest issues facing the uptake of location-based mobile social networking, and that solving it is a key step toward achieving inter-carrier LBS services. Evidently privacy is no longer a problem, at least not for Verizon Wireless.

Obscenity on ABC, Advertising on PBS, H.R. 6320 – Regulation of Broadband TV on the Horizon?

The inspiration for this article was a decision a few months ago by the FCC to slap a huge fine on ABC affiliates for violating obscenity rules in airing an episode of “NYPD Blue”. The FCC Order, although extremely descriptive about the nudity, sparked my curiosity.

"…The camera shot includes a full view of her buttocks and her upper legs as she leans across the sink to hang up her robe…"

In years past, I would have been in the dark. Thanks to the power of Internet video, however, I was able to find a copy of the video within 30 seconds and view for myself what the FCC considered obscene. The image below is a screen capture of one of the tamer shots. Click here to see the video, assuming it hasn't been removed. In my opinion, it crosses the line.


The point is, what was obscene on broadcast television does not violate FCC rules on broadband television; at least not yet. This skirting of broadcast television rules could be the biggest impact that broadband TV has on the future of television, as it provides a new medium for the broadcast networks to create derivative products (e.g. an even coarser version of “Family Guy”). Things like limits on advertising to children, hard liquor advertisements, the fairness doctrine, nudity and swearing are beyond the FCC’s current scope for regulating video over the Internet. 

Another piece of evidence that the television business model and rules are changing is Hulu’s deal with PBS to put advertisements in front of its programming (thanks to Viodi View reader Peggy for pointing this out). This would have been a huge deal 20 years ago, when PBS was available only in a broadcast medium.

Where it will get interesting is when politicians start hearing complaints from constituents about “FCC rule violations.” Of course, the FCC won’t have rules (although it will probably try to figure out how to expand its powers), so Congress will get involved and it will get very political. Knowing how long it generally takes the Federal Government to act, this sort of political uproar may still be a long way out (it took 5 years for the FCC to rule on the aforementioned NYPD obscenity case). 

Just as I was about to publish this article, I saw today’s issue of the OPASTCO 411, which summarized the Markey Bill (H.R. 6320). This Bill is an indication that the future may be closer than I thought. H.R. 6320 calls for captioning and providing emergency alert info for video over the net, as well as adding requirements for other IP devices. Clearly, this is a grab to regulate the Internet, and it probably will not be successful in this election year, but it is the start of what could be a very long and interesting fight.

To avoid future legislation, the Broadband TV industry should adhere to current broadcast rules as much as possible and, as needed, set new guidelines.   This may require the various industry players to reach across ecosystems and proactively work together. 

The DTV Transition – At What Cost?

[Author’s Note:  Thank you very much to David Irwin, Director of the Communications Law Institute at The Catholic University of America, for his review and suggestions for this article]

Click here to learn the details of the DTV Transition from the official government web site. Much has been made about the recent 700 MHz spectrum auction and the billions of dollars potentially raised for the U.S. Treasury, but this is just one part of a complex equation related to spectrum management and economics that may be implicitly costing U.S. citizens much more than they are receiving.

For example, there is the opportunity cost of the new digital broadcast spectrum, given away to existing broadcasters by the government, that is worth untold billions of dollars. Unfortunately, this article is probably about 10 years too late to effect a change, but maybe it will serve as a warning for future generations.

Prior to auctions, back in the early days of the FCC, spectrum was regarded almost like land during the homestead days of the eighteen hundreds.  That is, entities were given spectrum in exchange for building out the infrastructure, providing certain public goods and accepting regulations and restrictions.  Where more than one entity sought spectrum in a given locale, each spectrum application being mutually exclusive to the other, evidentiary-type hearings were held to determine which applicant would best serve the “public interest;” but, at the end of the hearing process a spectrum license was simply awarded by the government to the winner.

Much like homesteading, this policy lessened the risk for entrepreneurs and was a catalyst that expedited the build-out of the radio, television and original cellular infrastructures. These build-outs evidenced the inherent value of the spectrum and, as result, Congress turned to spectrum auctions as the prevalent way to ensure that the public received a return on its spectrum assets; that is, except for the give-away of digital television broadcast spectrum.

Giving away spectrum may have made sense in the 1990s, when Congress and the FCC began laying the foundation for what has become known as the 2009 digital television transition. In February 2009, all legacy television stations (with the exception of low-power stations) will turn off their analog transmitters and thereafter broadcast only digital signals; this may include high-definition TV.

The intent of Congress, heavily lobbied by the broadcast industry, was to free up the analog spectrum for other uses, as well as ensure that the U.S. remained competitive with other nations by having a digital broadcast infrastructure that supported HDTV. Like in the 1940s, the Federal Government essentially gave away spectrum to broadcasters in return for a digital build-out and exchange of bandwidth that had been used for analog signals. There have been significant and unanticipated changes since the DTV legislation was passed in the 90s, including:

  • The continued growth of cable, telco video distribution via fiber optics and copper DSL, and DBS operators, such that the number of households receiving off-air broadcasts is estimated at 10% by Jeff Zucker, CEO of NBC-Universal. He suggested at NATPE 2008 that the number of households receiving off-air only will drop to approximately 5% after the transition.
  • The transition of the Internet into a video distribution medium is a true unknown — a “sleeper” in this equation.  Broadcasters and television networks are “re-inventing” themselves, embracing the internet, proving by their own actions and plans that broadband is a viable distribution outlet for video.
  • The success of unlicensed WiFi spectrum also indicates that spectrum does not have to be licensed in order to have value. On the horizon is WiMax, which may be thought of as WiFi on steroids; WiMax is a potential threat to cable, satellite and telcos. In all events, it appears that device-regulated spectrum (where network-aware devices transmit data so as to minimize interference) may displace the traditional agency-licensed and regulated model.
  • Improvements in video compression, which frees up bandwidth for uses that were probably never anticipated by Congress.
  • The success of auctions in allocating spectrum.

The transition to digital television broadcast required a huge investment by the broadcast industry and probably never would have been justified by better picture quality alone (although, the broadcast industry did make feature upgrades to include color and stereo in earlier days, but these didn’t involve such an extensive infrastructure upgrade).

That is, there is probably no new revenue for the broadcaster who merely replaces a standard-definition digital signal, albeit better than analog, with HDTV programming. It is thus understandable why broadcasters are looking for ways to monetize their investments in digital technology, including the use of the bandwidth not utilized for their primary digital and/or HDTV signal, by:

  • Creating mini-cable systems by developing new content to go along with their primary channel
  • Potentially creating “pay versions” of popular programming (several years ago, one industry pundit suggested networks could create two versions of the same show; a tamer version for general broadcast and a wilder version that people would be willing to pay for as a premium service).
  • Leasing out bandwidth to third-parties that would essentially act as aggregators
  • Offering a mobile video solution that would extend their service offerings onto personal video devices.

Policy makers and broadcasters believe that these new applications of the broadcast bandwidth will have value to some consumers.  But, there are, as noted above, real costs, as well as opportunity costs that need to be considered.  Some of the real costs to the DTV transition include:

  • The $1.8 B in coupons provided by the Federal Government to consumers to pay for digital set-top conversion boxes that will let analog televisions play digital broadcast signals.
  • The costs that my telco friends and others had to spend as a result of an FCC mandate to publicize the digital TV transition.
  • The costs associated with cable systems and telcos having to support standard definition, long after the broadcasters make the switch to digital.
  • The big cost is probably the opportunity cost, as the digital spectrum given away would have value that could be realized explicitly through an auction process or as an unlicensed public good.  Based on the recent $19.6 B expected from the auction of the 700 MHz spectrum, the remaining 200+ MHz of spectrum could be well worth many multiples of the 700 MHz spectrum bids.
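The opportunity-cost claim in the last bullet can be put on a back-of-the-envelope basis. The $19.6 B and "200+ MHz" figures come from the text above; the roughly 60 MHz size of the 700 MHz auction block is an assumption, and a straight dollars-per-MHz extrapolation ignores that different bands have different propagation value.

```python
# Back-of-the-envelope extrapolation of the DTV spectrum's opportunity cost.
# The $19.6B and 200 MHz figures come from the article; the ~60 MHz size of
# the 700 MHz auction block is an assumption, and linear $/MHz scaling
# ignores propagation differences between bands.

auction_proceeds = 19.6e9      # 700 MHz auction proceeds, from the article
auction_bandwidth_mhz = 60     # assumed size of the auctioned block
dtv_bandwidth_mhz = 200        # "200+ MHz" of broadcast spectrum, from the article

per_mhz = auction_proceeds / auction_bandwidth_mhz
implied_value = per_mhz * dtv_bandwidth_mhz

print(f"~${per_mhz/1e6:.0f}M per MHz -> implied DTV spectrum value ~${implied_value/1e9:.0f}B")
```

Even under these rough assumptions, the implied value is several multiples of the 700 MHz auction proceeds, which is consistent with the "many multiples" claim above.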

As much as I would like to be able to present a silver bullet that would change the situation, I doubt there is anything that could be done politically or practically to improve the value of the DTV transition for the U.S. taxpayer (it is our spectrum).  After making such a huge investment and with rules in place for so long, it simply isn’t fair to the broadcast industry to change the rules of the game at this late date.  Economists will suggest that there is nothing like political uncertainty to impede business investment and it would be bad precedent to make significant changes to DTV.  The time to make changes was 10 years ago.

It will be interesting to see how economic historians view the digital TV transition. Hopefully, they will learn from it and be able to influence politicians and regulators the next time we have the opportunity to make such a historic shift in our communications infrastructure.