Rural America Needs Advanced Services & Competition – Part 1

The above title is by no means an original thought; this belief has been documented in policy statements and legislation at the Federal and individual state level many times over the years.  Technology and other social factors have resulted in many changes to telecommunications policy for the common good of our country.  A significant number of the social goals and policy revisions were initiated to ensure that people, communities and economic development in Rural America were not left behind and that all Americans would realize the benefits of advancements in technology.  Today, our government, via laws, rules and policies, has stipulated that open & fair competition is right for all of America.  But before we discuss the current landscape, a brief review of how we got here would be helpful.

Bell System logo, 1921
Image courtesy of Wikimedia Commons

Alexander Graham Bell patented the telephone in 1876; American Telephone & Telegraph (AT&T) and the associated Bell companies were formed and became known as the Bell System.  Generally speaking, the local Bell companies provided the local connections in mostly larger cities and AT&T provided the long distance lines.  In addition to the Bell System, various non-Bell telephone companies (independent or cooperative companies) were created to offer local service where the Bell companies were not offering service; these mostly rural areas included farms, ranches, very small towns and even resort communities.

Because there was some concern that building facilities to serve rural areas was not an attractive economic decision, the government stepped in to offer financial assistance to ensure quality service was offered in rural America.  The Rural Electrification Administration (REA), now the Rural Utilities Service (RUS), was created to assist the interests of consumers and communities in areas where adequate service was not provided.  Because this requirement for financial assistance in rural areas continues today, the RUS remains a vital part of economic development in rural America.

Lots of Wires, Image courtesy of Wikimedia Commons
One of the earliest government involvements in the provision of telephone service was a determination that the American common good would be best served if the telephone service marketplace were not open to all competitors in all areas.  Telephone service at that time was provided by above-ground wire and poles; policy makers were concerned about the safety, economic and environmental impacts of numerous above-ground facilities in urban areas alongside very little service in less populated areas.

This concern was highlighted by the strong belief that many companies would flock to more densely populated areas, creating havoc for consumers; but few, if any, companies would want to offer service in less populated, more costly rural areas.

For these and other reasons, it was determined that the common good would be best served by replacing open “competition” for the provision of telephone service with “regulation” and by establishing local monopoly franchised service areas.  One company was given the right to be the only company in a designated franchised area to offer telephone service, in exchange for operating under the rules and regulations (including the rates for services) of a regulatory organization: a federal agency for services between the states, and state agencies for services within each state jurisdiction.

This “regulated one-company monopoly” model appeared to satisfy consumers for a number of years, but advancements in technology (i.e., new products and services) were extremely slow.  Only the limited products and services offered by the company providing telephone service in that area were available to those consumers.

Another major development in government involvement was a policy that became known as “Universal Service”: an understanding that the common good for all America was best served when more telephones were on the network.  The idea…every new telephone line added to the network increased the value of every other telephone line on the network (the Network Effect).  This proposition resulted in revised rate structures, revenue sharing between companies and subsidies between services.  It led the way to subsidies between toll & local service, business & residential services and urban & rural services.  These defined subsidies, along with government financial assistance, fostered expanded economic growth and consumer satisfaction in rural areas throughout America.  Telephone service penetration grew substantially throughout the country.
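The Network Effect lends itself to simple arithmetic.  The sketch below (illustrative only; it is not drawn from any policy document) counts the distinct pairwise connections a network of n lines makes possible, in the spirit of what later came to be called Metcalfe's law:

```python
def potential_connections(n):
    """Distinct two-way connections possible among n telephone lines (n choose 2)."""
    return n * (n - 1) // 2

# Adding the 101st line to a 100-line network creates 100 brand-new
# possible connections -- every existing subscriber gains one.
print(potential_connections(100))                               # 4950
print(potential_connections(101) - potential_connections(100))  # 100
```

Each new line thus raises the value of every line already on the network, which is the economic logic behind the subsidies described above.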

Throughout the earliest stages of the development of telephone service in this country, policy makers continued to believe that a regulated monopoly with no or limited competition was beneficial for the American consumers.  However, some consumers and other groups began to believe that this type of structure or model was stifling the development of new products and services.  These groups started to expound on the idea that, as long as a company was “guaranteed” all customers in a given area, that company would do very little to satisfy customer demand in that area.  It was clear that consumers wanted new products and services…they wanted choice and competitors wanted to enter the market.  These underlying beliefs brought about a national interest in revisiting the real benefits of a regulated monopoly for telephone service.

Carterfone cradle at the Computer History Museum
Carterfone adaptor made land-wireless communications possible. Image Courtesy of Wikimedia Commons
It began in the consumer product arena.  The regulated companies had stipulated that only their own company-provided equipment could be connected to the public switched network.  However, many new products (such as the Hush-a-Phone and the Carterfone) were being introduced into the market, and consumer demand appeared to be growing.

The regulated companies argued that the newer products would harm the public network…the use of one of these products by one customer could have an adverse impact on the service of another customer.  After many court and regulatory battles, “customer provided equipment” was permitted, connection rules (including standard jacks) were established and telephone service went on with little or no problem.  The fear that competition would degrade the network proved unfounded, and consumers began to enjoy a plethora of new end-user products.

The interests of all American consumers started to become the primary measuring stick utilized by policy makers & regulators when discussing issues and making decisions affecting the provision of telecommunications.

The next area ripe for the introduction of competition was the long distance segment of the telecommunications marketplace; but it proved to be much more complex, requiring more time to implement.  It started slowly, in the specialized services arena.  Some new companies (such as MCI, SBS, etc.) were offering private services to companies with large communications needs.

These companies wanted to offer any spare capacity to the general public.  The regulated monopolies objected; they believed that the public offering of this type of communications service was exclusively their business and that these upstart companies should cease offering these competitive services.  Again, numerous court and regulatory battles ensued, and the policy makers wrestled with the balance of competition versus regulation.

It was clear that the true benefits (economic and quality) of technological advancements could only be realized if these new market entrants with alternative products and services were permitted to participate in the marketplace…for all consumers…in all areas.

One particular court battle was a very long antitrust action against AT&T and the Bell companies.  After extensive arguments, the parties agreed to an unprecedented settlement of monumental proportions for the telecommunications world.  It was announced on January 8, 1982 and scheduled to become effective January 1, 1984.  It would take almost two years to carry out the stipulated agreement, set down in a “Plan of Reorganization” which outlined the corporate separation; distribution of assets; definition of service areas; interconnection arrangements; assignment of personnel; establishment of contracts; etc.

An image depicting the 7 Baby Bells that were formed with the divestiture of AT&T.
AT&T Splits Into Long Distance and the “Baby Bells” Image Courtesy of Cellstream

In summary, it separated this enormous American corporate giant along the lines of what was perceived as competitive activities (AT&T long distance) and activities that, at that time, were not considered to be competitive (the local Bell operating companies).  AT&T, with a few subsidiaries such as Bell Labs, Western Electric, etc. (which were more aligned with competitive activity), went in one direction.  And the twenty-two local Bell Operating Companies were separated and reorganized into seven Regional Bell Companies providing local telephone exchange service in specific areas.

AT&T was immediately thrown into a competitive world, having to formally “interconnect” with its earlier corporate subsidiaries and others.  The Bell Operating companies would continue their local regulated monopoly services with all the requirements (Carrier of Last Resort, etc.) associated with that position.  But the handwriting was on the wall…the word of the day was competition, and even the individual Regional Bell Companies soon looked at the other regions (and everyone else in the marketplace) as competitors.

Bottom line: the telecommunications world was turned upside down.  Local Access and Transport Areas (LATAs) were designed…“Access Charges” were added to the industry glossary of terms…organizations were created, such as NECA, ECSA, etc., to assist in the management of interconnecting carriers.  The complexity of this event was compounded by the number of other participating carriers & companies, such as the 1,000+ non-Bell independent and cooperative exchange companies, the new long-distance carriers, wireless providers and the equipment manufacturers.

This far-reaching action was another step in an obvious movement to a competitive operating environment for the entire telecommunications landscape.  Now that the Bell companies could control their own destiny, they began to venture into other areas and test their freedoms; they could not offer competitive services within their own territory, but they could offer competitive services outside their individual regional territory.  Wireless or cellular service was of particular interest, and the Bell companies were successful in grasping a major foothold in that arena.

[Click here to read part 2 of Mr. South’s article which discusses the post 1984 world and its implications, particularly on rural America.]


Gene South

Gene R. South Sr. is a telecommunications and broadband professional with 45 years of experience, including positions as EVP for Panhandle Telephone Cooperative in Guymon, OK; CEO / GM of Lakedale Communications in Annandale, MN; and currently the V.P. & Director of Governmental Affairs for Lake Communications in Two Harbors, MN.  Mr. South served as Chairman of the Board of USTA, RTFC and MART; he also has held Board memberships for OPASTCO and MTA.  In addition, he has testified before Congress and state legislatures.

Rural America Needs Advanced Services & Competition – Part 2

[Editor’s note: Mr. South’s first article provided a brief history of AT&T as a regulated monopoly and the forces that drove the 1984 break-up of “Ma Bell”. Part 2 examines the aftermath of the break-up, particularly its impact on telecommunications services to rural areas.]

Click here to read Rural America Needs Advanced Services & Competition – Part 1

AT&T was immediately thrown into a competitive world, having to formally “interconnect” with its previous corporate subsidiaries and others.  The Bell Operating companies would continue their local regulated monopoly services with all the requirements (Carrier of Last Resort, etc.) associated with that position.  But the handwriting was on the wall…the word of the day was competition, and even the individual Regional Bell Companies soon looked at the other regions (and everyone else in the marketplace) as competitors.

Bottom line: the telecommunications world was turned upside down.  Local Access and Transport Areas (LATAs) were designed…“Access Charges” were added to the industry glossary of terms…organizations were created, such as NECA, ECSA, etc., to assist in the management of interconnecting carriers.  The complexity of this event was compounded by the number of other participating carriers & companies, such as the 1,000+ non-Bell independent and cooperative exchange companies, the new long-distance carriers, wireless providers and the equipment manufacturers.

This far-reaching action was another step in an obvious movement to a competitive operating environment for the entire telecommunications landscape.  Now that the Bell companies could control their own destiny, they began to venture into other areas and test their freedoms; they could not offer competitive services within their own territory, but they could offer competitive services outside their individual regional territory.  Wireless or cellular service was of particular interest, and the Baby Bell companies were successful in grasping a major foothold in that arena.

What was a closed marketplace with a limited number of participants changed into a semi-open market with many providers (telcos, long distance companies, CATV, wireless, internet providers, satellite, etc.), and you could not tell a player without a scorecard…and even then it was tricky.  The convergence of services over the facilities provided by some companies compounded the complexity: Digital Subscriber Line (DSL) internet access service over the voice-grade copper facilities of telcos; Voice over Internet Protocol (VoIP) provided by internet service providers; etc.

It was becoming abundantly clear that there would be some winners and some losers.  What happened next was a significant number of mergers and acquisitions…each an attempt to gain a stronger position in the market by increasing a company's footprint and enhancing its product line offering.  Even the very large Regional Bell companies were not excluded from consolidation.

The marketplace was partly competitive and partly regulated…the lines of demarcation were very fuzzy.  To say it was chaotic would be an understatement.  Regulators were constantly changing rules, putting out fires with little long-term direction for planning purposes.  It was evident that something had to be done to protect American consumers and ensure that all consumers would realize the numerous benefits of this technological explosion.

Congress was compelled to step in to attempt to crystallize the telecommunications landscape for everyone: regulators, companies and consumers.  The effort would prove to be herculean, consuming lengthy discussions, hearings, comments, arguments and positions from all interested parties.  After various draft bills, Congress produced the Telecommunications Act of 1996, which was signed into law by the President.  The stated objective of the law was:

“To promote competition and reduce regulation in order to secure lower prices and higher quality services for American telecommunications consumers and encourage the rapid deployment of new telecommunications technologies”.

The message was crystal clear: advanced services were critical to the economic growth of America, and competition was the vehicle to deliver those benefits.  Where before the signals had been cloudy and piecemeal, America was now focused on a direction offering the greatest benefit to all consumers.  This very significant congressional action sent the message that all future decisions would be measured against what is best for all American consumers.

This national debate went far beyond just Plain Old Telephone Service (POTS).  Historical discussions had dealt with, “Who is going to provide POTS to a certain community?”  Now it is, “What provider can offer me all my telecommunications, Internet & broadband services…today?”  This new debate continued with, “What provider can deliver the services required to assist my community with education, health care, security, etc.?”

It will not only be based on who the provider is (telcos, ISPs, CATV operators, satellite providers, private companies, etc.), how it is financed (privately or publicly funded), or who manages the operation; it will be decided on the ability to offer the most advanced services at the best prices in the timeliest manner to serve the consumer and the community.

If one company is not in a position to offer satisfactory responses to these issues for the community and its consumers, then these services will be provided by an organization (or organizations) that steps forward and is ready, willing and capable of the task.

Economic development and consumer interests are the prime movers in these current debates…

The entire industry came under a magnifying glass: externally, by media interests, Congress, regulators, consumer groups and policy makers; internally, industry players studied the market for more self-serving reasons.

Reports indicate that approximately 100 million Americans do not have broadband in their homes.  Internationally, America has fallen behind other countries in the deployment of broadband services.  Domestically, consumers continue to demand advanced services and faster speeds; educators want better service (especially in rural areas); health care providers indicate that enhanced services could improve health care (especially in rural areas).  These types of reports are getting significant media attention, and many policy makers continue to express concern.

A June 2013 snapshot of broadband wireline service availability of at least 3 Mb/s.
3 Mb/s or more BB Availability – US Broadband Map

Because Congress, the FCC, the NTIA and state agencies began to place a focus on telecommunications and advanced services, various activities were initiated to investigate and analyze the current state of affairs.  Various studies were undertaken….from a National Broadband Mapping project… to a study of where we are today and what is needed for the future.

In 2009, Congress charged the FCC with developing a National Broadband Plan to ensure every American has access to broadband capability.  The FCC conducted a hearing in November 2009 to discuss specifically identified “barriers” that exist in formulating a new national broadband policy plan.  One major barrier was the Universal Service Fund.  The FCC Task Force believed that…

“the fund should also be used to help subsidize the cost of deploying broadband in rural areas.”

A second barrier that was identified by the FCC Task Force was….

“the fact that broadband service providers tend to favor higher-income regions in more populated areas over low-income areas.  The data suggests that many low-income people in these parts of the country are offered only one broadband service option. The data also suggests that these consumers who have only one option tend to pay higher prices for service.

What this means is that lower-income people, who have less disposable income, are often the ones forced to pay higher prices, while people who have more money pay lower prices for service.

Deployments in rural areas are often affected by the high cost of building infrastructure and providing service. The task force noted that “middle mile” costs are almost three times higher than general network operations costs. This high cost is often a serious barrier to rural broadband deployments, the group said.”

The FCC Task Force conducted an extensive analysis and investigation into what would be required to implement a national broadband policy that would provide high-speed internet access to every American.

The results of the FCC efforts were documented in a comprehensive report unveiled on March 16, 2010 entitled:

“Connecting America: The National Broadband Plan”

The Plan stipulated that:

Government can influence the broadband ecosystem in four ways:

  1. Design policies to ensure robust competition and, as a result, maximize consumer welfare, innovation and investment.
  2. Ensure efficient allocation and management of assets government controls or influences, such as spectrum, poles, and rights-of-way, to encourage network upgrades and competitive entry.
  3. Reform current universal service mechanisms to support deployment of broadband and voice in high-cost areas; and ensure that low-income Americans can afford broadband; and in addition, support efforts to boost adoption and utilization.
  4. Reform laws, policies, standards and incentives to maximize the benefits of broadband in sectors government influences significantly, such as public education, health care and government operations.

The plan also recommended that the country adopt the following six Goals:

  1. At least 100 million U.S. homes should have affordable access to actual download speeds of at least 100 megabits per second and actual upload speeds of at least 50 megabits per second by the year 2020.
  2. The United States should lead the world in mobile innovation, with the fastest and most extensive wireless networks of any nation.
  3. Every American should have affordable access to robust broadband service, and the means and skills to subscribe if they so choose.
  4. Every American community should have affordable access to at least one gigabit per second broadband service to anchor institutions such as schools, hospitals, and government buildings.
  5. To ensure the safety of the American people, every first responder should have access to a nationwide, wireless, interoperable broadband public safety network.
  6. To ensure that America leads in the clean energy economy, every American should be able to use broadband to track and manage their real-time energy consumption.
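For a sense of scale on Goal No. 1's speed targets, a back-of-the-envelope conversion (illustrative only; real-world throughput varies with protocol overhead and congestion, and none of these figures come from the Plan itself):

```python
def download_seconds(file_bytes, mbps):
    """Ideal seconds to transfer file_bytes over a link of mbps megabits/second."""
    bits = file_bytes * 8          # bytes -> bits
    return bits / (mbps * 1_000_000)

# A 1 GB (10**9 byte) file at the Plan's 100 Mb/s download target
# versus its 50 Mb/s upload target:
print(download_seconds(10**9, 100))  # 80.0 seconds down
print(download_seconds(10**9, 50))   # 160.0 seconds up
```

At the 3 Mb/s availability threshold shown in the broadband map above, the same transfer would take roughly 45 minutes, which is why the Plan's targets represented such a large jump.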

The release of the National Broadband Plan (NBP) received significant media attention and great anticipation from the entire telecommunications, Internet & broadband segments of the marketplace.  Market participants reviewed all their plans and strategies to measure any impacts on their operations.  Existing broadband providers studied their markets to make investment decisions on the most attractive locations to allocate resources; i.e., where they needed to move quickly and where they could delay deployment.  These decisions could be based on a variety of factors such as density, cost to install facilities, current competitors in the area and where they believed they had a sense of control over that market area.

New entrants in the market conducted similar analyses, but they were starting from a position of limited information; they had the Broadband Mapping information but lacked consumer demand and cost data.  They did believe that time was critical…“first in the market, and so on…”.

Some encouraging news was that necessary financing capital was available.  The President had made universal broadband access a key goal for America.  Economic Stimulus money was available in the form of grants and loans to approved providers.  Numerous applications were prepared and submitted for approval by existing providers and new entrants.

The other good news for the American consumer was that individual communities, in the form of local municipal or county organizations, became well aware of the importance of advanced services to their constituencies and to economic growth.  In the past, when consumers complained about the lack of available services in the area, local government officials believed that their hands were tied.  Now they saw what was taking place in other parts of the country and around the world and said, “Why not us, and why not here?”

This community awakening was contagious, and many community activists and organizers (including the general public, businesses, schools, and medical institutions) joined this very active movement.  The battle cry was, “What can we do to secure advanced services for our community?”  They did not want their taxpaying consumers to become “second class citizens”, and in certain situations the communities had run out of patience with being last in line for advanced services from the competitors providing service in the area.  Municipal and county officials believed that they were doing their job; this is one of the reasons why their constituencies trusted them with the responsibility to protect their interests.

The municipalities and / or counties established organizations, reviewed their existing situations, analyzed alternatives, met with constituencies, sought voter approval (if warranted), developed a strategy, prepared documentation, filed paperwork, sought financial assistance (grants or loans from the federal government), established contracts with consultants and construction companies…and scheduled installation.

Of course, this activity met with some opposition from competitors in the area, who believed that they had a right to that area.  The answer to that issue is simple: if the perceived competitors had been providing an acceptable level of advanced services in the area, it would not have been necessary for the municipality or county to take that action.  The fact is that a lot of existing companies had assumed “ownership” of the area and believed that these consumers were obligated to wait until the competitor was ready to upgrade facilities in the area to provide advanced or broadband service.

Well, contrary to their opinion, today’s consumers just do not want to wait indefinitely…and educators, health care administrators, business operators, police forces and economic development councils do not want to wait at all…especially in Rural America!

Gene South

Gene R. South Sr. is a telecommunications and broadband professional with 45 years of experience, including positions as EVP for Panhandle Telephone Cooperative in Guymon, OK; CEO / GM of Lakedale Communications in Annandale, MN; and currently the V.P. & Director of Governmental Affairs for Lake Communications in Two Harbors, MN.*  Mr. South served as Chairman of the Board of USTA, RTFC and MART; he also has held Board memberships for OPASTCO and MTA.  In addition, he has testified before Congress and state legislatures.

*Lake Communications, a private company, is building and operating Lake Connections for Lake County. Lake Connections is a local fiber-optic broadband provider owned by Lake County and formed to bring High-Speed Internet, Digital TV, and Voice services to Lake County and Eastern St. Louis County in northeastern Minnesota starting in 2014. 



All-IP Network Transition Plan at FCC's Jan 30th Open Commission Meeting


During his January 8th speech at the Computer History Museum (CHM), FCC Chairman Tom Wheeler told the CHM audience that the U.S. was in a transition to a “4th Network Revolution” that would be led by a transition to an “all-IP” network.  The 4th Network is actually a multi-faceted revolution based on IP packet communications (for voice, data and video) replacing digital circuit switching and analog transmission.  Communications protocols are moving from circuit-switched Time-Division Multiplexing (TDM) to IP packet switching.  At the same time, 3G and 4G wireless access networks are increasingly prevalent, empowering consumers to connect at the place and time of their choosing.
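The TDM-versus-packet distinction behind this transition can be sketched in a few lines.  The toy below is purely illustrative (it is not any carrier's implementation): TDM dedicates a fixed time slot to each call in every frame, whether or not the call has anything to send, while packet switching transmits only when there is data, with each packet carrying its own addressing.

```python
def tdm_frame(calls, slots=4):
    """Build one TDM frame: each call owns a fixed slot, used or not.
    calls is a list of (call_id, data-or-None) pairs, one per slot position."""
    frame = [None] * slots
    for slot, (_call_id, data) in enumerate(calls):
        frame[slot] = data  # the slot position itself identifies the call
    return frame

def packetize(calls):
    """Packet switching: only active calls emit packets; each packet
    carries explicit addressing instead of relying on slot position."""
    return [{"src": call_id, "payload": data}
            for call_id, data in calls if data is not None]

# Two of three calls are silent; TDM still reserves their slots,
# while the packet network sends a single addressed packet.
calls = [(1, None), (2, "hello"), (3, None)]
print(tdm_frame(calls))   # [None, 'hello', None, None]
print(packetize(calls))   # [{'src': 2, 'payload': 'hello'}]
```

The efficiency gain from statistical multiplexing of packets is one reason voice, data and video can share a single all-IP network.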

Wheeler said, “The transition to an all-IP network is important in its own right, but it also is important because it demonstrates that the Commission (FCC) will adapt its regulatory approach to the networks and markets of the 21st century.”

The FCC Chairman then said that no one would use a network without being able to make a 911 phone call (to report emergencies and seek help from law enforcement). That implies that the all-IP network must support 911 calls in a consistent manner.

FCC Chairman Tom Wheeler
Image courtesy of the FCC Web site

Wheeler told the CHM audience:

“The best way to speed technology transitions is to incent network innovation while preserving the enduring values that consumers and businesses have come to expect. Those values are all familiar: public safety, interconnection, competition, consumer protection and, of course, universal access. They are familiar, and they are fundamental.”

Continuing, he said: “At the January 30th Commission meeting, we will invite proposals for a series of experiments utilizing all-IP networks. We hope and expect that many proposed experiments, wired and wireless, will be forthcoming. Those experiments will allow the networks, their users, the FCC and the public to assess the impact and potential of all-IP networks on consumers, customers and businesses in all parts of our country, including rural America.”

All-IP Network Topic at the FCC’s January 30th Open Commission Meeting:

The all-IP network transition will be the number one agenda item at the FCC’s January 30th Open Commission Meeting.  The agenda item, “Advancing Technology Transitions While Protecting Network Values,” is all about the transition to an all-IP network: “The Commission will consider a Report and Order, Notice of Proposed Rule making, and Notice of Inquiry that invites diverse technology transitions experiments to examine how to best accelerate technology transitions by preserving and enhancing the values consumers have come to expect from communication networks.”

In a November 19, 2013 blog post, Wheeler provided an overview of the all-IP network migration.  He wrote: “The way forward is to encourage technological change while preserving the attributes of network services that customers have come to expect – that set of values we have begun to call the Network Compact.”

Wheeler noted various FCC Commissioner comments in that blog post:

  • “Commissioner Pai said that the FCC should ‘Embrace the future by expediting the IP Transition.’
  • Commissioner Rosenworcel told us that, ‘As we develop a new policy framework for IP networks, we must keep in mind the four enduring values that have always informed communications law — public safety, universal access, competition, and consumer protection.’
  • Commissioner Clyburn has called upon the Commission, ‘To carefully examine and collect data on the impact of technology transitions on consumers, public safety and competition.’”

AT&T Petition and FCC Technology Transitions Task Force are encouraging trials:

On November 7, 2012, AT&T petitioned the FCC to “Launch a Proceeding Concerning the TDM-to-IP Transition,” GN Docket No. 12-353 (AT&T Wire Center Trials Petition).

That document requested the FCC to “open a new proceeding to conduct, for a number of select wire centers, trial runs for a transition from legacy to next-generation services, including the retirement of TDM facilities and offerings” and that “the Commission should also seek public comment on how best to implement specific regulatory reforms within those wire centers on a trial basis.”

AT&T requested that the FCC consider conducting trials where certain equipment and services are retired and IP-based services are offered. These geographically limited trial runs, conducted after a public comment period on how they should be carried out, would help “guide the Commission’s nationwide efforts to facilitate the IP transition.” Such an approach, AT&T notes, will “enable the Commission to consider, from the ground up and on a competitively neutral basis, what, if any, legacy regulation remains appropriate after the IP transition.”

AT&T has set a date of 2020 to retire its TDM network and has been upgrading its IP-based service capabilities in its wireline markets via Project Velocity IP (VIP).  AT&T presented a progress report on Project VIP at the June 2013 IEEE ComSocSCV meeting.  It can be read on pages 3-4 of this article: Telco Tours & Seminars Top ComSoc-SCV Activities.

The FCC also formed a “Technology Transitions Policy Task Force,” which was tasked with moving forward on real-world trials to obtain data that will be helpful to the Commission.  The goal of any trials would be to gather a factual record to help determine what policies are appropriate to promote investment and innovation, while protecting consumers, promoting competition, and ensuring that emerging all-Internet Protocol (IP) networks remain resilient.  The FCC task force is seeking public comment on several potential trials relating to the ongoing transitions from copper to fiber, from wireline to wireless, and from time-division multiplexing (TDM) to IP-based packet-switched networks.

Technology Trials Proposed:

The FCC task force has proposed the following trials related to the all-IP network transition:

  • VoIP Interconnection
  • Public Safety – NG911
  • Wireline to Wireless
  • Geographic All-IP Trials
  • Additional trials: numbering and related databases, copper-to-fiber transition, retirement of copper?

The US Telecom Association was very supportive of such trials, as well as of the previously referenced AT&T petition. In comments submitted on January 28, 2013, the trade organization wrote:

“The idea that the Commission should conduct real-world trials in order to better inform itself as to the technological and policy implications of the IP-transition is a way the Commission can continue its commitment to data-driven policy making. The Commission itself has urged carriers to ‘begin planning for the transition to IP-to-IP interconnection’ and the Commission-guided trials urged by AT&T would facilitate this effort.”

“In particular, the AT&T Petition offers an opportunity for the Commission and state regulators to conduct informative, but geographically limited, trial runs for regulatory reform in discrete wire centers. AT&T correctly notes that such an approach will enable the Commission to consider, from the ground up and on a competitively neutral basis, what, if any, legacy regulation remains appropriate after the IP transition.”

US Telecom’s comments can be read here.

Important Unanswered Issues for an all-IP network:

Transition to an “all-IP” network implies retiring the PSTN/POTS, TDM/circuit switching and all wireless networks other than 4G LTE with VoIP over LTE. That is a huge undertaking that will be incredibly disruptive and take many years, if not decades, in our opinion.  Here are just a few points to ponder about this monumental transition:

  • Telcos and MSOs must universally deploy broadband for wireline VoIP to be ubiquitous. Currently, they make their deployment/build-out decisions strategically, based on reasonable ROI.  As a result, not every area in the U.S. has, or will have, wired broadband.
  • Many rural areas have little or no wireless coverage and certainly not 4G-LTE.  What happens to people who live in those areas, e.g. Arnold, CA?
  • Even if wired or wireless broadband is available in many regions, there is likely to be only one or two network providers at most.  Hence, there is little or no choice in service which is effectively a monopoly. Santa Clara, CA is in the heart of Silicon Valley, yet we now have only two choices for wired broadband – AT&T or Comcast.
  • There is currently no Universal Service Fund/Lifeline or discounted rate (for low income folks) for VoIP service.  Lifeline service is ONLY available for the PSTN/POTS.
  • If an individual or family doesn’t want or can’t afford high speed Internet and/or broadband TV service, then it will most likely be uneconomical for the Telco/MSO to ONLY provide VoIP service over broadband access. This is the case for many poor people and older Americans!
  • Battery backup is required for an all-IP network to make emergency phone calls when power is lost.  There is a substantial monthly charge for a battery backup box for AT&T’s U-Verse VoIP service. An AT&T subscriber must also have battery backup power for the Wi-Fi gateway to keep U-verse services functioning during a power outage.
  • There will be a huge impact on business customers that use digital circuit switched networks if the proposed all-IP changes happen soon in the affected areas or “wire centers.” What if a company’s main or branch office site(s) are located in an all-IP wire center coverage area?  In that case, the business customer would have to give up its digital PBXs or hosted ISDN PRI voice trunks and move to SIP trunks, even though the company is not nearly ready for a total enterprise-wide transition to an IP voice network.
  • What happens to faxes, which are still overwhelmingly based on the analog PSTN and not IP fax? The death of fax has been predicted for over a decade, yet it is still alive and kicking!
  • The transition from the classic PSTN to an all IP infrastructure will mandate the end of Signaling System 7 and the entire infrastructure that supports it. This is a substantial undertaking, the consequences of which are not fully understood. Can SS7-based functions be replicated on a broadband IP-based network? What would be the equivalent of a “voice grade” circuit? Is a SIP connection a functional equivalent for the key functionalities of SS7 switches? What about SMS/texts?
  • The telephone numbering system provides a way for callers served by virtually any service provider in the world to reach one another. What will replace that system has yet to be determined. It surely won’t be an IP address, which is often dynamic and allocated only temporarily to reach IP endpoints.
  • Interconnection and interoperability between IP and TDM networks is a work in progress, for both voice and data.
  • Quality of Service/Reliability/Resiliency is largely unknown for an all-IP network, which would need to scale to replace and reach all PSTN/TDM endpoints. What would constitute an “outage,” and how should “outage” data be collected and evaluated? Here again, battery back-up on power failure would need to be made mandatory and low-cost or no-cost to consumers and enterprises.
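
To make the SS7-versus-SIP question above concrete, here is a minimal sketch of a SIP INVITE, the IP-world request that plays roughly the role of an SS7 ISUP Initial Address Message in call setup. All names, addresses and tag values below are hypothetical placeholders, not a real deployment:

```python
# Minimal sketch of a SIP INVITE request -- the IP-signaling rough analogue of
# an SS7 ISUP IAM (Initial Address Message) used to set up a PSTN call.
# All addresses, branch and tag values are hypothetical placeholders.

def build_sip_invite(caller: str, callee: str, domain: str, call_id: str) -> str:
    """Compose a bare-bones SIP INVITE as a CRLF-delimited text message."""
    lines = [
        f"INVITE sip:{callee}@{domain} SIP/2.0",
        f"Via: SIP/2.0/UDP client.{domain};branch=z9hG4bK776asdhds",
        f"From: <sip:{caller}@{domain}>;tag=1928301774",
        f"To: <sip:{callee}@{domain}>",
        f"Call-ID: {call_id}",
        "CSeq: 314159 INVITE",
        "Content-Length: 0",
    ]
    return "\r\n".join(lines) + "\r\n\r\n"

msg = build_sip_invite("alice", "bob", "example.com", "a84b4c76e66710")
print(msg.splitlines()[0])  # the request line of the INVITE
```

The point of the sketch is that SIP is plain-text, endpoint-driven signaling, whereas SS7 is a binary protocol run inside the carrier network; whether one can fully replicate the other's functions (routing databases, SMS, lawful intercept hooks) is exactly the open question the bullets above raise.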

For sure, the above issues will challenge equipment vendors, regulators, business and consumers. We think the transition from PSTN/TDM/digital circuit switched to an all-IP packet network will take much, much longer than many expect.

Google Fiber – A Step Function Connectivity Improvement

A step function improvement in capability is how Milo Medin described Google’s Kansas City fiber project at the February 13th IEEE ComSoc meeting in Santa Clara. That huge improvement in customer experience is in contrast to the incremental gains of MSO [Multiple System Operator] and telco broadband networks, which have much lower access speeds.

Picture of a boom truck with a technician pulling fiber on an existing utility pole line for Google.
Image Courtesy of Google

Medin, who is VP of Access for Google, described a Gigabit/second fiber network that eliminates the bottleneck between home and the cloud, unleashing new applications and devices both in the home and, by implication, throughout a city. Google’s incremental improvements in its construction and operations, its relatively simple offering and its grass-roots marketing are as important to its success as its innovative fiber and home networking technologies.

The story of Google Fiber is pretty well-known by now; Google issued an RFI a couple of years ago, to which 1,100 cities responded in hopes of becoming the test bed for Google’s fiber-to-the-home project. What isn’t so well-known is that the motivation for this was the middling price/bandwidth performance of the U.S. as compared to other countries. Medin, who was a key figure in the early success of cable modems through his affiliation with @Home, suggested that, instead of complaining to government, Google decided to solve the problem. The response of so many communities was a surprise to Google and, according to Medin, an indicator of pent-up demand.

Interestingly, government turns out to be part of the reason for Google’s success, but not in the form of subsidies or tax breaks. The techniques Google and the local city are using to streamline the permit process and work together are saving an estimated 2% of the build cost. Similarly, attachment of fiber to the poles is made somewhat easier because the local utility is municipally owned.

Thanks a Bunch CEQA, No Google Fiber for California

Rules and regulations are definitely shaping where and how the service will develop. Echoing testimony before Congress, Medin suggested that as long as CEQA [California Environmental Quality Act] is in place in its current form, Google Fiber will be virtually non-existent in California (there is an 850-home Google FTTH project on the Stanford campus). The irony that Google’s home state will not see its fiber network anytime soon was not lost on the room full of engineers at the IEEE meeting.

Medin explained that anyone can use CEQA to initiate a lawsuit to block a development. He cited the use of CEQA to delay the rollout of U-verse in San Francisco for years. A linchpin of Google’s approach is achieving scale at a fast rate, and the uncertainty caused by CEQA sinks its business case. And there is a business case: Medin pointed out that the margins on broadband are as high as 95% for incumbent providers in urban areas.

Critics who suggest an infrastructure play is far afield for a “search” company should think again:

  • With YouTube and its other properties, Google already operates one of the world’s largest Content Delivery Networks.
  • With a Fiber to the Home network, outside plant maintenance is almost zero, as compared to a traditional cable or telephone network.
  • With a gigabit connection and customized hardware, the home becomes an extension of their data centers. Although it wasn’t said in his talk, they are sure to have TR-069 or equivalent technology to allow the monitoring of devices within the home. Additionally, network managed WiFi routers integrated into each set-top will deliver a better experience than the home WiFi networks cobbled together by consumers.

Depicted is a Google optical/electrical converter (ONT) that resides in the home. What's not clear is whether or not it has built-in battery back-up.
Image Courtesy of Google

Google is taking an approach that, in some ways, is reminiscent of the old Ma Bell, whereby Google designs its own equipment. From Optical Network Terminals [ONTs] to set-top boxes, Google has created devices that maximize the customer experience [a DVR that records 8 programs at once] and minimize operational cost. Medin indicated that Google has some of the world’s best optics engineers on staff.

Unlike the days of Ma Bell, Google can work with third-party manufacturers to build what they need, allowing them to introduce devices without the overhead burden of owning factories.

Keep It Simple Marketing

As with Google’s other offerings, the company is taking a brand-follows-product approach to its fiber service. That is, the focus is on creating an offer that provides great value and a high customer loyalty/buzz factor, so the service will essentially market and sell itself. Like its search home page, Google is keeping the offer simple. Unlike the Chinese-food menu of a seemingly infinite number of tiers that traditional video and broadband operators offer, Google has only three:

  • 5 Mb/s broadband with a $300 construction charge (may be amortized at $25/month for 12 months) and no recurring charges for 7 years
  • 1 Gb/s broadband with 1 Terabyte of storage for $70 per month
  • 1 Gb/s broadband plus video for $120 per month, with a Nexus 7 as a remote control

These three offerings probably cover 95% of the market. Amortized over 7 years, the 5 Mb/s tier exceeds the National Broadband Plan’s minimum at a very affordable rate of less than $4 per month and serves those who can least afford broadband. The $120 per month tier includes a basic level of video that many people would like. On a dollar per bit basis, the $70 provides great value to cord-cutters, while providing a superior broadband option for those who do not want to switch their existing video providers.
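
The sub-$4 figure above is simple arithmetic; here is a quick sanity check of the 5 Mb/s tier's effective monthly cost over the 7-year term:

```python
# Back-of-the-envelope check of the 5 Mb/s tier economics cited above.
construction_charge = 300.00   # one-time construction charge, dollars
service_years = 7              # no recurring charges for 7 years
months = service_years * 12    # 84 months of service

effective_monthly = construction_charge / months
print(f"${effective_monthly:.2f}/month")  # about $3.57 -- under $4/month

# The optional installment plan ($25/month for 12 months) exactly covers
# the one-time charge:
assert 25 * 12 == construction_charge
```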

Further simplifying the offering is Google’s decision not to offer telephone service as part of the bundle. Although this decision was made for regulatory reasons, it reduces the operational complexity of the network and minimizes the staff required to run it. With one less complex feature to offer, the network implementation is faster. Google probably doesn’t lose much of its Total Addressable Market, given the number of people who are either wireless-only or can easily pick a VoIP service (including Google Voice, which works great with an Obihai VoIP adapter).

A picture of a Google truck at a customer install. Note, the lawn sign promoting the Google Fiber project.
Image Courtesy of Google

Like so many independent, rural operators with their Fiber to the Home deployments, Google is taking a grass-roots approach to marketing. Google uses a crowd-sourcing technique to determine where to build. Instead of taking a top-down approach that focuses on demographics, Google split the Kansas City market into neighborhoods. When a critical mass of people commits to service in a given neighborhood, Google builds out that area, creating what it calls a “Fiberhood”.

Where Google builds is thus dependent upon the citizens of a given neighborhood. Like the way it has marketed its other Internet businesses, Google is betting on and seeding efforts to create viral buzz about its network. One of the more interesting developments is its retail store. Although not mentioned in a recent Wall Street Journal article about Google’s rumored jump into retail, this point of presence offers a physical location to educate potential customers and the local influencers who will help sell their neighbors on the service.

And this approach seems to be working: Medin reported that in some neighborhoods 50% of the residents are committing to Google Fiber prior to build.

Just the Beginning

The Google Fiber Space retail store hints at some of the developments in the future that a gigabit network enables.
Image Courtesy of Google

A gigabit to the home, with its low latency and high speed, brings the compute power of the cloud to the home, particularly when much of the content is cached locally within Kansas City. In a sense, this extends Google’s cloud platform to the home and business, such that performance at the end point is virtually the same as it would be in the data center. Medin hinted that 1 Gb/s is just a start. It is not too difficult to imagine the types of things that could be enabled with this sort of bandwidth, such as:

  • City-wide WiFi or some other wireless solution (Google has received FCC authorization to experiment with various wireless approaches for access). City-wide wireless could offer a low-cost mobile/nomadic solution for its customers. It could also be important for autonomous transit options.
  • Distributed data centers – with 1 Gb/s connections, a Peer to Peer compute network (think connection of those DVRs) becomes a possibility. Why not use that computing power as well and create a virtual data center spread over hundreds of thousands of residences?
  • Like what Google has done with its Android and Chrome operating systems, the fiber network has the potential to enable applications from third-parties. It is possible that some of these apps might even come from existing telecom providers.
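
A rough calculation illustrates the step-function difference a gigabit link makes for scenarios like those above. The file size and access rates below are illustrative assumptions, and protocol overhead is ignored:

```python
# Rough illustration of why a symmetric 1 Gb/s link changes what is practical
# at the network edge: time to move a dataset at several access speeds.
# Rates and file size are illustrative; protocol/TCP overhead is ignored.

def transfer_seconds(size_gigabytes: float, rate_mbps: float) -> float:
    """Ideal transfer time for size_gigabytes (decimal GB) at rate_mbps."""
    bits = size_gigabytes * 8e9        # 1 GB = 8e9 bits
    return bits / (rate_mbps * 1e6)    # Mb/s -> bits/s

# A hypothetical 10 GB backup or raw-video upload:
for label, rate in [("5 Mb/s DSL", 5), ("100 Mb/s cable", 100), ("1 Gb/s fiber", 1000)]:
    t = transfer_seconds(10, rate)
    print(f"{label:>15}: {t/60:7.1f} minutes")
```

At 5 Mb/s the transfer takes hours; at 1 Gb/s it takes under a minute and a half, which is what makes treating the home as an extension of the data center plausible.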

The Google fiber project in Kansas City is on its way to meeting its goal of showcasing how low-latency, Gigabit-per-second bandwidth can transform a city one neighborhood at a time. The fiber is really serving as a last-mile nervous system that connects the seemingly disparate pieces of an ever-expanding Google ecosystem, which is where the change will really take place. Unfortunately for California residents, and particularly ironically for Silicon Valley residents, new Google Fiberhoods won’t be making their way to the Golden State anytime soon.

[Author’s Note: Thank you IEEE for facilitating the excellent program that featured Medin as one of the speakers, and thank you Alan Weissberger for your editing assistance.]

AT&T to Expand U-Verse & IP-DSLAM; Bring Fiber to Commercial Buildings & Cover 99% of US with LTE!

In the most significant announcement since SBC acquired the old AT&T and became “the new” AT&T, the telco giant announced it will spend $14B over the next three years to expand its wireline and wireless networks under its newly coined “Project Velocity” initiative.  The company wants to move to an all-IP network platform, which means it will be phasing out TDM transmission and the PSTN.

Surprising most analysts, AT&T said $6B of that $14B will be spent on wireline upgrades.  In particular:

1.  Residential Broadband via U-Verse, IP-DSLAM and (in some rural areas) LTE:

Traditional U-Verse (TV, high-speed Internet, VoIP) as well as U-Verse IP-DSLAM (high-speed Internet and VoIP, but NO TV service) will be available in many more areas, with Internet access speeds of 75 Mb/sec for most customers and many achieving downstream speeds of up to 100 Mb/sec.

Currently, 32% of AT&T’s customers are covered by (triple play) U-Verse; coverage will increase by one-third, adding 8.5M customer locations to reach 43% by the end of 2015.   U-Verse revenues are running at a $9B annual rate and increasing at a 38.6% annual rate.  AT&T is making “customer retention improvements in all areas.”  (Presumably to avoid losing U-Verse customers to triple play MSO services, e.g. Comcast/Xfinity, which runs commercials enticing U-Verse customers to come back to Comcast for better high-speed Internet and TV service.)

AT&T’s wired IP broadband network will expand to 75 percent of residential customer locations in its 22-state wireline service area by year-end 2015.  Not all of those potential customers will be able to get U-Verse TV service.  Rather, they will be connected to IP DSLAMs to achieve higher-speed Internet access.  (This means AT&T will be jettisoning its ATM over ADSL network in favor of IP/Ethernet transport from customer premises to its initial point of presence, where the DSLAM resides.  Customers currently using ATM over ADSL will have to be retrofitted with new CPE to access U-Verse IP DSLAM or traditional U-Verse.)

AT&T's wireline coverage
Image Courtesy of AT&T

The higher Internet access speeds (over last mile copper) for U-Verse and IP DSLAM will be achieved by VDSL pair bonding, “small form electronics,” and VDSL vectoring.
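
As a rough illustration (not AT&T's actual engineering figures) of how these copper techniques combine, bonding aggregates the rates of multiple pairs while vectoring cancels crosstalk, lifting each pair's usable rate. The per-pair rate and vectoring gain below are hypothetical assumptions:

```python
# Illustrative sketch of how VDSL2 pair bonding and vectoring combine to
# raise copper access speeds. The per-pair rate and the vectoring gain
# factor are hypothetical assumptions, not AT&T-published figures.

def bonded_rate_mbps(per_pair_mbps: float, pairs: int, vectoring_gain: float) -> float:
    """Bonding aggregates N pairs; vectoring cancels far-end crosstalk,
    modeled here as a simple multiplicative gain on each pair's rate."""
    return per_pair_mbps * pairs * vectoring_gain

# e.g., a 30 Mb/s VDSL2 pair, bonded x2, with an assumed 1.5x vectoring gain:
print(bonded_rate_mbps(30, 2, 1.5), "Mb/s")  # 90.0 Mb/s, in the 75-100 Mb/s range cited
```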

Status Report: Currently about 9% of AT&T landline customers cannot get broadband, while 32% have the U-Verse triple play available to them, 32% have U-Verse IP DSLAM available and 27% are served via legacy broadband (i.e. non-IP DSL).

By the end of 2015, AT&T said, 99% of customers in its 22-state service area will have broadband available to them via either a wireline or an LTE option.  The breakdown is as follows: 43% of customers will have access to the U-verse triple play, 32% will have access to U-verse IP DSLAM, and the remaining 25% will need to rely on LTE, which will be available to 99% of AT&T’s customer base.

Other key points related to residential broadband are as follows:

  • Much higher speed wireline Internet, via either U-Verse triple play or U-Verse IP DSLAM (double play), will be available to 57M AT&T customer locations by 2015.  The IP DSLAM double play offering of broadband Internet and VoIP will be available to 24 million customer locations by year-end 2013.  Those customers will be offered a triple play bundle based on IP DSLAM plus satellite video service from Dish Network.
  • Customers in rural or remote locations (presumably the 25% who won’t get wireline broadband access) will be able to get LTE wireless access, as AT&T plans to extend its 4G LTE build-out to cover 300M POPs by the end of 2014.
  • High-speed IP connectivity will be available to 99% of wireline service-area customers (via U-Verse, IP-DSLAM or LTE) by 2015.

Summing up, AT&T firmly believes that:

  • Wireline IP broadband is structurally attractive in dense population areas
  • IP broadband is the most important product in the triple or quad-play bundle
  • AT&T IP broadband will meet customers’ growing speed requirements
  • Significant synergies exist between wireless and wireline assets

2.  Fiber to the Building deployment:

AT&T will light fiber to reach 1 million additional business customer locations, covering 50 percent of multi-tenant office buildings in AT&T’s wireline service area by year-end 2015.  (That’s up from about 15% nationwide today.)

3.  Strategic Business services:

IP VPN, Carrier Ethernet, (Web server) hosting and various managed business services generated $6.4B in revenues last year and are growing at 14.5% annually.  Wireline data and managed IT services for enterprise customers are growing at a rate in excess of 6%.

Cloud computing and security are seen as the next big growth opportunities as AT&T transitions to managed services for its enterprise customers. In particular, AT&T plans to partner with cloud service providers as well as providing cloud services over their own managed IP network that leverages performance, reliability and security.   During the Analyst Day webcast, AT&T said, “Virtualization and mobilization are driving the need for a ubiquitous, dense wireline footprint solutions that bundle cloud with connectivity (AKA Cloud Networking), symmetrical bandwidth, and security through active network management.”

CenturyLink/Savvis and Verizon/Terremark are recognized cloud leaders, each having very solid cloud computing offerings with managed IP VPNs for delivery of cloud services.   They will now have much more competition from AT&T in the cloud space.

Additional information on U-Verse:

As indicated in the graph below, AT&T’s broadband market share is growing in areas where U-Verse is available to residential customers.

Image Courtesy of AT&T

U-Verse has delivered 5 years of top line growth for AT&T:

  • $9.5B revenues, which are growing 38% Year over Year
  • 7.1M IP broadband subscribers, with 2.5M added in last 12 months
  • 4.3M IPTV subscribers, 760K gained in last 12 months
  • 18% U-verse video penetration; 23% U-verse broadband penetration
  • ~$170 ARPU for U-verse triple-play service bundle

Synergies between AT&T’s wireline and wireless networks:

At a Wells Fargo investment conference on November 8th, AT&T’s VP & CFO John Stephens said that AT&T evaluated commercial buildings with six tenants or more to determine whether they should get fiber connected.  Considerations included distance from AT&T’s central office (CO), cost efficiency and build-out cost.

A huge side benefit for AT&T is that once fiber to the building is installed and AT&T owns the right of way, the company will install Distributed Antenna Systems (DAS) along the fiber route to provide increased 3G/LTE wireless coverage.  The DASs would use fiber backhaul to AT&T’s CO.  Mr. Stephens hinted that DASs (deployed along the fiber-to-the-building route) might also be used for broadband wireless offload, but did not disclose any details on how that might work or be configured.

[Infonetics analyst Stéphane Téral recently said in an email, “The majority of operators are still using distributed antennas (DAS) in their mobile networks for coverage, and despite all the talk about using small cells to boost capacity in large venues, operators we interviewed believe DAS will remain a fundamental tool for malls, airports, stadiums and the like.”]

Mr. Stephens was both enthusiastic and confident during his presentation.  He said, “AT&T is investing in tried and true things we know.  We are moving away from the PSTN and toward an all-IP network platform for delivery of all telecom services, including voice.”

Closing Comment:

During its November 7th Analyst Day webcast, AT&T CEO Randall Stephenson echoed Mr. Stephens’ confidence: “These are things we’ve done before – logical extensions of proven technologies and already successful businesses. We are very confident in our ability to execute this plan.”


Summary of Telecom Council TC3, Part 1- Service Provider Innovation Forum


Telecom Council Carrier Connections (TC3), the Telecom Council’s annual summit, was held Sept 12-13, 2012 in Sunnyvale, CA. The event provides an opportunity for startups and application developers to interact with telecom carriers (telcos) and network operators. Telco representatives who manage innovation, from developer programs and lab facilities to venture investing, discussed many issues that are relevant to their vendors and partner companies.  These included:

  • What innovations are network operators looking for?
  • How does a young company work with a large operator?
  • What kind of partnerships do carriers prefer?
  • Who are the right people inside the carriers to properly receive, handle, and implement new ideas?
  • What developer and partner programs are available?

A complete description of TC3 is available here from the Telecom Council.

Telco Innovation in SF Bay Area and Startups:

Over the last few years, more than 25 Telco Innovation Labs have opened up in the SF Bay Area, including Sprint’s in Burlingame, AT&T’s in Palo Alto, Verizon’s in San Francisco and Deutsche Telekom’s in Palo Alto.  These Telco Innovation Labs serve as incubators and offer testing facilities to a wave of startups, particularly in the wireless space.  Global telcos have also established Venture Capital (VC) divisions throughout the SF Bay Area.  This makes Silicon Valley a very appropriate place to hold the TC3 summit conference. Throughout the 2 day summit, speakers from telcos and mobile operators described what they’re doing for developers and how they’ve been handling partnerships with startups.

Telecom Council Service Provider Innovation Forum (SPIF) Meeting:

TC3 conference chair Derek Kerton said that the Silicon Valley culture of cooperation has been working for carriers. They are able to share leads and help each other out without worrying about competition. With that introduction, the first Telecom Council Service Provider Innovation Forum meeting open to the public began. Eleven later-stage startups, with network-ready products and services that “push the envelope of telecom innovation,” gave rapid-fire pitches. After each presentation, SPIF session moderator Liz Kerton invited carriers sitting in the front rows to ask questions.

We highlight three of the most interesting vendor rapid-fire pitches below.  Click here for the complete TC3-2012 agenda.

1.  Actellis has offered Ethernet over Copper products but is now shifting into the residential broadband space. Millions of Americans don’t have broadband access, primarily due to a lack of infrastructure. The FCC is trying to address this problem with the Connect America Fund. “The digital divide is a challenge, but an economic opportunity for carriers,” said Chris Heinemann, Director of Marketing at Actellis.

The Actellis Broadband Accelerator (BBA) delivers high-speed broadband services to currently unserved and underserved customers who are out of reach because of their geographical location.  The patented, shoebox-sized unit is placed between the telco’s DSLAM and the ADSL subscriber.  It provides “ubiquitous coverage over the existing copper infrastructure and takes only 15 minutes to install. The BBA is in field trials worldwide, with several deployments in the U.S. and one in South America,” according to Mr. Heinemann. He encouraged the audience to watch YouTube videos describing the product and how to install it (wall or pole mounted). Please refer to:

Overview (YouTube video)

How to Install (YouTube video)

The BBA received the 2012 NGN Leadership Award for outstanding innovation.

Author’s Note: Mr. Heinemann did not disclose how the Broadband Accelerator actually works.  Yet he implied it could be used to deliver 15 to 30 Mb/sec of total bandwidth per subscriber (the sweet spot for triple play services).

2.  Joyent provides “Cloud Infrastructure for Real-time Web and Mobile Applications.” The company, which counts Intel and Telefonica as investors, “builds a data center as a solid state device,” according to Jason Hoffman, Joyent’s founder. The company’s strategy is focused on local service delivery from a global alliance of tier 1 mobile carriers that operate their own mobile clouds and/or Infrastructure as a Service (IaaS) on Joyent’s data center fabric.

Joyent’s data center technology addresses challenges in real-time, latency-sensitive mobile apps. “It’s designed as the back end of the storage array that runs Virtual Machines (VMs). The product can do throttling, scheduling, bursting and I/O acceleration in a unique way,” according to Mr. Hoffman. “It can detect when applications are running slow via real-time diagnostics and trace capabilities,” he said.

3.  Shared Spectrum Company is not technically a startup, as it was founded in 2000 and funded by DARPA.  The company develops embedded wireless software for accessing shared spectrum resources and mitigating the effects of RF interference by avoiding affected bands. Its Dynamic Spectrum Access technology senses which frequencies are in use, as well as interference in unused bands. It avoids those and switches wireless traffic to selected frequency bands that are unused and clear of interference. Shared Spectrum’s software has been embedded in products from military radio manufacturers. Recently it has been used by InterDigital, a femtocell vendor. The company is now hoping to attract a broader range of OEMs (as described below).

Based on measurements his company has performed in major markets around the country, CEO Tom Stroup claims there is no spectrum shortage (in direct conflict with AT&T CEO Randall Stephenson, who says AT&T needs a whole lot more spectrum to cope with exponential growth in mobile data traffic).

Instead, Mr. Stroup maintains that most allocated spectrum is not used.  He said that less than 20% of available and allocated spectrum is in use at any given time. That’s quite a bold statement!

The company sees a growth market in mobile cloud computing, which requires additional spectrum with QoS. Examples are TV white spaces (unused frequencies allocated to TV broadcasters) where interference from wireless microphones must be detected so as not to use those bands for wireless broadband services. The company’s “Spectrum Sensing Toolbox” is targeted at equipment used in femtocells, IEEE 802.22 Regional Wireless Area Networks, digital broadcasters in Europe, Machine to Machine (M2M) devices, Department of Defense and civilian government radio systems. Tom said that “Shared Spectrum’s Dynamic Spectrum Access technology was applicable across the world.”
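
Conceptually, the channel-selection step of Dynamic Spectrum Access can be sketched as below. This is an illustration of the general idea (sense occupancy, avoid busy or interfered bands, pick a clear one), not Shared Spectrum's proprietary algorithm; the threshold and measurements are made-up examples:

```python
# Conceptual sketch of the channel-selection step in Dynamic Spectrum Access:
# given per-band occupancy measurements (0.0 = idle, 1.0 = fully occupied),
# pick the clearest band below a "usable" threshold. Illustrative only --
# not Shared Spectrum's actual algorithm; all values are hypothetical.
from typing import Optional


def pick_clear_band(measurements: dict, occupancy_threshold: float = 0.1) -> Optional[str]:
    """Return the band with the lowest measured occupancy below the
    threshold, or None if every band is busy or interfered."""
    clear = {band: occ for band, occ in measurements.items()
             if occ < occupancy_threshold}
    return min(clear, key=clear.get) if clear else None


# A hypothetical spectrum scan of four TV white-space channels:
scan = {"TV ch 21": 0.85, "TV ch 36": 0.02, "TV ch 41": 0.07, "TV ch 44": 0.55}
print(pick_clear_band(scan))  # -> TV ch 36
```

A real sensing engine would also track incumbents such as wireless microphones (as noted above) and re-scan continuously, switching traffic when a chosen band becomes occupied.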

Author’s Note: Various proposals, including IEEE 802.11af, IEEE 802.22 and those from the White Spaces Coalition, have advocated using white spaces left by the termination of analog TV to provide wireless broadband Internet access. A device intended to use these available channels is referred to as a “white-spaces device.” The FCC will meet September 28th to discuss rules for an auction where UHF broadcasters will sell spectrum to wireless carriers that have complained about a lack of available spectrum as U.S. consumers increasingly use and want more mobile data.

Closing Comment:

“Our monthly Service Provider Innovation Forum meeting has a 12-year history of helping entrepreneurs meet telcos and vice versa,” said Liz Kerton, President of the Telecom Council of Silicon Valley. “This was the first time the public has had access to the inner circle of the Council, and it worked well enough to do it again next year.”

Indeed, this author found the meeting a very effective way for telcos and start-ups to meet one another.


This concludes part 1 of the TC3 Summary.

Part 2 will cover the panel session on Rich Communications Suite (RCS) and carrier innovation agendas, strategies and case studies.  Part 3 will be on WiFi Offload 2.0.


SPIFFY award winners at TC3 in various categories:


Does Broadband Lead to a Broad Waist?

In late August, the Milken Institute issued a provocative study, Waistlines of the World – The Effect of Information and Communications Technology on Obesity. This report looked at 27 OECD countries over the period 1988-2009 and the impact of the knowledge-based society on obesity rates in those countries. The growth of these rates is a serious concern, as overweight and obesity together are the fifth leading cause of death worldwide, according to the report’s citation of World Health Organization statistics.

The report suggests that,

“For every 10 percentage point increase in the share of ICT spending, obesity rates will significantly rise by 1 percentage point directly and 0.4 percentage point indirectly based on the impact of additional consumption of leisure ‘screen’ time.”
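The report's stated relationship is easy to turn into a quick calculation. A minimal sketch, using only the coefficients quoted above (1 percentage point direct plus 0.4 indirect per 10 percentage point increase in ICT spending share):

```python
def projected_obesity_increase(ict_share_increase_pp):
    """Apply the report's quoted relationship: per 10 pp rise in the
    share of ICT spending, obesity rises 1 pp directly and 0.4 pp
    indirectly (via additional leisure 'screen' time)."""
    direct = ict_share_increase_pp * (1.0 / 10.0)
    indirect = ict_share_increase_pp * (0.4 / 10.0)
    return direct + indirect

# A 10 pp increase in ICT spending share implies a 1.4 pp rise in obesity.
print(projected_obesity_increase(10))
```

This assumes the relationship is linear over the range in question, which is how the report phrases it, though the underlying econometric models are surely more nuanced.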

The U.S. has the highest rate of obesity, leaping from 23.3% to 33.8% from 1991 to 2008. In absolute numbers and percentage growth, China’s obesity rate is a huge concern as it more than doubled between 2002 and 2008 from 2.5 to 5.7 percent and the number of overweight people doubled from 1991 to 2006.

The report is loaded with statistics like the aforementioned, but one assumption taken as a given in the report is that, “Urbanization, in general, leads to a more sedentary lifestyle and hence weight gain.”  The implication is that urban areas will see higher obesity rates than will rural areas. While this may be true in developing countries, based on the study, it is not clear whether this is the case in already-urbanized countries, such as the United States.

Information and communication technology (ICT), as defined in the report, is somewhat expansive and includes,

“Information technology (IT), unified communications, telecommunications (telephone lines and wireless signals), broadcast media, all types of audio and video processing and transmission, and network-based control and monitoring functions.”

The report suggests the world’s transition to a knowledge-based society has led to changes in work habits (less manual labor, more dual-income families) and lifestyles (increasing urbanization, greater caloric intake, more screen time), which lead to obesity. Backing up their conclusions are complex econometric models that are well beyond the intellectual firepower of this reporter, but common sense says more screen time leads to a bigger waistline.

More common sense is embodied in their citation of the “last hour” rule, which,

“basically states that when the enjoyment associated with technological advances increases in sedentary leisure, people will devote more time to sedentary entertainment at the margin.”

In other words, most people will kick off their shoes and sit in front of a screen or screens at the end of the day instead of exercising.

Prescription for Change 

Spring Grove Fitness Center

Their prescription for change involves governmental and employer assistance to help make exercise part of everyday life, as it was before screen time. Again, more common sense, but they recommend policies and programs that reduce reliance on vehicle transportation and encourage walking, bike riding and, of course, exercise programs.

Some 27 existing government and corporate programs from around the world are cited in their list. A concept of interest to telecommunications providers is their suggestion of a tighter coupling between health-care providers and people via things such as keeping track of biometric data. Another example of a program that ties patients closer to health-care professionals is one started by an Ohio physician called “Walk with a Doc.”

Employee Fitness Program

Two programs that could be added to their list are from rural broadband operators. Spring Grove Communications, as seen in this 2010 interview, built a gym/library/community center when it rebuilt its office, benefiting its employees as well as the community at large. Arrowhead Electric has a simple program that rewards employees for starting and sticking to an exercise program. Both programs represent ways to help people proactively prevent broadband usage (and other screen time) from leading to a broad waistline.

OTT movies equal VOD with more gardens to pick from

Sometimes I think about keeping thoughts to myself, as they often conflict with what people want to hear or contradict what they believe; then I feel bad, then it happens, and then the cycle repeats.

So here we are at the refresh of another cycle. I’m about to feel bad. Over-the-top progressed faster than I thought and gave more options than I expected, and now I think OTT has done better than VOD because it offers more options, probably did a better job marketing, and will likely dominate in the future because of those options. To me, OTT movies are simply VOD out of someone else’s garden, but I have a vehicle driving me to many more gardens to pick from.

So what got me thinking about this cycle? Well, I finally got a Roku box which I thought would stop me from plugging my PC into the TV, but it didn’t. Next I read that people think VOD is losing out to OTT because it has inadequate advertising support and awkward program guides. This was in a report summary sent out by TDG recently and repeated by many others…. Market Watch, Broadband and TV News, Marketwire, WCBSTV, etc.

Yes, I’ve been told that I’m not reading between the lines of the summary and it’s really about cable operators losing a market they should have controlled. So perhaps, yes, VOD should have done better, but how can it compete against all the options?

  • Roku gives me the option to pay a monthly fee for Netflix, to rent by the movie from Amazon, or to watch for free on the Crackle channel with advertisements. Those are only three of many options.
  • Playstation, Wii, and Xbox give me many of the same options as Roku and at the same time provide entertaining games; they are owned by perhaps 3 or 4 times as many people as have VOD boxes. I have all 3.
  • I can still plug my PC into the TV and rent YouTube movies or watch for free with a commercial. I often go to YouTube for free movies because I prefer the one opening commercial to Crackle’s one every 15 or 20 minutes.
  • And there are more media players out there with similar options that I won’t even get started on.

So why would VOD dominate the movie watching world? I just don’t see it… or should I say watch it.

Now please don’t be offended if you’re a VOD provider or supporter. This is just the way I look at things, and I’m an older baby boomer. I’m not the one to watch out for; the X, Y, and Z generations will decide how this plays out. The younger generations are way more exposed to this stuff than this atypical boomer, but a lot of operators recognize this as they focus on broadband, so the future is looking brighter for options.

Social Stories of Folly and Frustration

A Scrolling Blur?

Facebook Follies

Just how valuable are those fleeting Facebook posts so many businesses use to promote their product or message? The messages quickly disappear into a scroll longer than the Mississippi, with locks and dams to discourage anyone from ever trying to find significant information, so they tend to be more like missing links of Facebook follies in marketing strategies gone astray.

So what prompted that outcry? I was looking at a corporate website for information today – info that should have been on their website. The catch! It was on their Facebook page. I struggled to locate the information as I had to click “more stories” over and over and over again. I wouldn’t have even looked except I knew it was there; I’d seen it before. I was one of the few that had seen the fresh post before it got lost in the big scroll.

I wonder what will happen next, now that Google+ Pages are available.  Google seems to provide nice features and search will likely work better (although few will use it), but I wonder… Will companies jump ship and continue to ignore their websites focusing on social scrolls that end up in nowhere land?


Excited about You Who?

I’m often frustrated by the way people use social media, and I actually disliked YouTube for a long time (because of past revenue models), but now I’m getting excited. Being approved for ad revenue was a big reason, but another reason for my excitement is the timeline link. The timeline link gets viewers to a specific point in a video for immediate viewing. Time-specific links could actually be a boon for using video more effectively, as the links integrate into text stories very well. Here is an example using videos that have multiple messages, while linking to specific topics thanks to the timeline links.

I interviewed small town rural breweries at Great Taste of the Midwest. Tyranena talked of buying and supporting local, and how important the Internet is for their business. Central Waters heats with renewable energy and supports MREA. They also believe the Internet is extremely important for their business. Dave’s Brew Farm is as rural as you can get. Dave talks about how they depend on the Internet for doing business.

Maybe I’m overly optimistic, but I love the idea of reading about something and finding the exact point in a video that’s relevant to the topic – assuming I want to visualize. It’s a way to filter out the unnecessary. Youwho, or woohoo, I find it interesting.


Hard Questions for ISPs

Cutting Cable

It isn’t often that a former boss and mentor’s son makes the news.  My first professional awareness of André Vrignaud was last September, when I read an announcement that he had moved from one Seattle internet giant to another.  So, it was a bit surprising to see him making headlines for a different reason: he is the Seattle customer whom Comcast cut off for accessing too much Internet.

The best source of his saga is his blog.  In short, he exceeded Comcast’s 250 GByte/month cap, and they cut him off from their service for a year.  He suspects the reason he exceeded the cap was that he had just signed up for the online back-up service Carbonite and Amazon’s Cloud Drive service.  Thanks to his uncompressed audio collection and thousands of high-quality photos, he has terabytes of data, which could account for why he exceeded the cap.  He isn’t the first to exceed his cap due to an online back-up service, as indicated by this post from David Martin.

Vrignaud argues that,

“The ability to access broadband internet is a right, and should be defined as an essential utility.”

Is access to broadband a right?  That is an interesting question for lawmakers and policy makers.  The U.N. thinks so, indicating that broadband is critical to “freedom of opinion and expression,” as well as an enabler of other rights (page 7).

The National Broadband Plan suggests that everyone should have access to broadband:

“Everyone in the United States today should have access to broadband services supporting a basic set of applications that include sending and receiving e-mail, downloading Web pages, photos and video, and using simple video conferencing.”

Although the NBP suggests a 4/1 Mbps download/upload speed as today’s minimum definition of broadband, it does not seem to bound the amount of “access”.  In the case of their minimum definition, assuming an aggregate access rate of 5 Mbps (download plus upload), continuous bandwidth usage would be approximately 625 kbytes per second, or 1.62 Tbytes per month, more than 6x the Comcast cap and 10x the AT&T limit.
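The arithmetic behind those figures can be checked with a quick back-of-the-envelope calculation. This sketch uses a 30-day month and a 250 GB Comcast cap from the article; the 150 GB AT&T figure is an assumption inferred from the article's "10x" claim:

```python
# Sustained usage at the NBP's minimum broadband definition:
# 5 Mbps aggregate (4 Mbps down + 1 Mbps up), running flat out all month.
AGGREGATE_MBPS = 5
SECONDS_PER_MONTH = 30 * 24 * 3600          # 2,592,000 seconds
COMCAST_CAP_GB = 250
ATT_CAP_GB = 150                            # assumed, not stated in the article

bytes_per_second = AGGREGATE_MBPS * 1_000_000 / 8   # 625,000 bytes/s
tb_per_month = bytes_per_second * SECONDS_PER_MONTH / 1e12

print(f"{bytes_per_second / 1000:.0f} kB/s sustained")
print(f"{tb_per_month:.2f} TB per month")
print(f"{tb_per_month * 1000 / COMCAST_CAP_GB:.1f}x the Comcast cap")
print(f"{tb_per_month * 1000 / ATT_CAP_GB:.1f}x the assumed AT&T cap")
```

Of course nobody saturates their connection 24/7; the point is that the caps are small relative to what the NBP's own minimum service level can physically deliver.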

Granted, Vrignaud is at the extreme in terms of today’s bandwidth consumption.  However, as the demand for bandwidth increases, more of us will be bumping into these caps.  As pointed out by network engineer and IT specialist Andrew Froehlich, caps could have a chilling effect on internet applications and cloud services.  For instance, based on what I saw happen to Vrignaud, I am going to be much more cautious about using a given cloud application.

Vrignaud asks some very good questions that ISPs should have a cogent response to, if they are going to be able to justify caps to policy makers and to their customers.

Hard Questions for Broadband ISPs around Data Caps:

  1. Is your bandwidth data cap designed to protect your television distribution business? If not, why do you insist on completely cutting off data instead of using other more consumer-friendly options such as charging for overages or slowing internet use?
  2. What ISP-offered services are excluded from the cap? Specifically, are your voice telephony and video programming services excluded? If so, why doesn’t your data cap apply to data consumed when watching television or making a phone call?
  3. How are your data caps set? What data informed that decision? Why do different ISPs have different data caps when using similar networks and distribution technology?
  4. How are your data caps evaluated on an ongoing basis? What customer input do you seek? What are the conditions under which those caps could be raised and/or eliminated?
  5. Do you practice selective enforcement of data caps? (Many ISP users report being over their supposed limits for months in a row without action.)