1. Start by knowing just what HD voice is. Despite its reputation for realistic-sounding audio, HD voice doesn't deliver from speaker to listener the full spectrum of sounds audible in a real-life conversation. It does, however, transmit more than double the range of audio frequencies that traditional phone calls do. PSTN connections, sampling voice audio 8,000 times per second, transmit frequencies from roughly 300 Hz to 3,400 Hz, a range of just over 3 kHz. HD voice technology, sampling the audio 16,000 times per second, can transmit frequencies from around 50 Hz to 7,000 Hz, a range of almost 7 kHz. In concrete terms, that means HD calls can carry sounds more than two octaves lower, as well as sounds significantly higher, than those in conventional calls.
The difference is important because humans can detect frequencies from 20 Hz to 20,000 Hz (20 kHz), and human speech typically spans a 10- to 12-kHz band within that range. Thus a PSTN call delivers sounds representing only a fraction of the range one would hear in a live conversation; the rest is deliberately filtered out. And HD voice, though still not transmitting all audible frequencies, delivers enough of the ones relevant to human speech that many listeners say it seems as if the speaker is in the same room.
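The arithmetic behind those figures follows from the Nyquist limit: a system sampling at rate fs can capture frequencies only up to fs/2. A quick sketch in Python (the band edges are the approximate figures cited above):

```python
import math

def band_stats(low_hz, high_hz, sample_rate_hz):
    """Summarize a voice band: width, Nyquist ceiling, octave span."""
    return {
        "width_khz": (high_hz - low_hz) / 1000,
        "nyquist_hz": sample_rate_hz / 2,        # theoretical upper limit
        "octaves": math.log2(high_hz / low_hz),  # perceptual span
    }

pstn = band_stats(300, 3400, 8000)  # narrowband, G.711-style call
hd = band_stats(50, 7000, 16000)    # wideband, G.722-style call

print(pstn)  # width 3.1 kHz, Nyquist ceiling 4000 Hz, ~3.5 octaves
print(hd)    # width 6.95 kHz, Nyquist ceiling 8000 Hz, ~7.1 octaves

# Extending the low edge from 300 Hz down to 50 Hz adds
# log2(300/50) ~= 2.6 octaves, consistent with the "two octaves
# lower" figure in the text.
print(math.log2(300 / 50))
```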
2. Understand the real-world limitations of conventional PSTN voice. Understanding how much easier HD makes communicating by phone requires first understanding how hard traditional voice technology makes it. That's difficult to grasp because we're so accustomed to the limitations of the PSTN that we hardly notice them. But they're very real. To start with, cutting off so many high and low sounds makes distinguishing between similar sounds, such as "s" and "f," a major challenge. Likewise, many individuals have voices that are significantly higher- or lower-pitched than others', so in conventional conference calls it's often difficult to know who is speaking. It's similarly hard to understand people with different accents, an increasingly important consideration when even small businesses are often global operations. And it's easy to miss subtle verbal cues that come through in live conversations.
Such limitations impose significant penalties on call participants. For one thing, they force listeners to make extra efforts to understand what is being said and who is saying it, by interpolating or simply guessing. That can be mentally exhausting, particularly during long conference calls. And if participants have trouble discerning, for example, whether a key number was $15 million or $50 million, or which executive favored a key proposal, it can be downright unsettling.
3. Understand what HD voice will and won't do for you. The main benefits of HD voice lie simply in eliminating such penalties. Long calls, especially conference calls with multiple speakers with different accents, become less tiring and thus more productive. Participants spend less time calling back afterward to clarify what various speakers said, or worrying that they got it wrong. Other benefits are less tangible. For example, because it lets so many more verbal nuances come through, HD voice may improve a company's rapport and image with the customers who take part in such calls. On the other hand, it won't provide the kind of clear-cut ROI, such as measurable savings on long-distance charges, that moving to VoIP transport does.
4. Get HD-capable phones, not HD-compatible ones. Jeff Rodman, Polycom co-founder and voice division CTO, argues that IP phones with HD codecs such as the ITU-standardized G.722 may not necessarily provide true HD voice quality. That's because it takes more than a codec to deliver the full range of sound that the codecs can theoretically transmit. For example, HD-capable microphones and speakers are also necessary, as well as careful acoustic design. Polycom and snom have been particularly aggressive about building IP desk phones that take full advantage of HD voice, and AudioCodes is moving in the same direction. Softphones, the client software programs that allow users to make VoIP calls through headset-equipped PCs, can also provide HD capability.
5. Make sure your IP PBX is HD-compatible. For HD phones to work, the IP PBX they connect to has to recognize the codecs they incorporate. Most IP PBXes these days are G.722-aware, but many IP phones contain other codecs in addition to the basic ITU standard. Making an IP PBX compatible with a given codec is a software upgrade, though compatibility testing is always necessary. In any case, make sure the different types of equipment you're using or planning to buy work together.
6. Make sure your network equipment can handle it. Polycom's Rodman also noted that in certain cases wideband codecs will increase the bandwidth requirements of phone calls. While the G.722 codec requires the same bandwidth per call as the narrowband G.711, some existing VoIP setups using high-compression codecs may be designed to use even less bandwidth. In such cases, moving to HD voice may not only increase bandwidth requirements, but also put a strain on queues, buffers and other elements of your LAN and WAN equipment.
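To see why, it helps to compare per-call bandwidth across codecs. The sketch below adds standard RTP/UDP/IPv4 header overhead to published codec bit rates, assuming a typical 20 ms packetization interval (a rough model that ignores link-layer framing):

```python
def call_bandwidth_kbps(codec_kbps, ptime_ms=20, header_bytes=40):
    """Codec payload rate plus RTP(12) + UDP(8) + IPv4(20) byte
    headers, sent once per packet."""
    packets_per_sec = 1000 / ptime_ms
    overhead_kbps = packets_per_sec * header_bytes * 8 / 1000
    return codec_kbps + overhead_kbps

# Standard codec payload rates in kbit/s.
for name, rate in [("G.711 (narrowband)", 64),
                   ("G.722 (wideband HD)", 64),
                   ("G.729 (high compression)", 8)]:
    print(f"{name}: {call_bandwidth_kbps(rate):.0f} kbit/s per direction")
# G.711 and G.722 both come to 80 kbit/s, so that upgrade is free;
# replacing G.729 (24 kbit/s) with G.722 more than triples per-call load.
```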
7. Analyze your telecom connections and service. HD phones aren't much help if the connection between them can't support HD calls. But that's exactly what happens whenever a call travels over the PSTN for even a portion of its journey: the entire call becomes narrowband. There's not much you can do about that when you're making or receiving calls to or from outside companies or individuals. You have much more control, though, over calls between branches of your own company. Choosing a provider that offers end-to-end SIP (Session Initiation Protocol) connections, for example, will also give you end-to-end HD voice. And it will do so even if you have different types of HD phones or softphones at different locations. When it first sets up a call, SIP checks the endpoints, in this case the IP phones, to find out what codecs they have in common. It then transmits the call using the codec that will provide the highest-quality connection.
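The codec selection SIP performs can be pictured as a simple offer/answer intersection; the ranking below is an illustrative assumption, not part of the SIP standard itself:

```python
# Higher rank = better quality; G.722 (wideband) beats narrowband codecs.
CODEC_RANK = {"G.722": 3, "G.711": 2, "G.729": 1}

def negotiate(offer, answer):
    """Pick the highest-quality codec common to both endpoints,
    mimicking SIP/SDP offer-answer selection."""
    common = set(offer) & set(answer)
    if not common:
        return None  # no shared codec: the call fails or must transcode
    return max(common, key=CODEC_RANK.get)

# Both phones support G.722, so the call goes wideband.
print(negotiate(["G.722", "G.711"], ["G.711", "G.722", "G.729"]))  # G.722
# Only narrowband codecs in common: the call falls back to G.711.
print(negotiate(["G.722", "G.711"], ["G.729", "G.711"]))           # G.711
```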
8. Consider HD-capable hosted VoIP. Like IP PBX vendors, hosted IP PBX providers are seeing the light when it comes to HD voice. Companies such as Alteva, Apptix, FreedomIQ, IP5280 and LightEdge Solutions, for example, offer services that can use HD-capable phones from Polycom. And snom has introduced a number of phones incorporating the HD technology it calls klarVOICE wideband audio, which it is targeting at so-called standards-based (i.e. SIP-based) hosted VoIP services, among other applications.
9. Use an HD voice-capable conferencing service. As noted, conference calls are one place where you want to make sure everyone can hear everyone else as clearly as possible. Most commercial conferencing services, however, focus almost exclusively on PSTN connectivity. A Citrix Online service called HiDef Conferencing, formerly Vapps, similarly allows callers to dial in from the PSTN. But because it is integrated with Skype, any or all participants can alternatively call in on the Internet phone service. Those who do will hear at least each other in Skype's full HD glory. Similarly, a number of Web conferencing services offer a VoIP option with HD capability for the audio component of their online meetings. And VoIP conferencing service ZipDX, which supports HD (which it calls wideband) voice conferencing with Polycom phones, recently announced the integration of its service with snom klarVOICE phones as well.
10. Be aware — or beware — of background noise. Remember that HD voice connections transmit a lot of sounds that PSTN calls don't. Listeners will hear every rustle of a paper, every click of a keyboard, every conversation in the hallway and every sound from the next cubicle. One soon learns that "potato chips have different ranges of crunches," noted Cailin Pitcher, senior marketing manager for Citrix Online.
11. Wait for the world to catch up to you. Even with HD equipment and SIP phone service, you may not have a lot of HD voice calls to start. The ones you do have will mainly be internal calls between branches. For outside calls, you'll be stuck with mostly non-HD connections for some time. That's because end-to-end HD calls across different service providers' networks require that the providers have interconnected IP voice networks, with no transit over the PSTN. And there won't be enough such interconnections to make external HD calls commonplace for years.
That doesn't mean you shouldn't go with HD right now if it otherwise makes sense. The phones you buy today will typically have a useful life of six to seven years, or longer. And they should (make sure of that, too) be firmware upgradeable for compatibility with new codecs as those come out. By that time, there will be a lot more interconnected provider VoIP networks, so even calls to and from outside customers or suppliers could be HD as well. In the meantime, your own internal communications will improve dramatically. That alone could make the investment worthwhile.
Thursday, June 4, 2009
Posted by muhammad abbas at 2:29 AM 0 comments
How to Benefit From HD Voice
It takes more than HD phones to make calls sound like they're coming from the same room.
HD (high-definition) voice, also known as wideband voice, is gaining a lot of attention these days. And for good reason: The audio technology, which lets IP phones send a far broader range of sounds over VoIP connections than traditional phones can over PSTN (public switched telephone network) circuits, vastly increases the clarity of voice calls. That can benefit businesses and individuals in tangible ways. But making good use of the technology involves more than simply buying new phones labeled HD. Here are some steps to take to gain the maximum benefit from HD voice.
Japan and India put wireless plans in motion
Countries ready themselves for LTE and 3G spectrum auctions
Japan and India may be far removed in terms of geography, but they do have one thing in common: a drive to expand the availability of mobile broadband.
First up is the ever-ambitious Japan. The country appears to be ready to press ahead with 4G mobile services or Long Term Evolution (LTE) with the government reportedly ready to offer licenses in the second half of 2009 amidst talk that local telcos will invest up to US$10 billion.
With its track record of early adoption, Japan joins European and US carriers whose telecom sectors have announced LTE implementations between 2010 and 2012, according to budde.com, an Australia-based telecom analyst firm. "The Japanese operators are expected to face lower costs for the new networks versus 3G, when they paid a premium on equipment for being early adopters," the analyst's report added.
It is understood that Japan's four leading wireless operators, NTT DoCoMo, KDDI, Softbank Mobile and eMobile, have submitted applications for the 4G licenses within the May deadline set by the Ministry of Internal Affairs and Communications (MIC). NTT DoCoMo has allocated a budget of JPY300-400 billion (US$3-4 billion) over five years and is targeting a launch as early as 2010, while Softbank Mobile is aiming for a 2011-2012 start-up with a budget believed to be around JPY100 billion (US$1 billion), the report continued. KDDI, the number-two mobile carrier, which initially planned to migrate from CDMA EV-DO, is now planning an LTE overlay and will operate the networks in parallel, while eMobile aims to launch its 4G service in 2011.
The telecom analyst added that the telcos will use frequencies in the 2,010 MHz to 2,025 MHz range for LTE. LTE's throughput is roughly comparable with that of fiber-optic networks, and a number of domestic carriers intend to reuse existing 3G infrastructure, on which they have spent JPY5 trillion (US$50 billion), to keep a lid on rollout costs.
A tradition of firsts
Meanwhile, the country's telecom sector has reached a landmark, with 3G CDMA subscribers exceeding 100 million in April 2009, Japan's Telecommunications Carriers Association announced. DoCoMo introduced domestic subscribers to 3G CDMA in October 2001; it was followed by KDDI's CDMA2000 1X in April 2002, Softbank Mobile's WCDMA in December 2002 and eMobile's HSDPA in March 2007.
Continuing its tradition of pioneering new services, DoCoMo announced a planned service that lets mobile subscribers transfer cash without keying in banking details. After applying online, the subscriber enters the mobile phone number of the recipient, who must also be a DoCoMo subscriber, and the amount is charged to the sender's phone bill. The Japanese telco is targeting a summer launch, with transfers limited to JPY30,000 (US$320) a month.
India: 3G is top priority
India's telecoms minister, A. Raja, told the Press Trust of India after his reappointment following the recent election that the 3G auction is at the top of his priority list. The minister indicated that the Department of Telecommunications (DoT) will put up proposals for a decision by the cabinet. A DoT official was quoted in a local newspaper as saying that a '3G auction will definitely be held this year and sooner rather than later.'
After creating much buzz around an online auction scheduled for 16 January 2009, then delayed to the end of January, the government has postponed the matter indefinitely. There have been warnings that the country and its businesses are losing out because of the delays in awarding 3G licenses (see India's 3G auction looks scuttled as government eyes a doubling of reserve price). Two state-owned telcos, Bharat Sanchar Nigam Ltd (BSNL) and Mahanagar Telephone Nigam Ltd (MTNL), are exempt from bidding but are committed to paying the highest bid in the circles they operate in.
Various reasons have been given for the delay, not least an argument that the reserve price for licenses was set too low in light of earlier experience, when successful domestic telcos secured multimillion-dollar sums from overseas telcos eyeing the fast-growing Indian market. Hence the country's Finance Ministry reportedly wants to double the original reserve price of INR20.2 billion (US$420 million).
Obama to create White House cybersecurity post
WASHINGTON (Reuters) - President Barack Obama said he will name a White House-level czar to coordinate government efforts to fight an epidemic of cybercrime, which even touched his presidential campaign.
"Cyberspace is real and so are the risks that come with it," said Obama in remarks Friday at the White House in which he discussed threats to the nation's digital infrastructure from organized crime, industrial spies and international espionage.
Obama said he would name an official to coordinate cybersecurity policies across the government and organize a response to any major cyber attack.
"I'm creating a new office here at the White House that will be led by the cybersecurity coordinator. Because of the critical importance of this work, I will personally select this official," said Obama. "This official will have my full support and regular access to me."
Obama said his administration would not dictate cybersecurity standards for private companies but would strengthen public-private partnerships and invest in research to develop better ways to secure information infrastructure.
He also stressed the importance of privacy. "Our pursuit of cybersecurity will not -- I repeat, will not -- include monitoring private sector networks or Internet traffic."
Holes in U.S. cybersecurity defenses have allowed major thefts of personal identities, money, intellectual property and corporate secrets. They also allowed the penetration of Obama's campaign.
"What isn't widely known is that during the general election hackers managed to penetrate our computer systems," said Obama. "Between August and October, hackers gained access to emails and a range of campaign files, from policy position papers to travel plans."
RECOMMENDATIONS
The cybersecurity review, headed by Melissa Hathaway, had urged the president to name a White House coordinator to oversee cybersecurity.
The report, requested by Obama in February, also urged the creation of a strong National Security Council directorate on cybersecurity with a privacy official attached to it.
Other recommendations included preparation of a national strategy to secure U.S. digital networks and stronger international partnerships to fight cybercrime and espionage.
The report said the government, in working with the private sector, should consider tax incentives and reduced liability in exchange for improved security, or increased liability for lax security.
Separately, the Pentagon is considering creating a command dedicated to cyberspace, under the umbrella of U.S. Strategic Command, but Defense Secretary Robert Gates had made no decisions yet, said Pentagon spokesman Bryan Whitman.
"We view cyberspace as a warfighting domain that we have to be able to operate within," said Whitman.
FBR Capital Markets analyst Daniel Ives said Friday's announcement could presage a surge in spending on security software purchased from companies like Symantec Corp and McAfee Inc, both of which have some government sales. "We've heard for so long the government was going to spend. Finally the ball is going to start rolling," said Ives.
John Stewart, Cisco's chief security officer, said some of the important next steps would be on the international stage.
"There's going to be a need for massive international cooperation in all this," he said. "This will show up in varying venues, (like) trade negotiations."
Phillip Dunkelberger, president of security company PGP Corp, said he was hoping for concrete steps to secure the U.S. digital network -- for example, some idea of what the next generation of security architecture would look like.
The cybersecurity report was posted at: http://www.whitehouse.gov/assets/documents/Cyberspace_Policy_Review_final.pdf
(Additional reporting by Andrew Gray in Washington and Jim Finkle in Boston; Editing by Tim Dobbyn)
The emerging business case for enhanced OTT video
Maintaining quality could enable service providers to gain new revenue streams.
Study after study by research analysts points toward a clear and probably irreversible trend: downstream web traffic comprised of streaming internet video is growing rapidly, both in absolute volume and as a percentage of overall traffic. In fact, IDC predicts that by 2013 slightly more than half of all downstream broadband traffic will be streaming video, and that the volume of this traffic will exceed all downstream traffic carried in 2009. In short, streaming video, frequently referred to as over-the-top (OTT) video, threatens to swamp broadband networks even as it undermines business models built around linear video broadcasting (cable, DBS and IPTV).
Broadband Service Providers (BSPs) feel understandably pressured—they are expected to expand their broadband coverage and increase the capacity of their broadband networks while the prices associated with basic broadband services steadily decline. Even with the help of broadband stimulus funds from US taxpayers, it is difficult to construct viable business models that call for pouring capital into infrastructure yielding declining per-user revenue.
Fortunately, there is an emerging business model that can generate the incremental revenue streams required to justify capital investment in broadband infrastructure. Understanding this business model requires a closer examination into the nature of streaming OTT video.
Coincident with the unparalleled growth in OTT video is a change in its composition. Whereas in the past the bulk of OTT video traffic consisted of short-form, YouTube-ish clips viewed on laptops, PDAs and smartphones, increasingly it is made up of long-form TV episodes and movies viewed on flat-panel televisions, sometimes in high definition. Obviously, in addition to fueling the massive growth in the first place, this shift in composition carries with it stringent quality requirements. Many viewers tolerated low-resolution video clips that frequently timed out (i.e., froze) when the video was free and the viewing device was a computer. But when they pay for the content and display it on their HDTVs, those same viewers become extremely sensitive to anything short of crystal-clear video and audio fidelity.
Unfortunately, the best-effort traffic management principles associated with the Internet in general and broadband in particular are not well suited to handle traffic with such stringent service quality requirements. Recognizing this, virtually all video content storefronts (e.g., Apple iTunes, Amazon Video on Demand, Hulu, Netflix, etc.) pay content delivery networks (CDNs) to bypass the somewhat rickety internet and deliver high-value video traffic directly to BSP peering points. CDNs, with well-managed, high-capacity backbone networks and distributed storage points, are able to provide video storefronts with explicit service level agreements (SLAs), something the internet will never be capable of doing.
But video content storefronts still face a challenge: the CDNs do not deliver traffic to the end user, they deliver it to the BSP. The traffic then has to traverse another network before reaching the ultimate end user. To CDNs this did not initially appear to be a problem, as they believed they were "close" to the end user, measured in terms of router hops (in fact CDNs were born primarily as a way to minimize router hops between web browsers and web servers). And, indeed, the point at which CDNs hand off traffic to BSPs is often no more than one or two router hops away from the end user. But while the end user may appear "close" at layer three, he or she is still somewhat distant at layer two. The entire broadband aggregation network, as well as the broadband access facility itself, is a large, multi-hop, layer two network that is often heavily congested. In most cases, that entire network is governed by first-come, first-served, best-effort traffic management mechanisms (or the lack thereof). Obviously this is not an environment well suited for streaming OTT video.
Arguably, this problem can only be rectified by the BSP; they own and operate the only portion of the network between video servers and digital video players that does not carry an explicit SLA. It stands to reason, therefore, that if BSPs could guarantee service quality for specific OTT video streams and provide SLAs on those streams, various parties might be willing to pay them for the service. Here there are two potential revenue sources for the BSP: the end user, paying for “premium internet TV,” or the CDN/content storefront paying for an explicit SLA. (See Figure 1.)
Figure 1.
Whether the end user pays for a superior OTT viewing experience, one that is nearly indistinguishable from a locally attached DVD player, or the CDN/content storefront pays for a guaranteed SLA, the technical requirements placed on the BSP are identical:
1. Identify OTT video “sessions.”
2. Determine policy for each session (e.g., has this subscriber requested premium OTT service?).
3. Determine availability of network resources (i.e., don’t make service quality guarantees that can’t be fulfilled).
4. Allocate specific amount of bandwidth (above and beyond basic high-speed internet access) for the duration of the video.
5. Monitor video quality during session (optional but highly desirable).
6. Release dedicated bandwidth and generate settlement records at session termination.
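Under the assumptions above, the six functions can be sketched as a small admission-control loop. All class and method names here are illustrative, not any vendor's actual API:

```python
class OttSessionManager:
    """Illustrative sketch of steps 1-6: identify a session, check
    policy, admit it against available capacity, reserve bandwidth,
    and release it (with a settlement record) at teardown."""

    def __init__(self, link_capacity_mbps):
        self.capacity = link_capacity_mbps
        self.reserved = {}          # session_id -> reserved mbps
        self.premium_subs = set()   # subscribers paying for premium OTT

    def available(self):
        return self.capacity - sum(self.reserved.values())

    def start_session(self, session_id, subscriber, mbps):
        if subscriber not in self.premium_subs:
            return False                  # step 2: policy says best-effort
        if mbps > self.available():
            return False                  # step 3: don't over-promise
        self.reserved[session_id] = mbps  # step 4: dedicate bandwidth
        return True

    def end_session(self, session_id):
        mbps = self.reserved.pop(session_id, 0)  # release bandwidth
        return {"session": session_id, "mbps": mbps}  # step 6: settlement

mgr = OttSessionManager(link_capacity_mbps=50)
mgr.premium_subs.add("alice")
print(mgr.start_session("s1", "alice", 8))  # True: admitted and reserved
print(mgr.start_session("s2", "bob", 8))    # False: not a premium subscriber
print(mgr.end_session("s1"))                # settlement record for billing
```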
How these functions are implemented will vary by service provider. In many cases they can be implemented within a single device; in other cases multiple devices may be required. But in any case, these functions are required in order to recognize and handle streaming OTT video in a manner that delivers the viewing experience subscribers are looking for — and will pay for.
Most questions service providers have on this topic deal not so much with technology but rather with consumer and regulatory acceptance. Fortunately, there are positive indications on these fronts as well.
From the standpoint of consumer acceptance, surveys conducted by IDC have consistently shown a substantial subset of the population willing to pay incremental fees for an OTT video viewing experience that approximates that of a locally attached DVD player. The size of the subset obviously depends on price and other services that are bundled with the service but, contrary to conventional wisdom held by many BSP marketing departments (“customers won’t pay for anything”), there is a clear willingness to pay for better video.
One factor to keep in mind is how premium OTT video services are marketed. If the service is billed merely as a better way for broadband to handle streaming video, take rates might peak in the 20-25% range. However, if the service is billed as a TV video on demand (VOD) service and bundled with a digital video player (e.g., Roku), take rates may be much higher. The reason is that in the former case, the subscriber may believe that broadband should already deliver such capabilities. In the latter case, the subscriber views the service as a TV/movie service, not a broadband service.
A final point is the regulatory treatment of these services. Many analysts covering the FCC believe, and recent FCC reports seem to concur, that these services would be acceptable from a regulatory standpoint. The reason is that these services, by and large, have nothing to do with the internet. Rather, they take traffic from CDNs and deliver it over broadband using discrete “broadband video channels” with specific bandwidth allocations and separated from the high-speed internet (HSI) access channel using protocol mechanisms. The HSI channel is unimpaired by streaming video. In this regard it is similar to the handling of cable TV and IPTV.
With more and more consumers “cutting the cable cord” and obtaining their video on-line, the business case for premium OTT video services is especially compelling; the capital cost to offer the service (even with bundled digital video players) is modest compared to IPTV, and consumer demand is high and growing. Regulators appear, rightly, focused on stamping out bad behavior rather than obstructing services consumers are clamoring for. In short, all lights appear green.
Verizon offers pay-as-you-go hosting service
NEW YORK (Reuters) - Verizon Communications Inc on Thursday unveiled a pay-as-you go hosting service for corporate customers looking to save money by buying only as much computing capacity as they need at any given time.
Verizon is the latest U.S. telecom to jump on the bandwagon of so-called "cloud computing" services after AT&T Inc said last month that it would offer Web-based storage services for enterprises.
Gartner Research estimated earlier this year that global revenues from cloud computing and storage, or the use of the Web to access those services at remote data centers, will climb 31 percent to $3.4 billion this year.
The fledgling field is led by Internet pioneers Amazon.com Inc, Google Inc and Salesforce.com Inc, but more established technology companies like Verizon, AT&T, IBM and Microsoft Corp are introducing new services this year in a bid to catch up before sales start to boom.
Verizon said that its new "computing as a service" offering can provide customers with remote hosting capacity for applications such as retail websites within an hour after they put in an order.
A time lag of anywhere from four to eight weeks would have been more typical using older technologies.
Verizon, which had promised in February to launch on-demand hosting this summer, said some customers would likely end up switching from traditional hosting to on-demand services because the new products cost less.
"There will be some of that," Michael Marcellin, vice president of global managed solutions told Reuters on Tuesday when asked if the cloud service might cannibalizing the traditional business.
"As people look at this delivery model this is very compelling," he said.
(Reporting by Sinead Carew in New York and Jim Finkle in Boston; Editing by Edwin Chan and Carol Bishopric)
Eight Asian telcos to build subsea fiber optic network
Higher bandwidth capacity and redundancy will benefit region known for its subsea earthquakes.
A new submarine fiber optic network, known as the Asia-Pacific Gateway (APG), is in the offing, with a memorandum of understanding signed recently by eight of Asia's biggest telcos. Targeted to be ready for service in 2011, the 8,000-kilometer network, with a minimum design capacity of four terabits per second, will link the region's growing economies.
As demand continues to increase for high-bandwidth communication, including enterprise network services and the Internet, the proposed fiber optic network will offer alternative communication routes and nodes. In the event of accidents or subsea earthquakes, it will minimize the impact of a breakdown in services provided by existing regional fiber optic networks.
Using the latest Dense Wavelength Division Multiplexing (DWDM) technologies, the proposed APG network is designed to provide a high-degree of interconnectivity with existing and planned high bandwidth communication systems.
The APG network will connect Japan, Korea, mainland China, Taiwan, the Philippines, Hong Kong, Vietnam, Thailand, Malaysia and Singapore. The signatories to the agreement for the development of the gateway are China Telecom, China Unicom, Chunghwa Telecom, NTT Communications (Japan), Vietnam Post and Telecommunications (VNPT), Korea Telecom (KT), Philippine Long Distance Telephone (PLDT) and Telekom Malaysia. The telcos will jointly finance and own the APG network.
Three APG telcos also in AAG network
Three of the telcos in the APG consortium, PLDT, VNPT and Telekom Malaysia, are also participating in the Asia-America Gateway (AAG) project, which spans 20,000 kilometers to connect Southeast Asia with the US.
Aside from Bayan Telecommunications and Eastern Telecommunications of the Philippines, the other members of the AAG consortium are the Brunei government, AT&T (US), Bharti Airtel (India), Communications Global Network (UK), Pacific Communication (Cambodia), CAT Telecom (Thailand), PT Indosat (Indonesia), PT Telekomunikasi (Indonesia), StarHub (Singapore), Telecom New Zealand and Telstra (Australia), while the participants from Vietnam are Saigon Postel and Viettel.
Originally scheduled for completion in the last quarter of 2008, the AAG network has been delayed by construction setbacks, pushing the deadline to August this year. The US$550 million network will link Southeast Asian countries, namely Malaysia, Singapore, Thailand, Brunei, Vietnam, Hong Kong and the Philippines, with US territories in the Pacific, such as Guam and Hawaii, and with the US west coast.
In another development, NTT Communications announced its acquisition of Pacific Crossing without providing financial details. Pacific Crossing operates PC-1, a 21,000-kilometer fiber optic network spanning Japan and the US with two landing points in each country and a capacity of 3.2 terabits per second.
Tekelec turns SMS deployments on their head
SMS router leverages FDA to drive lower cost per SMS message.
In Q1 09, Verizon Wireless reported that its customers sent or received an average of 1.4 billion text messages each day, totaling more than 127 billion text messages for the quarter. What's more, Verizon Wireless customers sent nearly 2.1 billion picture and video messages and completed 48.6 million music and video downloads during the quarter.
The traffic traversing Verizon Wireless’ network, along with that of other large mobile players such as AT&T, Sprint and T-Mobile, has led Frost & Sullivan to forecast that global SMS message volume will grow at a CAGR of 15.6 percent (2007-2011). SMS revenues, however, will grow at a CAGR of only 5.9 percent over the same period.
Figure 1. Global SMS traffic.
Source: Tekelec
Figure 2. Global SMS Revenues.
Source: Tekelec
Still, the increased momentum around SMS is driving wireless operators to rethink how they handle SMS in their respective networks.
Enter Tekelec and its SMS router. The vendor claims it can help operators handle SMS traffic while keeping costs down by using a First Delivery Attempt (FDA) method for Mobile-Originated (MO) and Application-Originated (AO) messages.
By incorporating flexible routing rules, the router enables traffic to be routed on any SMS parameter: sender, recipient, SMSC address, data coding scheme, mobile switching center and message content.
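As an illustration of what parameter-based routing looks like, a rule engine of this kind can be sketched in a few lines of Python. The field names, rule format and route labels below are hypothetical, not Tekelec's actual syntax:

```python
# Hypothetical sketch of parameter-based SMS routing rules.
# Field names and rule structure are illustrative, not Tekelec's product syntax.

def route_message(msg, rules, default_route="legacy_smsc"):
    """Return the first route whose conditions all match the message."""
    for conditions, route in rules:
        if all(msg.get(field) == value for field, value in conditions.items()):
            return route
    return default_route

# Route voting traffic and premium-SLA traffic differently from ordinary SMS.
rules = [
    ({"recipient": "5555"}, "voting_app"),        # hypothetical televoting shortcode
    ({"sender_class": "premium"}, "priority_fda"),
]

msg = {"sender": "+14155550100", "recipient": "5555", "sender_class": "retail"}
print(route_message(msg, rules))  # voting_app
```

Anything not matched by a rule falls through to the default route, which is how a router can sit in front of an existing SMSC without disturbing ordinary traffic.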
Alan Pascoe, senior product marketing manager for Tekelec, argues that the problem with legacy SMSCs lies in their overall approach to distributing and storing SMS messages. And Pascoe should know: before joining Tekelec, he was an SMS service designer for wireless operator O2.
“Routers have been around for a number of years,” he said. “A traditional SMSC takes a store-and-forward approach, but a router turns that around completely: it forwards first and then stores, because the chances of a text message getting through on the first attempt are a lot higher these days than they were five to 10 years ago.”
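Pascoe's forward-then-store point can be sketched as pseudologic: attempt direct delivery first, and fall back to the store-and-forward SMSC only when delivery fails. This is a minimal illustration under assumed interfaces (the `deliver` and `store` callables are hypothetical), not Tekelec's implementation:

```python
# Minimal sketch of First Delivery Attempt (FDA) handling:
# try direct delivery; only failed messages reach the store-and-forward SMSC.

def handle_message(msg, deliver, store):
    """deliver() returns True on success; store() queues the message for retry."""
    if deliver(msg):
        return "delivered_first_attempt"   # never touches the legacy SMSC
    store(msg)                             # the minority needing storage
    return "stored_for_retry"

stored = []
# Simulate a recipient that is reachable:
print(handle_message({"to": "+60123456789"}, deliver=lambda m: True,
                     store=stored.append))   # delivered_first_attempt
# And one that is not (handset off or out of coverage):
print(handle_message({"to": "+6591234567"}, deliver=lambda m: False,
                     store=stored.append))   # stored_for_retry
print(len(stored))  # 1
```

The higher the first-attempt success rate, the smaller the fraction of traffic that ever needs SMSC storage, which is the premise behind the capacity figures that follow.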
Cap and grow
Given the large investments wireless operators have already made in their existing SMSCs, the SMS router and the overall Tekelec SMS network concept allow service providers to cap and grow those investments.
In so doing, the SMS router can help service providers extend their existing investments while transitioning to the next-gen router platform. What’s more, the SMS router supports SS7 and SIGTRAN signaling as well as the CAMEL and DIAMETER billing protocols.
Figure 3. Tekelec's SMS Router.
Source: Tekelec
FDA is the key enabler helping wireless operators make the legacy-to-next-gen SMS network transition.
By using FDA, the SMS router can free up legacy SMSC capacity, since most messages are delivered on the first attempt and never need to be stored. The existing SMSC is required to handle only the roughly 15 percent of messages that must be stored for later delivery.
To give wireless operators a better picture of the benefits of switching to a next-gen SMS network, Tekelec developed a business case in collaboration with Frost & Sullivan for profitably deploying the SMS router and adopting its overall SMS network concept. The business case includes four distinct elements: SMS Offload – First Delivery Attempt; SMS Offload – Voting Traffic; Blocking SMS MO Spoofing; and the combination of all three in a single SMS network.
In an SMS offload application, Tekelec claims a wireless operator could reduce the current SMSC load by 80-90 percent, depending on the FDA success ratio.
Tekelec claims that this cap-and-grow strategy allows a wireless operator to realize 60 percent savings over a traditional SMSC: opting for an SMS network with an SMS router would cost only $0.9 million, versus $2.2 million for a new SMSC.
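The quoted 60 percent figure is consistent with the dollar amounts above:

```python
# Cost comparison from the Tekelec/Frost & Sullivan business case figures.
new_smsc_cost = 2.2e6     # buying a new traditional SMSC, in US dollars
sms_router_cost = 0.9e6   # cap-and-grow deployment with an SMS router

savings = 1 - sms_router_cost / new_smsc_cost
print(f"{savings:.0%}")   # 59%, i.e. roughly the 60 percent Tekelec claims
```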
“The fourth benefit of the SMS network concept is where a wireless operator can combine the offload and the voting offload applications,” Pascoe said. “You can pretty much use the same equipment to deliver both of those savings from the voting and from the first delivery point of view.”
Breaking SMS bottlenecks
In bridging the legacy and next-gen SMS worlds, the SMS router can help service providers overcome traditional SMS capacity bottlenecks by leveraging load balancing and throughput control techniques. Wireless operators can also support high-end SMS customers by prioritizing traffic based on premium SLAs.
One application where traffic flow prioritization will come in handy is televoting.
Televoting is not just vendor speak, however. Verisign’s Mobile Media Division, which provides Internet infrastructure services, predicts that mobile voting and other interactions with the government via SMS will become commonplace.
However, one common problem wireless operators face with voting applications is that existing SMSCs can neither handle voting traffic spikes nor balance loads throughout the SMS network.
“Using a router approach to conduct voting offload has some distinct advantages over using an SMSC,” said Pascoe. “Typically, this sort of traffic is delivered in a huge spike, but with an SMSC you need to buffer all of those messages, which can overload it, meaning messages are dropped because the buffers fill up quickly.”
In a televoting application, the SMS router can manage peak traffic and prevent legacy SMSC overload by delivering voting traffic directly to the application, rather than storing and forwarding it as if it were bound for a subscriber’s handset.
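The spike-handling idea (drain buffered vote messages to the application at a capped rate instead of letting the burst swamp an SMSC) can be sketched as a simple queue-and-drain loop. The class, interval size and cap value here are illustrative assumptions, not part of Tekelec's product:

```python
# Illustrative throughput control for a televoting spike: vote messages bypass
# the SMSC and flow to the application at a capped rate per interval.

from collections import deque

class ThrottledVoteRouter:
    def __init__(self, max_per_tick):
        self.max_per_tick = max_per_tick   # assumed rated throughput of the app
        self.backlog = deque()

    def submit(self, msg):
        self.backlog.append(msg)           # absorb the spike in the router

    def tick(self, application):
        """Drain up to max_per_tick messages per interval to the application."""
        sent = 0
        while self.backlog and sent < self.max_per_tick:
            application(self.backlog.popleft())
            sent += 1
        return sent

votes = []
router = ThrottledVoteRouter(max_per_tick=1000)
for i in range(2500):                      # a burst of 2,500 votes in one spike
    router.submit({"vote": i})
print(router.tick(votes.append))           # 1000
print(router.tick(votes.append))           # 1000
print(router.tick(votes.append))           # 500
```

Because the router absorbs the burst and meters it out, no message is dropped for lack of buffer space, which is the failure mode Pascoe describes for a plain SMSC.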
Pascoe adds that by untangling these bottlenecks, an operator can cut opex while keeping customers happy.
“Most of the savings you’ll actually get is in the brand awareness and the reduction of customer care complaints,” Pascoe said. “When you look into an operator’s business model in terms of customer care complaints, they usually attribute a dollar value to each particular complaint.”
Posted by muhammad abbas at 2:16 AM 0 comments
Nokeena pushes boundaries of online video delivery
Sets focus on enhancing user Quality of Experience.
In this latest Telecom Engine Audiocast, Rajan Raghavan, CEO and co-founder of Nokeena Networks talks to us about the company’s debut and how it can bring a value-added service experience to the burgeoning online video business.
Raghavan addresses the following questions in this Audiocast:
• ABI Research believes the number of viewers who access video via the Web will quadruple in the next few years, reaching at least 1 billion by 2013. To start, what is your view of the online Over the Top (OTT) video market?
• How does Nokeena approach the OTT video opportunity?
• Why is the Nokeena product beneficial for content owners, media publishers and distributors?
• How is Nokeena different from other streaming software currently available on the market?
• The past few years have not been the easiest economic times for new startups, but what makes 2009 the right time for Nokeena to enter into this market?
Posted by muhammad abbas at 2:13 AM 0 comments
Eight Asian telcos to build subsea fiber optic network
Higher bandwidth capacity and redundancy will benefit region known for its subsea earthquakes.
A new submarine fiber optic network, known as the Asia-Pacific Gateway (APG), is in the offing, with a memorandum of understanding signed recently by eight of Asia’s biggest telcos. Targeted to be ready for service in 2011, the 8,000-kilometer network, with a minimum design capacity of four terabits per second, will link the region’s growing economies.
As demand continues to increase for high-bandwidth communication, including enterprise network services and the Internet, the proposed fiber optic network will offer alternative communication routes and nodes. In the event of accidents or subsea earthquakes, it will minimize the impact of a breakdown in services provided by existing regional fiber optic networks.
Using the latest Dense Wavelength Division Multiplexing (DWDM) technologies, the proposed APG network is designed to provide a high degree of interconnectivity with existing and planned high-bandwidth communication systems.
The APG network will connect Japan, Korea, mainland China, Taiwan, the Philippines, Hong Kong, Vietnam, Thailand, Malaysia and Singapore. The signatories to the agreement for the development of the gateway are China Telecom, China Unicom, Chunghwa Telecom, NTT Communications (Japan), Vietnam Post and Telecommunications (VNPT), Korea Telecom (KT), Philippine Long Distance Telephone (PLDT) and Telekom Malaysia. The telcos will jointly finance and own the APG network.
Three APG telcos also in AAG network
Three of the telcos in the APG consortium (PLDT, VNPT and Telekom Malaysia) are also participating in the Asia-America Gateway (AAG) project, which spans 20,000 kilometers to connect Southeast Asia with the US.
Aside from Bayan Telecommunications and Eastern Telecommunications of the Philippines, the other members of the AAG consortium are the government of Brunei, AT&T (US), Bharti Airtel (India), Communications Global Network (UK), Pacific Communication (Cambodia), CAT Telecom (Thailand), PT Indosat (Indonesia), PT Telekomunikasi (Indonesia), StarHub (Singapore), Telecom New Zealand and Telstra (Australia), along with Saigon Postel and Viettel from Vietnam.
Originally scheduled for completion in the last quarter of 2008, the AAG project has seen construction delays push its deadline to August this year. The US$550 million AAG network will link the Southeast Asian countries of Malaysia, Singapore, Thailand, Brunei, Vietnam, Hong Kong and the Philippines with US territories in the Pacific, such as Guam and Hawaii, and with the US west coast.
In another development, NTT Communications announced its acquisition of Pacific Crossing without providing financial details. Pacific Crossing operates PC-1, a 21,000-kilometer fiber optic network spanning Japan and the US with two landing points in each country and a capacity of 3.2 terabits per second.
Posted by muhammad abbas at 2:08 AM 0 comments