Tuesday, December 15, 2009

Google Reportedly To Offer Its Own Phone

Google is reportedly considering offering its own mobile phone based on the Android operating system as early as next year.

While the company is not confirming its plans, it has been widely reported that Google has already given test phones to employees. (According to a Google blog post in response to the reports, its employees often experiment with new technologies.)

If Google actually does release its own phone, it would pit the company against carriers and handset makers that have released Android-based smartphones. Such a move could also put pressure on Microsoft to go a similar route, perhaps offering a smartphone based on the Zune that incorporates some or all of the components of the Windows Mobile platform.
At any rate, Google's move is expected to challenge market leader Apple, whose iPhone and iTunes App Store are the dominant mobile platform.

Google employees testing the phone told the New York Times that the Google phone will be unlocked, allowing users to choose their service provider. It will also be GSM-based, the same technology "used by AT&T and T-Mobile in the United States and by most other carriers around the world," according to the NYT article.

Analysts believe Google's interest in offering its own phone is to cash in on mobile advertising. The phone would possibly be subsidized by advertising rather than by carriers, noted Forrester Research analyst Charles Golvin in a blog post about the implications of a Google phone.

"I think they are trying to get the market to a place where mobile advertising is as big as the market can be, and they are trying to move the market faster than it would otherwise move," said Yankee Group analyst Joshua Holbrook in an interview.

While a phone from Google does pose the risk of annoying the handset makers that are supporting Android, Holbrook said, Google will likely offer a limited-device (if not single-device) configuration, leaving handset makers room to offer more choices.

Telecom analyst Tom Nolle of CIMI Corp. said it remains to be seen whether Google will follow through. But if it does, "what's likely happening is that Google is viewing this as a move against the telco walled-garden approach by moving against the bundled handset programs that put the operators more in the driver's seat with innovation in cellular," he said in an e-mail interview.

Meanwhile, the buzz over a potential Google phone also highlights the question of whether Microsoft will change its position on offering its own phone, something the company has insisted it has no plans to do.

Nolle said it would be too risky for Microsoft to jump into the phone market, even if Google does.

"I think it doubles down on Microsoft's risk," he said. "They have a lot more to lose by getting directly into the cellular business than Google does. Microsoft expected to make money on Windows Mobile and they'd certainly kill off all their handset partners if they decided to sell phones of their own. Imagine what would happen to Windows if Microsoft sold computers. Google Android is open source and free, so Google might very well be able to sustain partner interest even with its own handsets."

Holbrook agreed. "They've [Microsoft] said that repeatedly they won't and just recently reiterated it," he said. "I'll take them at their word, and frankly I don't think they will because at the heart, they are a software company. That said, if there is anything that can get them to change their mind and do that, it would be Google."




Sun Dispute Leads Apache Software Foundation To Vote 'No' on Java EE 6 Spec

The Java EE 6 spec, released today, was approved by a vote of the JCP Executive Committee (see related story here), but the vote wasn't unanimous. The Apache Software Foundation (ASF) cast the only nay vote (there were two abstentions).

Geir Magnusson, Jr., director of the ASF and that group's representative to the JCP, explained the Foundation's position to this site via e-mail: "The vote hinged on the ASF's policy that any member of the JCP who does not abide by their obligations under the rules and agreements of the JCP should not be able to participate in, let alone lead, a JSR… The ASF contends that Sun Microsystems is in breach of their contractual obligations under the JSPA [the contract under which an entity becomes a member of the JCP], as well as acting in a way contrary to the community expectations of a spec lead, as well as public promises made to the Java community at large by its corporate officers. Thus the 'no' vote for the Sun-led JSRs at the last ballot," he wrote.
The ASF has claimed that Sun is refusing to provide the Foundation with a license for a Technology Compatibility Kit (TCK), the official test suite that can only be provided by the spec lead, and which the ASF needs to complete work on its Harmony implementation of Java Standard Edition (Java SE) under the Apache License. Magnusson asserts that Sun wants to protect the licensing revenue it receives for its own implementation of the Java SE spec, and that's why it won't issue the license.

If that truly is Sun's aim, it would be contrary to the objective of the JCP, which is charged with providing specifications that can have multiple independent implementations.

"To be clear," Magnusson added, "there have been proposals from Sun, but each time the license for the TCK came with restrictions that would prevent the ASF from distributing the final tested code… For an open source organization that exists to distribute software under open source licenses, this is clearly unacceptable. We feel that the JSPA [the contract under which an entity becomes a member of the JCP] is very clear on what terms a spec lead can, and more importantly cannot place in a TCK license, and thus we feel Sun is in breach of the [contract]… and thus shouldn't be able to lead or participate in JSRs."

Magnusson, who is the creator of the Harmony project, pointed out that the ASF has implemented JSRs before. The Foundation implemented the Java EE spec in the Apache Geronimo app server, the servlet and JSP spec in the Apache Tomcat servlet container, the JMS spec in the Apache ActiveMQ message broker, and the JSF specification in Apache MyFaces, among others.

Rod Johnson, founder of VMware subsidiary SpringSource and member of the JCP Executive Committee, said he is disappointed that Sun has not been able to resolve this dispute with the ASF. "This issue raises legitimate concerns about the credibility of the JCP as a whole," he said. "The JCP either is open or it's not. I have a lot of sympathy for the Foundation on this issue."

Sun is unable to comment on the ASF's charges, the company said, because it can't make forward-looking statements in the midst of being acquired by Oracle Corp. for $7.4 billion. That deal, still in the works, has been stalled by the European Union over concerns about Oracle's ownership of the open source MySQL database, which Sun acquired last year.




Hotfixes Issued for Office 2003 Bug

Microsoft issued hotfixes on Friday to address problems some users were having with Microsoft Office 2003 files.

The files were protected by a security feature called Active Directory Rights Management Services, or RMS. Organizations typically use RMS to restrict access to documents created with Office. However, some users found last week that these RMS-protected files could not be opened or saved.

Instead, they got a message reading, "Unexpected error occurred. Please try again later or contact your system administrator," according to a Microsoft blog posted on Friday. On Saturday, Microsoft issued three hotfixes to address the problem.
Technologizer, which initially spotted the glitch announcement, quoted a Microsoft spokesperson as saying that the problem stemmed from the expiration of an "Information Rights Management (IRM) certificate."

The RMS problem was associated only with Excel, Outlook, PowerPoint and Word in Office 2003, according to the Microsoft KnowledgeBase articles. Office 2007 applications were not affected.

The hotfixes -- Office Client (KB978551), Word Viewer (KB978558) and Excel Viewer (KB978557) -- are available from Microsoft Support without charge. However, IT pros will have to call Microsoft Support and reference those three KnowledgeBase articles to get the hotfixes, according to a Microsoft blog. The blog noted that Office 2003 has to be updated with Service Pack 3 before installing these hotfixes.

In addition to controlling who can open or modify the files, Microsoft's RMS technology can be used to restrict actions on files, such as forwarding and printing. Document security can be customized by modifying a rights expression language based on XML. Developers can use a Microsoft-provided software development kit to make the changes, according to the kit's documentation.
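As a rough illustration only (these element names are hypothetical and are not taken from Microsoft's actual rights expression language), an XML-based usage policy of this kind might deny printing and forwarding to a particular user like so:

```xml
<!-- Hypothetical rights-policy fragment; illustrative only -->
<usagePolicy document="q4-forecast.xlsx">
  <principal email="analyst@example.com">
    <grant right="view"/>
    <grant right="edit"/>
    <deny  right="print"/>
    <deny  right="forward"/>
    <expires date="2010-06-30"/>
  </principal>
</usagePolicy>
```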




Sunday, December 13, 2009

Microsoft To Acquire Sentillion

Looking to extend its presence in the growing field of health care information technology, Microsoft Thursday said it will acquire Sentillion Inc. for an undisclosed sum.

In announcing the deal, Microsoft said it plans to continue offering Sentillion's single sign-on (SSO) software designed for clinical systems deployed in more than 1,000 hospitals, according to the closely held vendor.

But the deal will also pave the way for Microsoft to bridge Sentillion's technology with Microsoft's Amalga Unified Intelligence System (UIS), a data aggregation platform launched in April. Amalga UIS is designed to integrate patient data with a variety of systems and let authorized clinicians and patients alike access their medical records, Microsoft said in announcing the platform at the time.
Among the 115 hospitals using it are Johns Hopkins, New York Presbyterian Hospital, Novant Health and Seattle's Hospital, according to Microsoft.

Back in June, Microsoft licensed Sentillion's SSO, context management and strong authentication technology for use with Amalga UIS. Sentillion has its own offering, called Vergence, which combines the SSO, context management and strong authentication into a clinical workstation, and ProVision, which supports user provisioning.

In announcing the planned acquisition, Microsoft said by combining the capabilities of Amalga UIS and Sentillion's technology, it will be able to offer real-time information and simplified workflow, thereby reducing the administrative tasks in administering care. Amalga UIS is also intended to integrate disparate health care management, administrative and financial systems, Microsoft said.

Sentillion will continue to operate out of its corporate headquarters in Andover, Mass. The deal is scheduled to close in early 2010.




December Microsoft Patch Contains Critical IE Fix

As expected, the last Microsoft Security Update of 2009 will include six security bulletins -- three dubbed "critical" and three labeled "important."

By normal workload standards, things aren't that heavy in this month's slate, which contains fixes for 12 vulnerabilities, according to Wolfgang Kandek, chief technology officer at Qualys. Still, when you add it all up over the year, patch management professionals have been kept busy.

"Microsoft closes 2009 with its last regular patch release, adding six bulletins and bringing the year's total to 74," Kandek said.
As usual, remote code execution considerations dominate this month's rollout, plus there's one denial-of-service bug to patch.

Critical Patches
The first critical item on the slate is said to fix two bugs affecting all versions of Windows. Redmond said that, left unpatched, the bugs could allow remote code execution if messages "received by the Internet Authentication Service server are copied incorrectly into memory when handling PEAP authentication attempts."

PEAP, or Protected Extensible Authentication Protocol, is a protocol used for 802.1X authentication on wireless networks. The vulnerability could derive from third-party software installed on client systems, according to Jason Miller, data and security team leader at Shavlik Technologies.

"Interestingly enough, client systems do not have the vulnerable files, as they are not part of the base operating system, but Microsoft is providing a patch for Windows client systems," Miller explained. "However, vulnerable third-party products can be installed on client systems."

The second critical item fixes a privately reported vulnerability in Microsoft Project. According to Redmond, the bug could "allow remote code execution if a user opens a specially crafted Project file." This patch touches Microsoft Project 2000, 2002 and 2003.

Critical item No. 3 is where IT pros should be most focused, security experts say. It resolves four privately reported vulnerabilities and one publicly disclosed vulnerability in Internet Explorer. Once again, the focus of this IE fix is Microsoft's Active Template Library (ATL).

"IE cumulative updates are generally run-of-the-mill," said Sheldon Malm, senior director of security strategy at Rapid7. "But this month's is the big one with five separate vulnerabilities including ATL. Who knows how long Microsoft has been wrestling with this one…particularly since the CVE reference for ATL clearly precedes the other IE vulnerabilities."

Along with the patch for IE, IT pros will face other high-priority items to patch this month, according to Paul Zimski, vice president of market strategy at Lumension.

"Although there are several critical patches that need to be addressed this month, the big deal for Microsoft patches centers on the ubiquitous [IE patch] affecting all versions of Internet Explorer and carrying Microsoft's highest exploitability rating," Zimski said. "This, combined with subsequent updates issued in Apple's Java for OS X, Adobe's Flash Player and AIR make this month particularly important to shore-up patches and protect against Web-borne malware threats."

Important Patches
The No. 1 important patch on the slate is also the lone security bulletin with denial-of-service implications. The fix resolves a vulnerability that could lock users or administrators out of a Windows system if a remote attacker sends "a specially crafted ISAKMP message to the Local Security Authority Subsystem Service (LSASS)" on an affected system.

Under such an attack, the LSASS service can use up all Windows system resources, forcing users to reboot their systems to gain back those resources and make the system responsive once again. This item affects Windows 2000, XP and Windows Server 2003.

The second important item in the December rollout affects Windows Server 2003 and 2008. It addresses a vulnerability in which specially crafted HTTP requests can be sent to a Web server running Active Directory Federation Services.

The third important item addresses a privately reported vulnerability in Windows 2000, XP and Windows Server 2003 associated with Microsoft WordPad and Microsoft Office text converters. Microsoft explained that the bug is triggered when "a specially crafted Word 97 file is opened in WordPad or Microsoft Office Word."

All patches may require a restart.

Those looking for nonsecurity updates via Windows Server Update Services, Windows Update and Microsoft Update can find information in this KnowledgeBase article.




2009: The Year of the Social Media Hack

In the fall of 2009, something curious happened to Shanon Murray's Facebook account.

Judging from her status updates on the popular social networking site, Murray, a public relations professional based in Washington, D.C., had been mugged in London and was now being held hostage. Terrified, she was sending distressed messages to friends and family via her Facebook account.

Naturally, those who viewed Murray's messages were concerned -- but only for a moment. That's because Murray's account, it turned out, had been hacked.

"When I pull up Facebook, I can actually see the hacker messaging all my contacts. And I'm like, 'Hey, everyone, not only am I safe but I'm not in London. I'm being hacked,'" Murray recalled.
This year, these types of attacks became more common among users of social networking sites such as Facebook, MySpace and the micro-blogging site Twitter.

"One of the biggest threats from social media sites is hackers getting users to click links that they don't know and might not trust," said Corey Thomas, vice president of products and operations for IT security firm Rapid7. "The threat to businesses is that a hacker has the same access to a company system as the employee who has been hacked."

The use of social media grew exponentially in 2009; based on data from Cisco's 2009 security report released in December (PDF), more than 2 percent of all Web traffic for businesses comes from employees accessing social networking sites.

While social networking has opened new possibilities for communicating over the Internet, it also gives hackers more vectors through which to attack.

Murray, who uses Internet Explorer to access Facebook, got hacked while she was logged in at an office workstation with her Facebook window open. She said Facebook investigated her incident but still doesn't know how it happened.

However, Cisco's report gives some clues. It suggested that most Web-based attacks, particularly those on social networking sites, are not due to vulnerabilities in browsers or the applications they run. Usually, the trouble comes when users develop a tendency to trust communications from friends.

"More commonly, these threats originate from individuals who place an unwarranted amount of 'transitive trust' in the safety of these communities," the report said.

Experts say the most frequent causes of social networking hacks are spoofing and phishing. Both of these methods rely mostly on users clicking Web links or logging in via fake Web sites.

Spoofing is the more pervasive of the two. It involves hackers sending phony alerts or messages supposedly from friends (or, in the case of Twitter, followers). But once a user opens them, there's a possibility of being re-routed to malicious sites or triggering automated viruses or remote code execution, which gives a hacker control of a user's browsing session. Worst-case scenario: Hackers could release intelligent worms onto a user's system that search profiles and send messages to all contacts.

Phishing, meanwhile, usually uses more innocuous-seeming bait. Examples include Murray's Facebook messages, a "video of you" from a friend or an error message from a social networking site that requires the user's action. With phishing, users are lured into clicking on a spoofed link or page (often designed to look like the homepage of a trusted Web site, like Facebook) where unwitting users enter log-in information or click on page links from people they know -- or think they know.

Links are becoming an important component of social networking security. Recently, the heavy use of condensed Web addresses (such as those from TinyURL and bit.ly) in posts on Twitter and Facebook has made sharing easier but left it nearly impossible to identify a link's destination domain or origin. This increases the chances of clicking on a spoofed or malicious link.
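The domain-obscuring effect is easy to demonstrate. This minimal sketch (the URLs are made up) uses Python's standard `urllib.parse` to show that, before clicking, a reader can judge only the host that appears in the link, never the shortener's hidden destination:

```python
from urllib.parse import urlparse

def visible_domain(url: str) -> str:
    """Return the host portion of a URL -- all a reader can judge before clicking."""
    return urlparse(url).netloc.lower()

# A shortened link exposes only the shortener's domain, not the destination's.
print(visible_domain("http://bit.ly/4v2T1a"))   # -> bit.ly

# A phishing URL can also bury a trusted brand inside an attacker-owned host.
print(visible_domain("https://facebook.example-login.attacker.net/home"))
# -> facebook.example-login.attacker.net (the registrable domain is attacker.net)
```

The only reliable check is the rightmost registered domain of the host, which a shortened link hides entirely until it is resolved.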

"Whether it's 'TinyURL' or 'bit.ly' technology, users are getting into the habit of clicking links that they don't know or trust," Thomas said. "This makes it much easier for a hacker to hijack the target's system."

As it continues to grow in popularity, social media will also increasingly become a staging ground for hackers. This presents a range of challenges for businesses with employees who have accounts on one or more social networking sites.

Ideally, users at the office wouldn't use social media when they should be working. "A little common sense on the part of users won't hurt either," Thomas said. "Although there is no perfect solution, there are best practices, which include training users on what to click on and what not to click on."




Friday, December 11, 2009

IT Heavyweights Back Telecom Cloud Forum

The telecommunications industry has joined the effort to forge interoperability, security and common service levels among cloud providers by forming a consortium backed by some key cloud providers, enterprise customers, and hardware and software vendors.

The TM Forum, an established telecom industry association, launched the Enterprise Cloud Buyers' Council (ECBC) at its annual Management World Americas conference in Orlando this week. The ECBC is backed by some key providers including Microsoft, IBM, CA, Hewlett-Packard, Cisco and EMC, as well as telecom providers AT&T, BT, Telecom Italia and Nokia Siemens Networks.
Noticeably absent from the list of backers are leading cloud providers Amazon and Google.

Some analysts were quick to question the need for another cloud computing consortium, noting numerous other groups such as the Cloud Computing Interoperability Forum and the Distributed Management Task Force's Open Cloud Standards Incubator.

"If I could get a couple of publications to blow kisses at the idea, I could launch a cloud forum and I could get key players to join it," said Tom Nolle, a telecom and IT consultant who recently withdrew his involvement in the TM Forum.

Citing its track record for providing standards among telecom carriers, the TM Forum believes it has the clout and expertise to bring cloud providers together to forge compatibility and universal definitions.

The ECBC's goal is to forge common APIs for developers, along with security, product definitions, SLAs, benchmarking, federated cloud stores and interoperability among cloud services. It has agreed to follow the initiatives of the DMTF and the IT Service Management Forum (itSMF), an organization focused on providing service delivery, as well as guidelines from the National Institute of Standards and Technology (NIST).

Microsoft's involvement was particularly noteworthy, given that it has declined to participate in the Cloud Computing Interoperability Forum.

"There were a lot of politics in that that a lot of people didn't want to step into," said Keith Willetts, the TM Forum's chairman and CEO.

Microsoft was not available for comment but endorsed the ECBC in a statement, saying it is consistent with Microsoft's effort to support private and hybrid clouds.

For his part, Nolle said he wasn't surprised by Microsoft's decision to support the ECBC, noting that Microsoft has long been involved with the TM Forum.

The TM Forum maintains it will succeed for one key reason: It is relying on enterprise customers, who have raised key concerns about cloud services, to bring providers to the table. The ECBC has six major enterprise customers on board representing such industries as pharmaceuticals, retail and banking, and has been conducting ongoing discussions with a number of key influencers, according to Willetts. Only two of those enterprise customers have revealed their membership: Commonwealth Bank of Australia and Deutsche Bank.

"We've historically brought together buyers and sellers to really get the buyer requirements of the supplier side lined up," Willetts said in an interview. "We've been doing that for quite some time for the communications industry through real deep-dive technical working groups and best practices and standards."

Eric Pulier, ECBC's executive director, emphasized in an e-mail that the group's charter is focused on forging buyer requirements rather than more common seller-led efforts. "The result [of seller-led efforts] is an expensive and unwieldy set of terms, management consoles, commercial constructs and product descriptions that lead to opaque pricing, inability to benchmark, high integration costs and vendor lock-in," Pulier said.

But despite the need to address those issues, the telecom industry has tended to be slow and bureaucratic in forging standards, Nolle said. During his tenure with the TM Forum, Nolle said he was on several committees where forging standards took so long that once done, they no longer addressed current priorities.

"I know how these things work and the problem is the buyers are helplessly trapped in a bureaucratic process, and the thing that worries me the most about their enterprise initiatives is that the enterprises have a shorter capital cycle than the providers," Nolle said.

He also questioned whether the telecom industry has the chops to work with today's leading cloud providers. "In the days when the TM Forum was growing up, networks were built by connecting boxes. Now networks are built by connecting software components. It's a different world and they're not there yet," he said.

"I think there is a view by the standards guys that the Amazons and Googles of the world don't participate because they are not in favor of standards," he added. "That's not true. If you look at Google Wave, they supported the XMPP standard for federation. What they are saying is you can't try to standardize something which you haven't even built."




2010 a Boom Year for Cloud Computing?

There's still a lot of hype surrounding the platform, but if recent market research is any indication, computing in the clouds could achieve a breakthrough of sorts in 2010.

Consider a new study from market watcher Aberdeen Group, which concludes that companies pursuing cloud strategies are able to realize significant savings, including a nearly 20 percent reduction in administrative costs, compared to non-adopters.

Aberdeen's research comes with an obvious caveat: It isn't enough to simply develop a cloud computing strategy; successful cloud adopters -- shops that reap the biggest benefits from their cloud implementations -- must also develop appropriate support resources, such as a formal cloud computing team or task force, to better manage their cloud investments.
"Discussions of cloud computing have made their way up to the boardroom just as board-level talk of outsourcing did in the 1980s and the Internet did in the 1990s," wrote analysts Bill Lesieur and Carol Baroudi in "Business Adoption of Cloud Computing," a recent Aberdeen research publication. "Beyond cloud as a transformational technology, the CFO and board level are looking at the economics of cloud, specifically converting traditional fixed capital expenditure...spending to variable operating expenditure [or opex] spending."

One salient takeaway, Lesieur and Baroudi concluded, is that successful cloud adopters continuously monitor their cloud-based assets to measure both efficiency and use. On top of this, Aberdeen noted, successful cloud implementations start from the top down (i.e., with C-level leadership). They're also ROI-driven -- or, more precisely, use ROI analysis to drive deployment decisions.

Cloud applications aren't deployed in a vacuum, either. Successful cloud adopters take end user satisfaction seriously; cloud-based apps can't have an impact, much less fulfill ROI projections, if users aren't consuming them.

There's a kind of stratification between cloud adopters, according to Aberdeen. Enterprise-class shops tend to pursue cloud computing as a consequence of existing investments in virtualization and, to a lesser extent, green IT. In this sense, cloud computing could even be described as a logical continuation of extant virtualization or energy-efficiency efforts.

One upshot of this is that enterprise IT organizations, more so than IT practices in small and midsize business (SMB) shops, have a disproportionate interest in private clouds. More than half (56 percent) of enterprise IT organizations are building their own private cloud-based infrastructures, according to Aberdeen; less than one-fifth (20 percent) of SMB shops are doing as much.

"Having become a mainstream IT infrastructure solution for a number of years, virtualization has dramatically changed datacenter operations," Lesieur and Baroudi wrote. "Cloud computing was born to further abstract the physical hardware infrastructure from the services and application layer of IT architectures."

SMB customers, on the other hand, typically start with cloud platform-as-a-service (PaaS) offerings. These shops have either been slower to pursue in-house virtualization or energy-savings efforts or -- just as likely -- have fewer legacy resources to bring along for the ride. One upshot of this is agility: They can more easily transition to a loosely coupled PaaS cloud architecture. Nearly half (45 percent) of SMB shops cited plans to outsource some or all of their applications to PaaS cloud providers.

Similarly, SMBs are much more likely to outsource infrastructure resources to PaaS providers, too. According to Aberdeen's survey, 28 percent of SMB shops cited a willingness to do as much compared to just 15 percent of large enterprises.

"Enterprise companies have more complex IT environments with legacy applications and infrastructures while SMBs are adopting cloud applications that do not require legacy planning and transition strategies," Lesieur and Baroudi wrote.




Microsoft: We Will Steal Google's Lunch

A Microsoft corporate vice president talked about the company's business prospects, including competition with Google, at a Barclays event on Tuesday.

Robert Youngjohns, Microsoft's president of North American sales and marketing, fielded questions at the Barclays Capital Technology Conference in San Francisco. As might be expected of a marketing executive, Youngjohns was fairly bullish about Microsoft's prospects, particularly with regard to cloud computing and the enterprise software markets.

Youngjohns also defended Microsoft's so-far-lagging efforts to catch up with Google's search market share. With regard to search, some people say that Microsoft should just concede to Google, Youngjohns said. However, Microsoft isn't giving up because it needs to "attack someone who is attacking us," he explained.
"We think we can change the rules around search and that's why we're charging so hard with Bing," Youngjohns said. He added that Microsoft wants to "steal [Google's] lunch on search" and that "we clearly are concerned about [Google's] Android and Chrome." Microsoft will leverage its ecosystem of products to compete with Google, he explained.

Microsoft is also going strong after releasing a number of 2010 products. Last month, the company released Exchange 2010 as a final product, as well as betas of Office 2010 and SharePoint 2010, among others. Microsoft's challenge now is "to get customers excited about deploying our recent releases," Youngjohns said, noting that many customers still see Microsoft through its older products, like Office 2003.

The two key applications that businesses typically need are collaboration and workplace management tools, Youngjohns said. Microsoft addresses those needs with its SharePoint and Dynamics products, respectively, as well as Microsoft Office, he explained.

While CIOs in the past have said that Microsoft's enterprise software "wasn't ready for prime time," that perception is changing with the new releases of Windows Server and SQL Server, Youngjohns said. He added that Microsoft has case studies showing that Windows Server 2008 R2 is ready for mission-critical, tier-1 application support.

On the server virtualization front, Microsoft has been achieving market differentiation through its System Center management products, Youngjohns said. Desktop virtualization is an area where there's been a lot of interest, but companies have not realized infrastructure cost savings by deploying that technology, he added.

Youngjohns touted Microsoft's Windows Azure cloud computing initiative as "fundamentally different from other cloud computing structures." Microsoft offers its customers the ability to deploy on premises, in the cloud or use a hybrid approach. In particular, Microsoft has been seeing "very heavy interest in hosted Exchange," he said. He expected applications that require scale, like e-mail, to increasingly move into the cloud. Microsoft's transition to serving customers through the cloud hasn't been difficult because it already sells long-term annuity agreements to businesses, he added.

Finally, Youngjohns presented Windows 7's release in October as a positive note for Microsoft, especially among consumers. The netbooks market was initially seen as problematic for Microsoft, but "we've done really well" there, he said.

Enterprise sales of PCs with Windows 7 represent a different case because it takes between six and 18 months to migrate from Windows XP to Windows 7, he said. He added that IT departments also have to do user training, which can seem like a daunting prospect for shops with 20,000 to 30,000 users.

An audio recording of Youngjohns' talk can be accessed at Microsoft's investor relations Web page.




Microsoft SQL Server 2008 R2 Feature-Complete Preview

In November, Microsoft released the feature-complete preview of SQL Server 2008 R2, the first major upgrade of its flagship database platform in nearly two years. The new version is expected to ship in the first half of next year, Microsoft officials said at the annual Professional Association of SQL Server (PASS) Summit in Seattle.

In addition to introducing new self-service business intelligence (BI) capabilities, Microsoft said the new release of SQL Server will come in two new packages: a Data Center Edition and Parallel Data Warehouse (formerly known as Project Madison), representing the most scalable database servers offered by Microsoft to date.
November CTP
The November CTP of SQL Server 2008 R2 is the first preview of Microsoft's new PowerPivot technology, formerly code-named "Project Gemini," and will allow BI data collected from it to be shared through SharePoint Server. PowerPivot lets business users create their own BI solutions in Excel with a new in-memory analysis engine that Microsoft says will work on millions of rows of data. "Users can manipulate the data in new ways to create BI solutions and then publish them to SharePoint to collaborate with other users," explains Fausto Ibarra, Microsoft's director of SQL Server product management.

Gartner Inc. analyst Mark Beyer says the PowerPivot technology will help Microsoft bring BI to end users while helping them clean up many of the information silos that are in Excel cubes today. "You kind of have to clean up your own house when you make such a powerful and easy-to-use tool as Excel, and I think this is a step toward cleaning up the BI chaos," Beyer says.

Also in the new CTP, developers and administrators will be able to centrally manage groups of SQL Server databases to monitor performance problems and move databases from one server to another. The CTP also supports Hyper-V virtualization in Windows Server. "Specifically, we'll be supporting live migrations to enable users who have virtual database workloads to seamlessly migrate them to other servers without any perceived downtime for their users," Ibarra says.

As previously announced, the SQL Server information platform will also include StreamInsight, a complex event processing platform that allows users to build new types of solutions, such as algorithmic trading, Web analytics or industrial process controls, where an organization might have hundreds or thousands of events happening in real time. This would allow a customer to act on those events, Ibarra explains.

Object-Relational Mapper Supports Microsoft's Cloud

In November, Telerik Inc. became the first third-party vendor to offer what the company describes as complete object-relational mapping (ORM) support for SQL Azure. Telerik acquired Vanetek of Germany and its OpenAccess ORM product, a full-featured ORM platform, in December 2008.

Microsoft officially launched the Windows Azure Platform-which includes SQL Azure, essentially SQL Server in the cloud-at its Professional Developers Conference 2009 last month.

"People can use our ORM tool just as if they were working with SQL Server today locally in their office," says Todd Anglin, a Microsoft MVP and Telerik's chief evangelist. "They can forward map, reverse map and-as they say in Entity Framework language-use the Model First, Schema First approach to developing databases. They can do it all directly against the database in the cloud, so this is a huge step forward for developers looking at developing with the cloud."
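The forward mapping Anglin describes, deriving a database schema from an in-memory object model, can be sketched generically. The following Python toy illustrates the idea of generating DDL from a class definition; it is not Telerik's OpenAccess API, and the type mappings are assumptions for illustration:

```python
# Illustrative sketch of ORM "forward mapping": generating a database
# schema from class definitions. Generic toy code, not Telerik OpenAccess.
from dataclasses import dataclass, fields

# Assumed, simplified mapping of Python types to SQL column types.
SQL_TYPES = {int: "INT", str: "NVARCHAR(255)", float: "FLOAT"}

@dataclass
class Customer:
    id: int
    name: str
    balance: float

def forward_map(model):
    """Generate a CREATE TABLE statement from a dataclass (forward mapping)."""
    cols = ", ".join(f"{f.name} {SQL_TYPES[f.type]}" for f in fields(model))
    return f"CREATE TABLE {model.__name__} ({cols})"

print(forward_map(Customer))
# CREATE TABLE Customer (id INT, name NVARCHAR(255), balance FLOAT)
```

Reverse mapping runs the same idea in the other direction, generating classes from an existing database schema.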

OpenAccess ORM supports the Microsoft .NET Framework 2.0, 3.0 and 3.5, and requires Telerik's Visual Studio 2008 wizards and other tooling.

"It's a popular alternative to the Entity Framework, which in its current version has a number of issues. It's tough to configure," explains Anglin.

Telerik will continue to look at different approaches to help developers move data in and out of their applications more easily in the cloud and platforms like Silverlight, according to Anglin. "OpenAccess is the foundation for that," he says.

OpenAccess ORM with support for SQL Azure is part of Telerik's Premium Collection for .NET Q3 2009 release. The entire collection is available now for $1,299 per developer seat. The products are also sold separately.

–Kathleen Richards

Improvements and Pricing
Data Center Edition ($57,500 per processor) will scale up to 256 logical processors or cores, a fourfold increase from the limit of the current edition of SQL Server. Parallel Data Warehouse is Microsoft's highest-end implementation to date. Based on the technology it acquired from DATAllegro, the systems will be sold as appliances by IBM Corp., Hewlett-Packard Co., Bull and Dell Inc., among others.

Standard Edition is $7,500 per processor or $1,000 per server plus $162 per client access license (CAL). Enterprise Edition costs $28,800 per processor or $9,900 per server plus $162 per CAL.
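A quick back-of-the-envelope comparison shows where each licensing model makes sense. This sketch uses the prices above to find the client count at which per-processor licensing becomes the cheaper option for a single-processor server; real licensing terms have more variables, so treat it as illustrative only:

```python
# Compare SQL Server 2008 R2 licensing models using the prices above.
# Illustrative only; actual license agreements carry additional terms.

def server_cal_cost(server_price, cal_price, num_cals):
    """Cost of the server-plus-client-access-license model."""
    return server_price + cal_price * num_cals

def break_even_cals(per_proc_price, server_price, cal_price):
    """Smallest CAL count at which per-processor licensing is no more
    expensive than server + CALs (single server, single processor)."""
    n = 0
    while server_cal_cost(server_price, cal_price, n) < per_proc_price:
        n += 1
    return n

# Standard Edition: $7,500/processor vs. $1,000/server + $162/CAL
print(break_even_cals(7500, 1000, 162))   # 41 clients
# Enterprise Edition: $28,800/processor vs. $9,900/server + $162/CAL
print(break_even_cals(28800, 9900, 162))  # 117 clients
```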




SharePoint 2010 Emerges as a Developer Platform

The Microsoft SharePoint Conference 2009 held in Las Vegas in October marked an important milestone for Redmond's popular SharePoint collaboration and publishing platform. At the conference keynote, Microsoft CEO Steve Ballmer discussed extensive user interface and integration improvements to SharePoint, as well as much-needed hooks into the Visual Studio 2010 integrated development environment (IDE). Ballmer said that SharePoint 2010 was expected to be released to manufacturing in the first half of 2010.

With the 2010 wave, Windows SharePoint Services (WSS) has been renamed SharePoint Foundation, while the Microsoft Office SharePoint Server (MOSS) product becomes simply SharePoint Server. There's also an improved SharePoint Designer, the freely available tool for building custom SharePoint sites and behaviors. Designer gains include the Office Ribbon UI and other enhancements to help end users build mashups, connect to external data and enable workflows, wrote Jeff Teper, Microsoft corporate vice president for SharePoint Server, in a blog post.
SharePoint Server will continue to be offered in both Standard and Enterprise versions, and customers can purchase the FAST Search for SharePoint add-on, which incorporates the advanced enterprise search technology acquired from FAST Search & Transfer in January 2008. In addition to traditional on-premises SharePoint Server licensing, Microsoft will offer SharePoint Online in the cloud. In his keynote remarks, Ballmer said that customers will be able to "mix and match" SharePoint Server and SharePoint Online deployments.

Developer Picture
Many SharePoint 2010 enhancements announced at the conference are focused squarely on the dev community. Ballmer discussed enhancements to Visual Studio 2010, currently in beta 2, which add a SharePoint Explorer for accessing SharePoint project templates, as well as a Windows SharePoint Package in the VS Solution Explorer. VS 2010 will also provide one-click deploy and debug capability and a new Map folders feature to allow developers to map image URLs to SharePoint environments. Tom Rizzo, director of SharePoint product management at Microsoft, noted in his keynote address that developers would no longer need to hand-code Web parts and XML.

SharePoint developers will also gain access to the expanded source code and ALM feature set of VS 2010, which, starting with VS Professional with an MSDN subscription, now provides Team Foundation Server functionality at all tiers. To take advantage of these new features, developers need VS 2010 beta 2, the SharePoint 2010 Designer and SharePoint 2010.

Growing Interest
Arthur Brost, portals and collaboration practice manager for consultancy twentysix New York, gave a presentation at the conference on creating Custom Actions for SharePoint Designer using VS. He says attendees at the show were "very excited about 2010."

"I think people's main complaint would be that they have to wait to get their hands on it," Brost says of SharePoint 2010. "VS 2010 with SharePoint 2010 addresses a lot of problems developers are facing now, and we're anxious to move on to the new tools."

One of those problems has been the inability of SharePoint developers to work on default developer workstations, as SharePoint 2007 requires a server OS to run. With SharePoint 2010, developers can run the platform on the same Windows Vista or Windows 7 client OS hosting the VS IDE.

Also new is the SharePoint Developer Dashboard, which provides immediate data on response times, stored SQL Server calls and a history of SharePoint calls, so developers can zero in on problem code. The improvement, while welcome, will demand close attention as developers get familiar with the extended capabilities of SharePoint 2010. Brost says the supporting documentation for SharePoint 2010 is "much further along" than it was for earlier iterations, but he warns old SharePoint hands to avoid jumping into customizing code.

"Get to know the new built-in functionality before you go off customizing it," Brost says. "There's a lot more built-in functionality, so cases where you would've written custom code in SharePoint 2007 may now be handled with built-in functionality."




Intel Unveils 48-Core 'Cloud Computer' Chip

Intel pushed the outer limits of computing this week by unveiling a new experimental processor it described as a "single-chip cloud computer."

The chip features 48-core processor technology developed by Intel's Tera-scale Computing Research Program. It was co-created by Intel labs in India, Germany and the United States.

Intel hopes to engage researchers in the coming year by providing more than 100 of these experimental chips for R&D efforts. Those efforts will include developing new software and programming models based on the chip's technology.

Microsoft is involved in the research, according to Dan Reed, corporate vice president of extreme computing at Microsoft. The company is exploring market opportunities in "intelligent resource management, system software design, programming models and tools, and future application scenarios," Reed said in a released statement.
The chip's connection to cloud computing was rather vaguely expressed in Intel's announcement. Specifically, the announcement said that computers and networks can be integrated on a single piece of 45-nm, high-k metal-gate silicon, which has the footprint of a postage stamp.

The smaller footprint may be useful for crowded datacenters. In addition, the chip may introduce new data input, processing and output possibilities.

"Computers are very good at processing data, but it requires humans to input that data and then analyze it," said Shane Rau, a research director at analyst firm IDC. "Intel is looking to speed up the computer-to-human interaction by basically getting the human element out of the way."

According to Intel, this kind of interaction could lead to the elimination of keyboards, mouse devices and even joysticks for computer gaming. Intel's announcement even suggested that future computers might be able to read brain waves, allowing users to control functions by simply thinking about them.

Rau cautioned, however, that there's still room for slower human processes.

"This process needs to be thought out very carefully, and that's one area where the slow IO [input/output] of humans may be an advantage," he said.

Intel developed the chip based on its "recognition, mining and synthesis" (RMS) approach, according to Rau.

"The technology announcement today is similar to Intel's announcement regarding an 80-core processor last year," Rau said in a telephone interview. "It's basically an effort known as RMS by Intel that puts silicon in the hands of the people and institutions that can create the building blocks for future computing devices and software."

The chip is just designed for research efforts at the moment, according to an Intel spokesperson.

"There are no product plans for this chip. We will never sell it so there won't be a price for it," the Intel spokesperson noted in an e-mail. "We will give about a hundred or more to industry partners like Microsoft and academia to help us research software development and learn on a real piece of hardware, [of] which nothing of its kind exists today."




Microsoft and Yahoo Finalize Search Contract

Microsoft and Yahoo have finalized the legal details of their proposed search-advertising deal.

The deal was first announced back in July. On Friday, the companies announced that they have "finalized and executed the definitive Search and Advertising Services and Sales Agreement and License Agreement" based on that initial agreement.

Microsoft hopes to use the deal to boost its search user base and catch up with No. 1 search provider Google. To that end, the deal calls for Microsoft to provide its Bing search engine technology to Yahoo. Yusuf Mehdi, senior vice president of Microsoft's online audience business, recently explained that Microsoft will allocate $100 million to $200 million for turnaround costs in the first year to help with the technical integration details.
Yahoo, for its part in the deal, will maintain control over the user interface at its Web sites and will manage search-advertising customers for both itself and Microsoft.

The deal is subject to regulatory reviews, but the two companies expect to get approvals sometime in "early 2010."

Microsoft recently enhanced Bing with a new mapping capability, narrowing the gap in that respect with Google, which offers similar features in its Google Maps interface. Bing has so far been launched primarily in the U.S. market, but Microsoft is preparing other launches soon in Canada and the United Kingdom.

Microsoft still has a long way to go to catch up with Google in search. Google currently holds 85 percent of the global search market, according to Market Share. In contrast, Bing holds a U.S. search market share ranging from 3 percent to 10 percent.




Friday, December 4, 2009

Study: App Maintenance Is Neglected -- at a Cost

Should you be paying more attention to application maintenance and less to application development?

The answer may be yes – especially if you're in a government agency. Government organizations score lower on application maintainability benchmarks than private sector companies, and all organizations score higher on application performance than they do on application maintenance, according to information gleaned from a new "software quality benchmarking database" produced by Cast Software.

Called Appmarq, the database, announced yesterday, contains benchmarking information from more than 500 applications coming from 70 different companies, said Jay Sappidi, director of the Cast Center for Quantitative Research, in an e-mail interview.
He said the database was developed over time as Cast analyzed applications from client companies. It gets its data directly from application source code via an algorithm that measures weighted rules pertaining to factors such as robustness, security and transferability. The database information will be available to customers through four different service models, including subscriptions and access to annual industry trend reports that will include information like the examples above. Clients can also submit their own applications to see how they measure up to industry benchmarks and best practices and determine where more resources are needed, for example.
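Cast hasn't published its algorithm, but the weighted-rules scoring described above can be sketched in outline. In this hypothetical Python example, the factor names come from the article, while the weights and scores are invented purely for illustration:

```python
# Hedged sketch of a weighted-rules quality score like the one Cast
# describes. Factor names come from the article; the weights and the
# sample scores are hypothetical, not Cast's proprietary values.

RULE_WEIGHTS = {
    "robustness": 0.4,
    "security": 0.35,
    "transferability": 0.25,
}

def quality_score(rule_scores, weights=RULE_WEIGHTS):
    """Combine per-factor scores (0-100) into one weighted benchmark score."""
    total_weight = sum(weights.values())
    return sum(rule_scores[f] * w for f, w in weights.items()) / total_weight

# A hypothetical application's per-factor scores:
app = {"robustness": 80, "security": 60, "transferability": 70}
print(round(quality_score(app), 1))  # 70.5
```

An application's score could then be compared against the industry benchmarks in a database like Appmarq to flag where more resources are needed.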

Noting the government vs. private sector discrepancy in application maintainability, the company said it may explain a finding in a Gartner report -- "IT Spending and Staffing Report, 2009" -- that public companies spend up to 18 percent more on application maintenance than government shops do.

The company also said in a news release that Appmarq indicated application performance scores are nearly 20 percent higher than application maintenance scores. It said that even though IT shops typically spend almost 70 percent of their budgets on maintenance, they usually aren't focused on cutting long-term costs by improving applications, paying more attention to projects that are more visible to end users.

Cast Software said it will collaborate with organizations such as Software Engineering Institute (SEI) and Object Management Group (OMG) to produce benchmarking reports for customers. Sappidi told this site that the company is currently validating more industry trend examples, "like the most frequently made mistakes, etc., comparison of quality across technologies, etc., which will be released along with the annual report."




Windows 7 Contest Winner Trades Vacation for $17,777

The winner of Microsoft’s recent Code 7 contest first heard about the competition roughly three weeks before it ended, so he urgently requested two weeks’ vacation time from work and barely submitted his winning entry before the deadline.

For his trouble, Benjamin Bondi pocketed the first-place prize of $17,777, along with many other perks.

Bondi, 24, a .NET solution architect living in Haifa, Israel, was announced as the grand prize winner during Microsoft’s recent Professional Developers Conference in Los Angeles. His winning application, titled Notes Everywhere, is described as a Windows Presentation Foundation (WPF) client that leverages Windows 7 and Windows Azure to manage desktop notes from any location (see video in contest gallery).
Now he plans to commercialize his project, with a sense of humor. In an e-mail, he said: "I have a very clear business model (which wasn't revealed during the contest) that requires another few months of development. I'm working these days in finding the appropriate sponsor for this activity, and if you're a nice looking investor between the ages of 20-120, you are more than welcome to contact me for a full demonstration of the future of NotesEverywhere."

He said he almost didn’t make the deadline, partly because he had to learn new technologies such as WPF, Managed Extensibility Framework (MEF), Entity Framework and Windows Azure. He worked on the presentation video on the last day of his vacation, but then it took him about two days to upload it; the upload failed repeatedly until he used another network with more bandwidth. He finally got it submitted on the last day of the contest.

Bondi said he’s going to use the prize money to help complete a house he is building. "The money couldn't have come at a better time (as I just ran out of it)," he said.

What were the biggest coding problems he faced?

"I think I wasted most of my time on GUI design issues," he said. "Technology-wise, everything was quite simple. On the server side, I had issues with the Entity Framework (which is, by the way, truly not production ready in .NET 3.5) and on the client side, some WPF timer and binding issues - but eventually I worked them out."

As the winner, he will be invited to visit the Microsoft campus and meet with developers. He said he will go "if my current workplace will provide me with some extra vacation days (as I already wasted all of them for this year on this project). I'm interested in seeing how Microsoft manages and coordinates so many projects and developers and wonder if they have any special tricks that other companies are not familiar with."

And he’s already looking forward to next year’s PDC. "Who knows? Maybe they'll invite me to demonstrate the real production application in the first or second day keynotes in PDC 2010."




Microsoft Shareholders Approve Executive Compensation

More than 99 percent of Microsoft voting shareholders have approved the company's executive compensation practices in its first "say-on-pay" vote.

More than 3 million stockholders were given the option to cast an advisory vote, which took place at the company's annual shareholders' meeting on Nov. 19 in Bellevue, Washington. Overall, 85 percent of shares were entitled to vote.

Announced in September, Microsoft shareholders will have a "say on pay" every three years. Microsoft officials said in a statement that a triennial model best matches its executive compensation program, which aims to link executive compensation to company performance. As part of the policy, the votes are non-binding and could be overruled by the Board.
Microsoft General Counsel Brad Smith told the Wall Street Journal that the policy would promote dialogue with shareholders. According to a Microsoft blog, Redmond reportedly discussed "say-on-pay" options after receiving shareholder proposals from Walden Asset Management and Calvert Investments, and from the United Brotherhood of Carpenters.

Microsoft's vote comes during a time of increased government and media focus on executive pay, as well as proposed legislation to require greater disclosure and stockholder involvement. There are at least three pieces of legislation in the Senate that include pay disclosure measures.

While the legislative discussion initially focused on companies that had received financial recovery assistance from the Troubled Assets Relief Program (TARP) fund, the scope widened in June 2009 when Treasury Secretary Timothy Geithner announced that the Administration would grant the SEC authority to implement "say-on-pay" regulations at all companies (contingent on Congressional approval).

More than two dozen companies have already adopted the plan, including Verizon Communications Inc. and Apple Inc. The Journal also reports that more than half of the companies in the Fortune 100 included the initiative in their most recent proxy statement, as of September 2009.

On government policy, Microsoft, not surprisingly, advocates that companies be given flexibility in determining the specifics of "say-on-pay" policies.




Microsoft Warns of IE 6 and IE 7 Bug

A new zero-day Internet Explorer bug awaits IT pros returning from the holiday break.

Just before Thanksgiving Day, Microsoft released a security advisory on a vulnerability affecting IE 6 and IE 7 browsers, based on "new public reports." Browser versions that aren't affected include IE 8 and IE 5.01 Service Pack 4, according to Microsoft.

Microsoft is continuing to investigate the bug, which allows an attack based on the deletion of a cascading style sheet (CSS) object. The security advisory indicated that IE 6 SP1 on Windows 2000 SP4 may be affected. Other affected browsers may include IE 6 and IE 7 on Windows XP, Windows Vista, Windows Server 2003 and Windows Server 2008.
Microsoft's security advisory explained that hackers can potentially run malicious code after a CSS object gets deleted.

"It is possible under certain conditions for a CSS/Style object to be accessed after the object is deleted," the advisory stated. "In a specially-crafted attack, Internet Explorer attempting to access a freed object can lead to running attacker-supplied code."

The advisory adds that users still have to be diverted to a malicious Web page in order for the attack to occur.

IT pros need to have preventive measures in place, both for this bug and in general, according to Paul Henry, security and forensic analyst at Lumension.

"The latest Internet Explorer zero-day threat will unfortunately catch many off guard and will have a significant impact on many organizations that are still relying on outdated defenses," Henry said. "Vendor software vulnerabilities are not going away and zero-day threats will continue to plague even those organizations that have the best of the best in flaw remediation plans in place."

The security advisory offered a few workarounds for the issue until the vulnerability is patched. The workarounds involve changing IE's security zone settings, configuring active scripting settings in IE and turning on data execution prevention in the browser.

Microsoft explained that protected mode, available in IE 7 running on Windows Vista, "limits the impact of the vulnerability." Also, there is some protection for those running IE on Windows Server 2003 and Windows Server 2008. By default, those operating systems use Microsoft's enhanced security configuration, which sets IE's Internet zone security level to "high."

To date, there's no word on when a patch will arrive; it could appear with Microsoft's monthly patch release or as an out-of-band fix.




Liddell Quitting Microsoft CFO Position

Microsoft announced last week that Chris Liddell is stepping down as Microsoft's chief financial officer.

Liddell will serve until Dec. 31, after which Peter Klein will assume Microsoft's CFO position. Klein currently serves as the CFO of Microsoft's Business Division.

In the last year or so, Liddell oversaw Microsoft's first companywide mass layoffs. More than 5,000 positions were cut, although some employees were rehired in different positions within the company. Microsoft initiated the layoffs as part of a renewed focus on cost cutting, prompted by the global economic downturn and declining revenues, particularly in its Windows division.
In October, Liddell announced that Microsoft's cost-cutting measures had reduced operating expenses by 10 percent year over year. Those measures included freezing employee wage hikes, cutting back on travel expenses and reducing Microsoft's vendor support. Microsoft's head count fell by 4 percent year over year, according to Liddell. The company employed 93,000 workers full time worldwide in 2009.

Microsoft also began issuing debt in the form of company bonds for the first time during Liddell's tenure, which lasted four-and-a-half years.

Liddell apparently left voluntarily, and will be seeking other positions beyond the CFO role, according to Microsoft's announcement. He previously served as International Paper Co.'s CFO before joining Microsoft and was CEO at New Zealand's Carter Holt Harvey Ltd. prior to joining the paper company. A Bloomberg.com article suggested that Liddell is aiming for a CEO position "at a technology company."

Microsoft has not yet announced who will take Klein's position at the Business Division. Before joining the Business Division, Klein served as the CFO of Microsoft's Server and Tools Division for three years. Klein has been with Microsoft for more than seven years and previously worked at McCaw Cellular Communications, Orca Bay Capital and HomeGrocer.com.

"Peter brings great finance and operations expertise and a deep understanding of the company, and I am looking forward to a smooth transition that continues our commitment to cost containment and finance excellence," said Microsoft CEO Steve Ballmer in a prepared statement.




Wednesday, December 2, 2009

IBM Acquires Guardium, Adds Real-Time Compliance to Databases

IBM on Monday said it has acquired Guardium, a leading supplier of real-time database application monitoring software.

Guardium's namesake software modules provide audits, real-time monitoring and tracking of database trails, and discovery tools. The modules support every major database platform including Oracle, IBM's DB2, Sybase, Microsoft's SQL Server and MySQL.

IBM acquired the closely held, 150-person company for an undisclosed amount.

"Organizations are facing very strict government and industry regulations," said Arvind Krishna, general manager of IBM's Information Management business, on a conference call announcing the deal. "They also have demands for unprecedented levels of transparency and access to data and information within the enterprise. This transparency and accountability begins with the data. Guardium's technology helps our clients take up the data by providing the transparency and access."
Guardium identifies patterns and inconsistencies in the way data is accessed. Its real-time monitoring functions can detect fraud and unauthorized access to databases and information residing in ERP, CRM and data warehousing systems.

"It is, at the data level, a complement to the application-level security offerings from Rational," said Ovum analyst Tony Baer. "While I won't go as far as calling this a double-layer firewall, it does add another layer of defenses against database intrusions."

IBM's Krishna said the software, used by about 400 customers including some of the world's largest banks, has applicability beyond just serving regulatory issues.

"This is not just about compliance with government regulations. The technology plays a big role in helping orgs comply with mandates like the Payment Card Industry and also helps reduce fraud within the enterprises," he said.

While Guardium's software will be integrated into IBM's business intelligence and Information on Demand portfolio, Krishna said the company plans to continue support for all other database platforms.

"This cross-platform support will be maintained and will be a critical component of our offerings as we move forward," he said. "More than ever, organizations need to maintain real-time insight, in effect applying analytics to analytics, to make sure that people know who is getting access and whether it's authorized or unauthorized."




Google Fixes Bugs in Chrome Frame Affecting IE 8

Google has fixed bugs in Google Chrome Frame, including one that Microsoft called out as a security problem.

Chrome Frame is designed to let Web developers use HTML5 features that Google claims are not supported in Microsoft's Internet Explorer 8 browser. When Chrome Frame was announced in September, Microsoft quickly noted that it posed security problems.

On Wednesday, Google issued a bug fix announcement indicating that a patch would automatically update Chrome Frame. The patch solves an issue where IE 8 freezes intermittently. It also includes a security fix for a "cross-origin bypass" vulnerability. Apparently, this is the security issue pointed out by Microsoft and others.

Google indicated in its announcement that this vulnerability was of high severity and important to patch.

"An attacker could have bypassed cross-origin protections," Google wrote. "Although important, 'High' severity issues do not permit persistent malware to infect a user's machine. We're unaware of any exploitation of this issue."

Google credited Billy Rios and Microsoft Vulnerability Research (MSVR), as well as Lostmon, for "finding and reporting this vulnerability responsibly."

To make Chrome Frame work, Web developers place a snippet of code in their Web page. The code switches IE's Trident layout engine to the open source WebKit one used in the Google Chrome browser. According to Google, this technology allows IE 8 to access HTML5 and JavaScript improvements that currently aren't supported in Microsoft's browser.
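For reference, the opt-in snippet Google documented at the time was a single meta tag in the page's head; when the Chrome Frame plug-in is installed, IE renders a page carrying this tag with WebKit instead of Trident:

```html
<!-- Opt a page into Google Chrome Frame rendering when the plug-in is present -->
<meta http-equiv="X-UA-Compatible" content="chrome=1">
```

Pages without the tag continue to render in IE's native engine, so the switch is entirely per-page and developer-controlled.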




Windows Phones To Arrive in Coming Months

Microsoft is planning to rev up the mobility market under its "Windows phone" brand.

The company initially launched a Windows phone campaign back in October. At that time, Microsoft announced that various OEM partners were building new mobile phones using the Windows Mobile 6.5 operating system. Those efforts will soon bear fruit, according to a Microsoft spokesperson.

"Over the coming months you will see a regular drum beat of exciting new devices from our partners as well as regular updates to our software to keep pace with evolving demands for great browsing experiences, touch screen capabilities, and services," the spokesperson noted in an e-mail on Tuesday.

The new mobile offerings may mark a departure of sorts for the company. Microsoft's mobile strategy has been somewhat stagnant since it entered the market in 2000, according to Matt Rosoff, vice president of research at Directions on Microsoft.

"The Microsoft mobility story is one of tragically missed opportunities," Rosoff said in a telephone interview. "They started off competing well with Palm Pilot in the beginning, but they were slow to evolve. Then, when Apple introduced the iPhone in the consumer market, it literally took them by surprise."

Microsoft has been playing catch-up ever since, according to Rosoff.

For its part, Microsoft is sticking with its traditional platform approach, offering software that can run on a variety of phones and across numerous service provider networks.

"Microsoft is committed to giving its worldwide partners the tools and programs they need to successfully launch and sell Microsoft offerings," the Microsoft spokesperson stated. "Guiding the delivery of Microsoft resources is a clear strategy that takes a business framework approach to map tools and resources for partners at each step of their business cycle."

The company already announced several Windows phone components in October, including an Office Mobile 2010 beta, a new My Phone app for backup and sharing files, and an online source for mobile apps called Windows Marketplace for Mobile.

"They don't want to be in that [phone hardware] business," said Rosoff, who added that the new branding element, Windows phone, will apply to a range of device makers, but will carry a distinctive user interface that will correlate with other Windows offerings.

"They are coming off a huge missed opportunity, which is the consumer side of the smart phone market," said Rosoff. "I think we will see some major improvements [in their mobile offerings] next year when Windows Mobile 7 is introduced."

He noted that improvements to Microsoft's Zune, along with some gaming elements, are being incorporated into Windows phones. Both will help the company compete against leaders such as Apple and Google.

Rosoff said that the smart phone market does not have room for five or six major players. If Microsoft is going to succeed, it will have to be in the top three, he added.

"I think the smart phone market could produce from 10 to 20 percent of future overall revenues for Microsoft," Rosoff said. "But, I think their real focus is to make sure other companies don't get a platform edge that might compete with other Microsoft offerings down the line."




Wednesday, November 25, 2009

Dave Campbell Discusses Dallas

Microsoft Technical Fellow Dave Campbell is pretty worked up about the freshly-announced Dallas project, the data-centric initiative that hopes to radically reduce the friction involved in discovering, manipulating and analyzing data culled from multiple public and private data sources via the cloud.

"Being able to take those things, pull them all together, and let developers build new experiences is what Dallas is all about," Campbell said during an interview following the keynote.

Campbell gave a demo during the morning keynote that showed how Dallas could be used to search out data accessed via ADO.NET Data Services (code-named Astoria) and to manipulate that data using the Excel PowerPivot component. The demo was accompanied by a video presentation by Federal Chief Technology Officer Vivek Kundra, who talked about how opening access to public data has led to significant innovation in medical research, GPS services and astronomical research.

Clearly, Dallas has some legs already. And Campbell says Microsoft is getting so many requests to participate in the Dallas effort that the company simply can't keep up with them all.

What's interesting about Dallas is that it's less a typical, PDC-borne platform advancement and more an effort to leverage Microsoft's matured data and cloud services stack in combination with its peerless corporate sway. In essence, Microsoft has worked to craft a viable syndication and licensing model for private data, enabling organizations to make available for broad consumption data that would otherwise remain locked up in quarterly DVDs or rigid subscription-based packages.

Associated Press On Board
Organizations can either host their data with Microsoft or they can allow Dallas to access data stored on-premise. Todd Martin, vice president of technology development for the Associated Press, says his organization opted for the latter approach. He says Dallas provides an opportunity to expose content from more than 1,500 AP news organizations to the channel of Visual Studio developers.

"The newspapers actually participate in the value of an application that is created by a developer," Martin said during a round table Q&A session. "Developers and information workers have easy access to a content API that is of course very well integrated into their workspace. You have this potential for both the developer and the news publisher to benefit from this collaboration."

To make data available, Dallas provides what Campbell described as "an open catalog and market place for data, both public data and commercial data." By providing easy discovery and search of both subscription and public data sources, Dallas makes it possible for developers to pull together disparate data sources.

"Dallas makes the whole world of data better than the sum of its parts by creating a uniform discovery mechanism for data, a uniform binding and access mechanism for data, a uniform way of exposing trial data sets for developers, a uniform licensing model so data from multiple providers can be easily joined and recombined. By delivering data as a service, our aspiration is that Dallas might catalyze a whole new wave of remixing and experimentation by developers."

At the core of Dallas' potential success, Campbell said, is that it arms Visual Studio developers with a standards-based way to access, manipulate and deliver data.

"Dallas has a consistent set of APIs. It's a service interface and a REST-based interface. It produces data in AtomPub right now. And we've encoded structured data elements within the AtomPub items so they can be consumed directly by applications," Campbell explained in an interview after the keynote.

"We build a service proxy," Campbell continued. "Part of the magic of Dallas is that it is powered by an underlying service model. So when you go up and start to explore data sets there are actual fields you can restrict on, so if you want to filter it one way or the other. We'll produce a service proxy which you can include in your .NET application to make it very, very easy to consume the data."

Campbell said ADO.NET Data Services (formerly code-named Astoria) is "the protocol that powers this." He said the technology makes it easy for developers to create a service to produce and consume data.
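Because the data comes back as AtomPub, any client that can parse Atom XML can consume it. A minimal sketch follows; the feed content is invented for illustration, and real Dallas endpoints (with their account keys and generated service proxies) are not shown here.

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

# A minimal Atom feed of the shape a Dallas-style REST service might return.
# (Feed contents here are made up for illustration.)
feed_xml = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Sample data set</title>
  <entry><title>Record 1</title><id>urn:rec:1</id></entry>
  <entry><title>Record 2</title><id>urn:rec:2</id></entry>
</feed>"""

def entry_titles(xml_text):
    """Pull the title of each Atom entry out of a feed document."""
    root = ET.fromstring(xml_text)
    return [e.find(ATOM_NS + "title").text
            for e in root.findall(ATOM_NS + "entry")]

print(entry_titles(feed_xml))  # ['Record 1', 'Record 2']
```

In practice a .NET developer would use the generated service proxy Campbell describes rather than hand-parsing XML, but the sketch shows why a uniform AtomPub surface makes the data consumable from any platform.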

The Cloud's First Killer App?
A lot of the focus in cloud computing has been on finding efficiencies: moving infrastructure to service providers and enabling highly scalable, virtualized server environments. But given Dallas' clear momentum and ambitious vision, the question arises: could this be Windows Azure's killer app?

Campbell thinks it could be.

"I think it is--I would propose it as a killer app for the cloud. And here's the thing. It really does represent how we think about the cloud right now," Campbell said. "This one I had an intuition on. And the response we've gotten from the content owners we've shown it to is just tremendous. So it really has the potential to be something big."




Tuesday, November 24, 2009

Early Enterprise Cloud Deployments Likely To Be Hybrid

Are early adopters of cloud services likely to go the pure hosted route, build private clouds run within an enterprise, or deploy a hybrid approach of the two?

That was a topic debated Wednesday at the Interop New York conference. Certain data and applications cannot run in the cloud due to regulatory, compliance or security concerns. Yet other applications make more economic sense to run in the cloud or offer faster deployment options. The hybrid approach allows for the best of both worlds.

"My view is we're debating whether a model like this with the right policy constraints can get the enterprise to a place where it is cloud-like and eventually migrating toward a cloud environment," said Alistair Croll, principal analyst at Bitcurrent, speaking during an Interop session he moderated.

"A hybrid model gets people comfortable in the shorter term," said Tyson Hartman, CTO of Avanade, a subsidiary of Accenture that deploys Microsoft-based solutions. That's been the case with the company's Business Productivity Online Suite (BPOS) for hosting Exchange, SharePoint and Dynamics applications.

Hartman said in an interview following the session that he anticipates Microsoft's new Azure cloud services, launched this week at the Professional Developers Conference (PDC) in Los Angeles, will follow that same pattern. "We are doing a lot of hybrid work with BPOS because people can realize the cloud operating model business benefits," he said. "I think as Azure gets more predominant, we will see these hybrid models emerge."

Indeed, a survey of 500 C-level IT executives by Kelton Research commissioned by Avanade found a 300 percent spike over the past nine months in plans to either test or start deploying cloud services. However, 95 percent of them plan on a hybrid approach, according to a study released last month.

Still, some on the panel questioned whether that will make sense over the long term. "I think there's value in private clouds and value in public clouds but they are very separate things," said Anders Lofgren, chief research officer at TheInfoPro, an IT research firm. "I think the hybrid cloud complicates something that's already really complex."




Google Chrome OS Released as Open Source Code

Google's previously under-wraps operating system is now available to developers as open source code, the company announced on Thursday.

In an online press briefing, company officials demonstrated Google Chrome OS publicly for perhaps the first time, although the OS was initially described in July. Chrome OS is currently being designed to work on netbooks and the company is about "a year away" from launching a product, according to Sundar Pichai, Google's vice president of product management.

Chrome OS is a free open source operating system designed for use on netbooks, but it could be repurposed to run on desktops. In a Q&A, Pichai said that Google will be reviewing that possibility "in the middle of next year." The OS will run on x86 hardware, but Google also will make sure that it runs on ARM processors, according to Matt Papakipos, engineering director for Google Chrome OS.

Pichai added that Google doesn't see the OS as being limited to the rather small screen size and keyboard size of today's netbooks.

Chrome OS, which looks a lot like Google's Chrome Web browser, will work with Web applications only. People still need to work offline, and Google is developing a database API to enable offline access in the Google Chrome browser, Pichai said. In 2010, Google plans to have all of the APIs built into Chrome, he added.



The main Chrome OS screen strongly resembles the Chrome browser. The tabs at top hold both programs and external Web links.

Users do not have to install applications on netbooks running Chrome OS. In fact, they can't. Instead, they just run a Web application through their browser, which sits atop Chrome OS. The OS has an applications tab in the top left corner of the screen that lists available applications. In addition, Google plans to make it easy for users to discover new Web applications.



Clicking the top left tab in the Chrome OS takes users directly to the list of their applications hosted online (the Chrome OS does not run installed apps).

Applications can be pinned to the top of the OS screen in tabs. They are distinguished from Web pages because the application tabs don't close. Chrome OS also uses panels as part of the user interface. Panels are "persistent lightweight windows" that can be minimized, according to Pichai. Typical uses for panels include chat windows, applications like notepad, file management and even Google Books, which can be read in full-screen mode.

The new OS will be hardware-specific. Google is currently working with its hardware partners on the specs. Pichai said that means users will definitely have to buy a Google Chrome OS-specific device. In particular, Chrome OS will only run on machines with solid state drives (SSDs). In taking that approach, Google is bypassing a lot of the delays seen with running traditional operating systems. For instance, most OSes spend time looking for a floppy disk on bootup, Papakipos explained.

Google's demo included a laptop that quickly booted to a log-in screen. Pichai said the bootup took seven seconds, and that it takes another three seconds to log into an application. Google is working to reduce even those times. The fast bootup is made possible through the use of flash memory in the SSDs, as well as through bypassing various application background services that initially load with traditional OSes, Papakipos explained.

Google conceives of Chrome OS as a different model of personal computing and is focusing on speed, simplicity and security. Pichai said that Google wants the netbook experience to be fast, like turning on a TV. Simplicity is enabled because users don't have to install programs and there's nothing to maintain. All of the data is stored in the Internet cloud. Security is improved because users can't install binaries. The file system locks down the partition as read only. In addition, if a user loses a netbook, the data isn't lost. The user can log in via another device and have access to that data anywhere.

Chrome OS has some technologies that can help with security threats, according to Papakipos. It automatically updates itself and has a "verified boot" system that checks to see that you are "running what you should be running," he explained. It also uses a cryptographic signature key. If a problem is detected, malware detection is run. Google can also reimage the system and save the user's cache data, restoring user settings. The storage of the user's settings is always encrypted, Papakipos said.
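The verified-boot idea can be reduced to a simple sketch. This is illustrative only and not Chrome OS's actual mechanism, which relies on signed images and firmware support: before allowing boot, compare a digest of the system image against a recorded known-good value.

```python
import hashlib

def image_digest(image_bytes):
    """Compute a SHA-256 digest of a system image."""
    return hashlib.sha256(image_bytes).hexdigest()

def verify_boot(image_bytes, known_good_digest):
    """Allow boot only if the image matches the recorded digest."""
    return image_digest(image_bytes) == known_good_digest

# Invented example: record the digest of a known-good image at build time.
good_image = b"kernel+rootfs v1"
recorded = image_digest(good_image)

assert verify_boot(good_image, recorded)       # untampered image boots
assert not verify_boot(b"tampered", recorded)  # modified image is rejected
```

A hash comparison alone only detects tampering; the cryptographic signature key Papakipos mentions additionally proves the recorded value itself came from Google rather than an attacker.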

The press briefing also spilled some details about Google Chrome, the company's Web browser. Google plans to roll out Google Chrome for the Apple Mac and Linux-based computers before the end of the year, Pichai said. He claimed that Google Chrome is "40 percent faster than [Microsoft's] IE 8" and that there are more than 40 million users of Google Chrome who use it as their primary browser.

Al Gillen, IDC's program vice president for system software, described Chrome OS as potentially "problematic and challenging" for Microsoft. However, he stressed that it really offers just a subset of the functionality users get with a traditional rich desktop. Chrome OS is neither new nor revolutionary, he said.

"You take a standard Linux operating system and you're removing the stuff you don't need. You're locking it down and making it relatively tight and secure. That's fine, but the operating system is not revolutionary or different," Gillen said by phone. "You've got a browser that is probably optimized for some Google applications and services. But fundamentally, it's a browser on an operating system."

More details on Chrome OS can be found by reading Google's announcement here.




Ozzie Quizzed on Cloud Strategy at PDC

Microsoft officials briefed financial analysts on the company's cloud computing strategy at its Professional Developers Conference (PDC) in Los Angeles on Wednesday.

Questions were fielded by Ray Ozzie, Microsoft's chief software architect, as well as Doug Hauger, general manager of Windows Azure. Of course, the big news was Microsoft's announcement on Tuesday that its Windows Azure-based datacenters would open for commercial service on Jan. 1, with billing ramping up on Feb. 1.

Ozzie began the briefing with the analysts by laying out Microsoft's oft-stated vision of three screens and a cloud -- a world of the PC, phone and TV all connected through the Internet. Next he described his role in trying to jump-start Microsoft's cloud computing efforts to help support that vision.

"We started investing in this cloud computing transformation, and a belief in this cloud computing transformation, about three years ago now," Ozzie said. "What we believed was that things were happening so rapidly and so significantly that this could be really transformational in a way that would impact computing for the next 20 to 30 years. So we started from the ground up -- literally the ground."

By that, Ozzie meant that Microsoft started its cloud computing efforts with so-called generation-1 datacenters, which are simply racks of computers where Microsoft manually installed the operating systems. He explained that gen-1 is where most companies involved in cloud computing are today. Gen-2 is the stage where multiple servers are added, but Microsoft's recent datacenter deployments have been gen-3, such as the one Microsoft opened several weeks ago in Chicago.

Gen-3 means modular containers or, as Ozzie described it, "cloud plug-and-play." Multiple servers are housed in a large building with roll-up doors; trucks back 20-foot cargo containers filled with servers into that building, and Microsoft then stacks the containers on top of each other. Ozzie said that this gen-3 approach has cut Microsoft's supply-chain time from months down to weeks.

Gen-4 is even leaner than gen-3. With gen-4, the servers are kept in cargo containers on the outside, absent a large warehouse. The goal of this approach is to shrink the capital investment, Ozzie said. For gen-4, Microsoft called on the talents of Dave Cutler, the principal developer behind Windows NT and a Microsoft Technical Fellow, to build a new operating system for the cloud, which became Windows Azure. Ozzie distinguished the use of the Windows Azure cloud computing OS from the old-model approach of plugging a traditional OS into a datacenter. Those traditional models do not provide the value of cloud computing, he said.

As part of this vision, Microsoft retooled Exchange and SharePoint as online services. Ozzie said that it gives enterprises the option to switch to services, and that with the current economic crisis, demand for the online products has been "tremendous." Mail and document sharing will be the most important cloud-based applications for businesses, he said.

Hauger stressed cost savings, agility and ease of implementation for Microsoft's cloud services. While Microsoft has 6 to 7 million developers around the world, the Windows Azure platform isn't just for .NET developers. Microsoft added developer support with Eclipse integration and supports PHP, MySQL, Ruby, Python and Java on Windows Azure.

Windows Azure supports both consumer and enterprise needs, Ozzie and Hauger emphasized. Hauger listed a few enterprise trial users of Windows Azure, including Coca-Cola, which is using the platform for order processing and its product database. A company called RiskMetrics Group performs its financial analyses using the Windows Azure platform for massively parallel high-performance computing of up to 40,000 instances at a time. Companies like Kelley Blue Book and Domino's Pizza use Windows Azure to deal with peak demand issues that affect Web operations, according to Hauger. GlaxoSmithKline is using Microsoft's Business Productivity Online Suite (BPOS) service for e-mail federation using Active Directory, he added.

An analyst questioned Microsoft's BPOS services and the kind of margins Microsoft expected to see from them. Hauger didn't cite specifics, but claimed that Microsoft's Dublin gen-4 datacenter offers "dramatic economies of scale." Developers can then charge on top of that, he added.

Another question was about potential problems in porting legacy applications to the cloud. Ozzie said that it was "relatively straightforward to port those apps." Legacy apps designed for "one-box" computers can fail when moved to the cloud, he added, but "most people do a move, extend and transform"-type operation.

"They'll move something to the cloud, and then they'll start to mess around with it and understand what it's like to be in the cloud," Ozzie said. "And then they'll extend it by starting to take some aspect of it and take advantage of the cloud."

Moving things like e-mail, SharePoint and collaboration applications is "very straightforward," Ozzie added. Hauger said that moving some apps, such as moving SQL Server to SQL Azure, can be simple.

"If they're running Microsoft stuff now on premises, then if all they are doing is moving...they could move it as a virtual instance to our cloud or frankly to Amazon's because Amazon is a great [partner]," Ozzie said. He added that they'll still hit a roadblock when they want to transform it.

"The transformation will not be possible," Hauger said.




Monday, November 23, 2009

Microsoft's New AppFabric Serves Up Composite Apps

During the opening keynote at the Microsoft Professional Developers Conference in Los Angeles on Tuesday, Microsoft President of the Server and Tools Business Bob Muglia introduced AppFabric as a foundational component of Microsoft's Windows Server and Windows Azure strategy.

AppFabric comprises a number of already existing technologies bound together in an abstraction layer that will help establish a common set of target services for applications running either on-premise on Windows Server or in the cloud on Windows Azure. Burley Kawasaki, director of Microsoft's Connected System Division, described AppFabric as a logical next step up in .NET abstraction.

"We really see AppFabric as being the next layer that we add to move up the stack, so we can provide a developer with truly symmetrical sets of services, application level services that you can take advantage of, regardless of whether you are targeting on-premise or cloud platforms," Kawasaki said. "This is an additional level of abstraction that is built into the system model, just like SQL as an example uses T-SQL as a common model, whether or not you are on the cloud."

Kawasaki said that AppFabric will help fix a problem facing Microsoft developers today. ".NET is a consistent model today, but when you do File/New to write an app for Azure, your whole application looks a little different than it would if you were just writing an app on-premise."

Beta 1 of AppFabric for Windows Server 2008 R2 was announced during the opening PDC keynote on Tuesday. A community technology preview (CTP) version of AppFabric for Windows Azure is promised sometime in 2010. The goal, said Kawasaki, is to present a uniform developer experience and service interface for both on-premise and cloud apps.

"They should be fairly identical in terms of capabilities. That should be the goal, so you as a developer do not have to learn different skills. It will be as symmetrical as possible," Kawasaki said. "There will always be things you have to know about the deployment environment you are running on, but if we do our job right again, the AppFabric is taking the lower-level dependencies in mind."

AppFabric incorporates a host of technologies already familiar to .NET developers, including the Velocity distributed caching engine, the Dublin app-server extensions focused on Windows Workflow Foundation (WF) and Windows Communication Foundation (WCF), as well as management enabled via PowerShell scripting and Internet Information Services (IIS) integration.

Kawasaki singled out Silverlight 4, the upcoming version of Microsoft's rich Internet application platform announced by Scott Guthrie on Wednesday, as a platform poised to take advantage of AppFabric.

"A big push in Silverlight 4 is building on top of the WCF service model and the RIA services," said Kawasaki. "One of the benefits you get from that is because AppFabric just manages WCF as a service, suddenly your Silverlight 4 backend now can be more scalable and reliable when using AppFabric as that scale out middle tier. I think that's one of those things that we'll start talking about more now that the Silverlight 4 announcement was made. Increasingly there are a very broad set of services that people are really excited about. People want to build Silverlight 4 apps and scale that and now there is a solution for that."

What should developers be doing with all this? Kawasaki urges developers to jump on the AppFabric for Windows Server beta. "My advice to developers would be to start using beta 1 for on-premises. That is the best way to get started whether you are using on-premise or in the cloud."


