Wednesday, November 25, 2009

Dave Campbell Discusses Dallas

Microsoft Technical Fellow Dave Campbell is pretty worked up about the freshly-announced Dallas project, the data-centric initiative that hopes to radically reduce the friction involved in discovering, manipulating and analyzing data culled from multiple public and private data sources via the cloud.

"Being able to take those things, pull them all together, and let developers build new experiences is what Dallas is all about," Campbell said during an interview following the keynote.

Campbell gave a demo during the morning keynote that showed how Dallas could be used to search out data accessed via ADO.NET Data Services (code-named Astoria) and to manipulate that data using the Excel PowerPivot component. The demo was accompanied by a video presentation by Federal Chief Technology Officer Vivek Kundra, who talked about how opening access to public data has led to significant innovation in medical research, GPS services and astronomical research, and about the role Dallas could play in opening up government information.
Clearly, Dallas has some legs already. And Campbell says Microsoft is getting so many requests to participate in the Dallas effort that the company simply can't keep up with them all.

What's interesting about Dallas is that it's less a typical, PDC-borne platform advancement and more an effort to leverage Microsoft's matured data and Cloud services stack in combination with its peerless corporate sway. In essence, Microsoft has worked to craft a viable syndication and licensing model for private data, enabling organizations to make available for broad consumption data that would otherwise remain locked up in quarterly DVDs or rigid subscription-based packages.

Associated Press On Board
Organizations can either host their data with Microsoft or they can allow Dallas to access data stored on-premise. Todd Martin, vice president of technology development for the Associated Press, says his organization opted for the latter approach. He says Dallas provides an opportunity to expose content from more than 1,500 AP news organizations to the channel of Visual Studio developers.

"The newspapers actually participate in the value of an application that is created by a developer," Martin said during a round table Q&A session. "Developers and information workers have easy access to a content API that is of course very well integrated into their workspace. You have this potential for both the developer and the news publisher to benefit from this collaboration."

To make data available, Dallas provides what Campbell described as "an open catalog and marketplace for data, both public data and commercial data." By providing easy discovery and search of both subscription and public data sources, Dallas makes it possible for developers to pull together disparate data sources.

"Dallas makes the whole world of data better than the sum of its parts by creating a uniform discovery mechanism for data, a uniform binding and access mechanism for data, a uniform way of exposing trial data sets for developers, a uniform licensing model so data from multiple providers can be easily joined and recombined. By delivering data as a service, our aspiration is that Dallas might catalyze a whole new wave of remixing and experimentation by developers."

At the core of Dallas' potential success, Campbell said, is that it arms Visual Studio developers with a standards-based way to access, manipulate and deliver data.

"Dallas has a consistent set of APIs. It's a service interface and a REST-based interface. It produces data in ATOM pub right now. And we've encoded structured data elements within the ATOM pub items so they can be consumed directly by applications," Campbell explained in an interview after the keynote.

"We build a service proxy," Campbell continued. "Part of the magic of Dallas is that it is powered by an underlying service model. So when you go up and start to explore data sets there are actual fields you can restrict on, so if you want to filter it one way or the other. We'll produce a service proxy which you can include in your .NET application to make it very, very easy to consume the data."

Campbell said ADO.NET Data Services (formerly code-named Astoria) is "the protocol that powers this." He said the technology makes it easy for developers to create a service to produce and consume data.
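
Campbell's description maps closely onto the ADO.NET Data Services client library in .NET 3.5 SP1. As a rough, hypothetical sketch of what consuming such a feed might look like from C# -- the service URI, entity set name and entity shape below are invented for illustration, and a real Dallas subscription supplies its own endpoint, schema and account key -- the client code boils down to a LINQ query over a REST endpoint:

    using System;
    using System.Data.Services.Client; // ADO.NET Data Services client library
    using System.Linq;

    // Hypothetical entity shape; real Dallas data sets define their own schemas.
    public class CrimeStat
    {
        public string City { get; set; }
        public int Year { get; set; }
        public int Incidents { get; set; }
    }

    class DallasClient
    {
        static void Main()
        {
            // Hypothetical service root; Dallas feeds also require an account key,
            // typically supplied as an HTTP request header.
            var context = new DataServiceContext(new Uri("https://example.com/DallasSample.svc/"));

            // The client library translates this LINQ query into an HTTP GET and
            // materializes the AtomPub entries it receives into CrimeStat objects.
            var query = context.CreateQuery<CrimeStat>("CrimeStats")
                               .Where(c => c.Year == 2008);

            foreach (var stat in query)
                Console.WriteLine("{0}: {1}", stat.City, stat.Incidents);
        }
    }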

The Cloud's First Killer App?
A lot of the focus in cloud computing has been on finding efficiencies: moving infrastructure to service providers and enabling highly scalable, virtualized server environments. But with Dallas' clear momentum and its ambitious vision, the question arises: Could this be Windows Azure's killer app?

Campbell thinks it could be.

"I think it is--I would propose it as a killer app for the cloud. And here's the thing. It really does represent how we think about the cloud right now," Campbell said. "This one I had an intuition on. And the response we've gotten from the content owners we've shown it to is just tremendous. So it really has the potential to be something big."




Tuesday, November 24, 2009

Early Enterprise Cloud Deployments Likely To Be Hybrid

Are early adopters of cloud services likely to go the pure hosted route, build private clouds run within an enterprise, or deploy a hybrid approach of the two?

That was a topic debated Wednesday at the Interop New York conference. Certain data and applications cannot run in the cloud due to regulatory, compliance or security concerns. Yet other applications make more economic sense to run in the cloud or offer faster deployment options. The hybrid approach allows for the best of both worlds.

"My view is we're debating whether a model like this with the right policy constraints can get the enterprise to a place where it is cloud-like and eventually migrating toward a cloud environment," said Alistair Croll, principal analyst at Bitcurrent, speaking during an Interop session he moderated.
"A hybrid model gets people comfortable in the shorter term," said Tyson Hartman, CTO of Avanade, a subsidiary of Accenture that deploys Microsoft-based solutions. That's been the case with the company's Business Productivity Online Suite (BPOS) for hosting Exchange, SharePoint and Dynamics applications.

Hartman said in an interview following the session that he anticipates Microsoft's new Azure cloud services, launched this week at the Professional Developers Conference (PDC) in Los Angeles, will follow that same pattern. "We are doing a lot of hybrid work with BPOS because people can realize the cloud operating model business benefits," he said. "I think as Azure gets more predominant, we will see these hybrid models emerge."

Indeed, a survey of 500 C-level IT executives by Kelton Research commissioned by Avanade found a 300 percent spike over the past nine months in plans to either test or start deploying cloud services. However, 95 percent of them plan on a hybrid approach, according to a study released last month.

Still, some on the panel questioned whether that will make sense over the long term. "I think there's value in private clouds and value in public clouds but they are very separate things," said Anders Lofgren, chief research officer at TheInfoPro, an IT research firm. "I think the hybrid cloud complicates something that's already really complex."




Google Chrome OS Released as Open Source Code

Google's previously under-wraps operating system is now available to developers as open source code, the company announced on Thursday.

In an online press briefing, company officials demonstrated Google Chrome OS publicly for perhaps the first time, although the OS was initially described in July. Chrome OS is currently being designed to work on netbooks and the company is about "a year away" from launching a product, according to Sundar Pichai, Google's vice president of product management.
Chrome OS is a free open source operating system designed for use on netbooks, but it could be repurposed to run on desktops. In a Q&A, Pichai said that Google will be reviewing that possibility "in the middle of next year." The OS will run on x86 hardware, but Google also will make sure that it runs on ARM processors, according to Matt Papakipos, engineering director for Google Chrome OS.

Pichai added that Google doesn't see the OS as being limited to the rather small screen size and keyboard size of today's netbooks.

Chrome OS, which looks a lot like Google's Chrome Web browser, will work with Web applications only. People still need to work offline, and Google is developing a database API to enable offline access in the Google Chrome browser, Pichai said. In 2010, Google plans to have all of the APIs built into Chrome, he added.


The main Chrome OS screen strongly resembles the Chrome browser. The tabs at top hold both programs and external Web links.

Users do not have to install applications on netbooks running Chrome OS. In fact, they can't. Instead, they just run a Web application through their browser, which sits atop Chrome OS. The OS has an applications tab in the top left corner of the screen that lists available applications. In addition, Google plans to make it easy for users to discover new Web applications.


Clicking the top left tab in the Chrome OS takes users directly to the list of their applications hosted online (the Chrome OS does not run installed apps).

Applications can be pinned to the top of the OS screen in tabs. They are distinguished from Web pages because the application tabs don't close. Chrome OS also uses panels as part of the user interface. Panels are "persistent lightweight windows" that can be minimized, according to Pichai. Typical uses for panels include chat windows, applications like notepad, file management and even Google Books, which can be read in full-screen mode.

The new OS will be hardware specific. Google is currently working with its hardware partners on the specs. Pichai said that means that users will definitely have to buy a Google Chrome OS-specific device. In particular, Chrome OS will only run on machines with solid state drives (SSDs). In taking that approach, Google is bypassing a lot of the delays seen with running traditional operating systems. For instance, most OSes spend time looking for a floppy disk on bootup, Papakipos explained.

Google's demo included a laptop that quickly booted to a log-in screen. Pichai said the bootup took seven seconds, and that it takes another three seconds to log into an application. Google is working to reduce even those times. The fast bootup is made possible through the use of flash memory in the SSDs, as well as through bypassing various application background services that initially load with traditional OSes, Papakipos explained.

Google conceives of Chrome OS as a different model of personal computing and is focusing on speed, simplicity and security. Pichai said that Google wants the netbook experience to be fast, like turning on a TV. Simplicity is enabled because users don't have to install programs and there's nothing to maintain. All of the data is stored in the Internet cloud. Security is improved because users can't install binaries. The file system locks down the partition as read only. In addition, if a user loses a netbook, the data isn't lost. The user can log in via another device and have access to that data anywhere.

Chrome OS has some technologies that can help with security threats, according to Papakipos. It automatically updates itself and has a "verified boot" system that checks to see that you are "running what you should be running," he explained. It also uses a cryptographic signature key. If a problem is detected, malware detection is run. Google can also reimage the system and save the user's cache data, restoring user settings. The storage of the user's settings is always encrypted, Papakipos said.

The press briefing also spilled some details about Google Chrome, the company's Web browser. Google plans to roll out Google Chrome for the Apple Mac and Linux-based computers before the end of the year, Pichai said. He claimed that Google Chrome is "40 percent faster than [Microsoft's] IE 8" and that there are more than 40 million users of Google Chrome who use it as their primary browser.

Al Gillen, IDC's program vice president for system software, described Chrome OS as potentially "problematic and challenging" for Microsoft. However, he stressed that it really offers just a limited subset of the functionality users get with a traditional rich desktop. Chrome OS is neither new nor revolutionary, he said.

"You take a standard Linux operating system and you're removing the stuff you don't need. You're locking it down and making it relatively tight and secure. That's fine, but the operating system is not revolutionary or different," Gillen said by phone. "You've got a browser that is probably optimized for some Google applications and services. But fundamentally, it's a browser on an operating system."

More details on Chrome OS can be found by reading Google's announcement here.




Ozzie Quizzed on Cloud Strategy at PDC

Microsoft officials briefed financial analysts on the company's cloud computing strategy at its Professional Developers Conference (PDC) in Los Angeles on Wednesday.

Questions were fielded by Ray Ozzie, Microsoft's chief software architect, as well as Doug Hauger, general manager of Windows Azure. Of course, the big news was Microsoft's announcement on Tuesday that its Windows Azure-based datacenters would open for commercial service on Jan. 1, with billing ramping up on Feb. 1.
Ozzie began the briefing with the analysts by laying out Microsoft's oft-stated vision of three screens and a cloud -- a world of the PC, phone and TV all connected through the Internet. Next he described his role in trying to jump-start Microsoft's cloud computing efforts to help support that vision.

"We started investing in this cloud computing transformation, and a belief in this cloud computing transformation, about three years ago now," Ozzie said. "What we believed was that things were happening so rapidly and so significantly that this could be really transformational in a way that would impact computing for the next 20 to 30 years. So we started from the ground up -- literally the ground."

By that, Ozzie meant that Microsoft started its cloud computing efforts with so-called generation-1 datacenters, which are simply racks of computers where Microsoft manually installed the operating systems. He explained that gen-1 is where most companies involved in cloud computing are today. Gen-2 is the stage where multiple servers are added, but Microsoft's recent datacenter deployments have been gen-3, such as the one Microsoft opened several weeks ago in Chicago.

Gen-3 means modularized containers or, as Ozzie described it, "cloud plug-and-play." Multiple servers are housed in a large building with roll-up doors. Microsoft backs trucks carrying 20-foot cargo containers filled with servers into that building and then stacks the containers on top of each other. Ozzie said that this gen-3 approach has cut Microsoft's supply-chain time down to weeks rather than months.

Gen-4 is even leaner than gen-3. With gen-4, the servers are kept in cargo containers on the outside, absent a large warehouse. The goal of this approach is to shrink the capital investment, Ozzie said. For gen-4, Microsoft called on the talents of Dave Cutler, the principal developer behind Windows NT and a Microsoft Technical Fellow, to build a new operating system for the cloud, which became Windows Azure. Ozzie distinguished the use of the Windows Azure cloud computing OS from the old-model approach of plugging a traditional OS into a datacenter. Those traditional models do not provide the value of cloud computing, he said.

As part of this vision, Microsoft retooled Exchange and SharePoint as online services. Ozzie said that it gives enterprises the option to switch to services, and that with the current economic crisis, demand for the online products has been "tremendous." Mail and document sharing will be the most important cloud-based applications for businesses, he said.

Hauger stressed cost savings, agility and ease of implementation for Microsoft's cloud services. While Microsoft has 6 to 7 million developers around the world, the Windows Azure platform isn't just for .NET developers. Microsoft added developer support with Eclipse integration and supports PHP, MySQL, Ruby, Python and Java on Windows Azure.

Windows Azure supports both consumer and enterprise needs, Ozzie and Hauger emphasized. Hauger listed a few enterprise trial users of Windows Azure, including Coca-Cola, which is using the platform for order processing and its product database. A company called RiskMetrics Group performs its financial analyses using the Windows Azure platform for massively parallel high-performance computing of up to 40,000 instances at a time. Companies like Kelley Blue Book and Domino's Pizza use Windows Azure to deal with peak demand issues that affect Web operations, according to Hauger. GlaxoSmithKline is using Microsoft's Business Productivity Online Suite (BPOS) service for e-mail federation using Active Directory, he added.

An analyst questioned Microsoft's BPOS services and the kind of margins Microsoft expected to see from them. Hauger didn't cite specifics, but claimed that Microsoft's Dublin gen-4 datacenter offers "dramatic economies of scale." Developers can then charge on top of that, he added.

Another question was about potential problems in porting legacy applications to the cloud. Ozzie said that it was "relatively straightforward to port those apps." Legacy apps designed for "one-box" computers can fail when moved to the cloud, he added, but "most people do a move, extend and transform"-type operation.

"They'll move something to the cloud, and then they'll start to mess around with it and understand what it's like to be in the cloud," Ozzie said. "And then they'll extend it by starting to take some aspect of it and take advantage of the cloud."

Moving things like e-mail, SharePoint and collaboration applications is "very straightforward," Ozzie added. Hauger said that moving some apps, such as moving SQL Server to SQL Azure, can be simple.

"If they're running Microsoft stuff now on premises, then if all they are doing is moving...they could move it as a virtual instance to our cloud or frankly to Amazon's because Amazon is a great [partner]," Ozzie said. He added that they'll still hit a roadblock when they want to transform it.

"The transformation will not be possible," Hauger said.




Monday, November 23, 2009

Microsoft's New AppFabric Serves Up Composite Apps

During the opening keynote at the Microsoft Professional Developers Conference in Los Angeles on Tuesday, Microsoft President of the Server and Tools Business Bob Muglia introduced AppFabric as a foundational component of Microsoft's Windows Server and Windows Azure strategy.

AppFabric comprises a number of already existing technologies bound together in an abstraction layer that will help establish a common set of target services for applications running either on-premise on Windows Server or in the cloud on Windows Azure. Burley Kawasaki, director of Microsoft's Connected System Division, described AppFabric as a logical next step up in .NET abstraction.
"We really see AppFabric as being the next layer that we add to move up the stack, so we can provide a developer with truly symmetrical sets of services, application level services that you can take advantage of, regardless of whether you are targeting on-premise or cloud platforms," Kawasaki said. "This is an additional level of abstraction that is built into the system model, just like SQL as an example uses T-SQL as a common model, whether or not you are on the cloud."

Kawasaki said that AppFabric will help fix a problem facing Microsoft developers today. ".NET is a consistent model today, but when you do File/New to write an app for Azure, your whole application looks a little different than it would if you were just writing an app on-premise."

Beta 1 of AppFabric for Windows Server 2008 R2 was announced during the opening PDC keynote on Tuesday. A community technology preview (CTP) version of AppFabric for Windows Azure is promised sometime in 2010. The goal, said Kawasaki, is to present a uniform developer experience and service interface for both on-premise and cloud apps.

"They should be fairly identical in terms of capabilities. That should be the goal, so you as a developer do not have to learn different skills. It will be as symmetrical as possible," Kawasaki said. "There will always be things you have to know about the deployment environment you are running on, but if we do our job right again, the AppFabric is taking the lower-level dependencies in mind."

AppFabric incorporates a host of technologies already familiar to .NET developers, including the Velocity distributed caching engine and the Dublin app-server extensions focused on Windows Workflow Foundation (WF) and Windows Communication Foundation (WCF), as well as management enabled via PowerShell scripting and Internet Information Services (IIS) integration.
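
To give a concrete flavor of the caching piece, here is a minimal, hypothetical sketch based on the Velocity/AppFabric cache client API. The namespace shown reflects the released AppFabric bits, the "default" cache name is an assumption, and the cache host endpoints are expected to come from the application's configuration file:

    using System;
    using Microsoft.ApplicationServer.Caching; // AppFabric cache client; earlier "Velocity" CTPs used a different namespace

    class CacheSketch
    {
        static void Main()
        {
            // DataCacheFactory picks up the cache host settings from app.config/web.config.
            var factory = new DataCacheFactory();
            DataCache cache = factory.GetCache("default"); // assumed cache name

            // Put a value into the distributed middle-tier cache and read it back,
            // rather than hitting the database on every request.
            cache.Put("greeting", "Hello from the AppFabric cache");
            string greeting = (string)cache.Get("greeting");

            Console.WriteLine(greeting);
        }
    }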

Kawasaki singled out Silverlight 4, the upcoming version of Microsoft's rich Internet application platform announced by Scott Guthrie on Wednesday, as a platform poised to take advantage of AppFabric.

"A big push in Silverlight 4 is building on top of the WCF service model and the RIA services," said Kawasaki. "One of the benefits you get from that is because AppFabric just manages WCF as a service, suddenly your Silverlight 4 backend now can be more scalable and reliable when using AppFabric as that scale out middle tier. I think that's one of those things that we'll start talking about more now that the Silverlight 4 announcement was made. Increasingly there are a very broad set of services that people are really excited about. People want to build Silverlight 4 apps and scale that and now there is a solution for that."

What should developers be doing with all this? Kawasaki urges developers to jump on the AppFabric for Windows Server beta. "My advice to developers would be to start using beta 1 for on-premises. That is the best way to get started whether you are using on-premise or in the cloud."




Sunday, November 22, 2009

Interop New York: Customers Press Vendors for Private Clouds

Three prospective customers representing large enterprises listened to pitches for cloud services from Amazon, Google and Microsoft. But the customers responded that they have too many misgivings about security and compliance to make commitments to public cloud services.

The scenario played out Thursday during the keynote panel session at Interop New York where key officials of the three vendors pitched their respective cloud offerings and then were grilled by the prospective enterprise customers on the panel.

"Based on the economics and the technical service model, there's a lot that's hugely attractive," said Louis Gutierrez, CIO Emeritus of the Commonwealth of Massachusetts and CIO Emeritus of Harvard-Pilgrim Healthcare. "[But] issues around risk management really are central to any movement in that direction."
Rico Singleton, deputy CIO for the state of New York, agreed. "Can you give me a private cloud that can provide all the benefits that you provide now on my private network closed to the outside, and still be able to give me similar ROI?" Singleton asked.

"It can be done but I would strongly encourage you to do the easy things first. Don't make it more complicated than it is," said Don Dodge, a developer advocate at Google who joined the company this week. Dodge was among those laid off by Microsoft earlier this month, much to the surprise and chagrin of the developer community.

Dodge cited the city of Los Angeles, which earlier this month said it would migrate 30,000 employees to Google's managed Gmail service for $50 per year. But Singleton questioned the cost.

"Once you start looking at BlackBerry services and archiving services and record retention services, that's not included in your $50," Singleton said. "You've got to look at the total cost of being able to provide the mail services." Even more concerning, though, is having the data outside of its control.

Deploying more transaction-oriented services will be even more difficult, he added in a follow-up interview. "We are always looking at ways that we can reduce our cost, in terms of transaction-oriented services like licensing and permitting, but I haven't seen any providers that have service-oriented applications that are conducive to the cloud," he said.

"It's going to be a tough sell," added John Merchant, assistant vice president at The Hartford, a large insurance company, in an interview. "As a Fortune 500 company with highly regulated data and a very conservative outlook, it's going to be difficult for any insurance company or any financial institution of any size to migrate any data to the cloud."

It may be some time before the cloud providers are able to answer these concerns. Amazon Web Services Vice President Adam Selipsky acknowledged it cannot offer datacenters that are hosted just in New York state.

Yousef Khalidi, a distinguished engineer for cloud infrastructure at Microsoft, said he is aware of the desire of larger enterprise customers for private clouds. Microsoft, which rolled out its Azure cloud service this week at the Professional Developers Conference in Los Angeles, said the next phase of its cloud evolution will address that need.

"We believe there is a place in the spectrum for private clouds that offer many of the benefits of the cloud including a scale-out architecture," Khalidi said. "Unlike public clouds, private clouds can be kept under lock and key. We have the benefit of capacity on demand, global reach and all of the benefits of a large shared multi-tenant infrastructure."

In a follow-up interview, Khalidi said Microsoft has not set a timetable for delivering a private cloud offering. "Stay tuned," Khalidi said. "We have said that we believe in this full spectrum and we will deliver using our software stacks. We have a lot of IP involved here but the dates and the like, we haven't discussed yet."

Meanwhile, Khalidi believes the early adopters of Azure will be those building Web 2.0-type applications and applications used by multiple business partners, such as extranets and collaborative portals.

"If you're a company that works with suppliers, all of these companies have these scenarios and many of these are amenable to moving to the cloud sooner than others," he said. "Others may follow when the technology and regulations catch up, others may never go to the cloud which is fine with us. We're not going to try hit everyone with the same thing."




Silverlight 4 Beta Launched

Scott Guthrie weathered a few demo-day glitches to produce an informative and well-targeted Day 2 keynote. Guthrie dove into details of the upcoming Silverlight 4, which he announced went to beta today and is available for immediate download. The final version of Silverlight 4 is scheduled to ship in the first half of 2010, Guthrie said.

The next version of Microsoft's rich Internet application platform will also add a host of long-requested capabilities, including programmatic printing support, programmatic access to the clipboard, rich text editing, and support for mouse wheel input for all standard Silverlight controls.
Perhaps most compelling was the addition of local file system access and out-of-browser execution. These enhancements, which help bring Silverlight up to par with Adobe's AIR framework, drew a round of applause from the PDC audience. The capability will enable devs to mark an application as trusted, allowing it to gain access to local system resources. When run on the client, the Silverlight app will throw a dialog box requesting system access, Guthrie said.

"This mode works on both Windows and the Mac. All developers need to do is add a checkbox that says the app is trusted," Guthrie said. "You have local file system access. You can now read and write from the file system."

Guthrie, in typical fashion, kept the hits coming. He mixed slow, somewhat entertaining demos with rapid-fire feature descriptions.

Also coming in Silverlight 4 is drag-and-drop support, as well as drop target support to enable streaming of selected files. Silverlight 4 will allow developers to host HTML as a control within applications, providing the ability to interact with HTML elements. Guthrie showed a demo in which a Web page background image was broken into jigsaw puzzle pieces, then doubled down on the demo by showing a streaming video broken into jigsaw pieces as well.

Silverlight 4 promises to improve performance, Guthrie said. The next version will take full advantage of the just-in-time (JIT) compiler in the .NET Common Language Runtime (CLR), potentially doubling the performance of processor-intensive applications. Improvements to Silverlight startup will also significantly speed app loading, while the Profiler API will give developers much improved app optimization. This announcement drew a round of applause as well.

The next version of Silverlight adds support for the Google Chrome browser, even as it provides heightened integration with the .NET stack. Silverlight 4 provides support for RIA Services, which now uses Windows Communication Foundation at its core to enable more robust applications.

Silverlight 4 also improves the media handling capability of the platform, adding multicast streaming to enable delivery of media to tens of thousands of clients without crushing the network. Guthrie also demoed integrated Web cam and microphone support, which provides access to interactive graphical effects. He showed how a barcode app integrates with the Silverlight Web cam to scan a barcode pattern on a book and immediately bring up the price for the book at several online stores.
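
The webcam support Guthrie demoed follows Silverlight's CaptureSource pattern. A rough sketch of wiring a camera preview to a XAML element might look like the following; PreviewRect is an assumed Rectangle defined in the page's XAML:

    using System.Windows.Controls;
    using System.Windows.Media;

    public partial class WebcamPage : UserControl
    {
        private readonly CaptureSource _capture = new CaptureSource();

        public WebcamPage()
        {
            InitializeComponent();

            _capture.VideoCaptureDevice =
                CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();

            // Silverlight prompts the user before granting camera and microphone access.
            if (CaptureDeviceConfiguration.AllowedDeviceAccess ||
                CaptureDeviceConfiguration.RequestDeviceAccess())
            {
                var brush = new VideoBrush();
                brush.SetSource(_capture);
                PreviewRect.Fill = brush; // PreviewRect: a Rectangle in the page's XAML (assumed)

                _capture.Start();
            }
        }
    }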

Silverlight 4 looks like a winner. Devs have long complained about issues large and small with Silverlight, from the lack of mouse wheel support, to the need for printer output, to the inability to access local system resources and run outside of the sandboxed browser environment. Silverlight 4 looks to put an end to those complaints, and seems to position the platform as a mainstream line-of-business development and delivery platform.




Pluralsight Launches New .NET Training Content and Subscriptions

Pluralsight announced several additions to its On Demand! .NET developer training library at the Microsoft Professional Developers Conference 2009. These include new training content, improved navigation of training offerings, and flexible subscription plans for customers.

The Pluralsight team increased the amount of content in the On Demand! library to 20 courses, increasing the offerings by almost 300 percent. New content includes courses on iPhone ASP.NET development, SharePoint services, Silverlight 3, BizTalk 2009 and Managed Services Engine (MSE). There are also new how-to offerings and beginner tutorials in addition to Pluralsight’s advanced development training.
Finding what you need among these new courses is simplified by some new features of the Pluralsight Web site, specifically the ability to search courses by technology or author via the search toolbar. In addition, downloadable videos are available in formats optimized for Zune and iPhone. Subscribers also have access to course slides, demo projects, and hands-on labs.

Access to On Demand! training is now available through three subscription offerings. The Standard subscription allows access to how-tos, tutorials and course videos online and is available for $99 per month or $499 per year.

A Professional subscription ($999 per year) adds the ability to download videos for Zune and iPhone. And the Premium subscription ($1499 per year) gives developers access to the hands-on labs and weekly Office Hours, where developers can ask questions about webcasts, On Demand! courses, labs, or even their own projects.

For those who want to peruse the content before committing to a subscription there’s also a free guest pass available.

For additional information on the subscription plans and course offerings, visit http://www.pluralsight.com. To get up-to-the-minute updates on new course offerings and other Pluralsight news, you can follow the company on Twitter at http://twitter.com/pluralsight.




Thursday, November 19, 2009

Beta Wave Hits Office and SharePoint Developers at PDC09

The final leg of the PDC09 keynote today focused on Office and SharePoint 2010 development. Microsoft released the first public betas of both technologies. The bits were made available to MSDN and TechNet subscribers on Tuesday.

The company also released the public betas of its Project 2010 management software and Visio 2010 modeling tool. The Office Mobile client for Windows Mobile 6.5 was also made available today on Windows Marketplace for Mobile. It includes the new SharePoint WorkSpace Mobile 2010 in addition to mobile versions of Excel, Word, PowerPoint and OneNote.
"We really do believe in this notion of three screens and having a consistent experience across all of them," said Kurt DelBene, Microsoft's senior vice president of the Office Business Productivity Group, referencing the three screens and a cloud vision put forth in Tuesday's keynote. "And we are delivering that through the 2010 set of products on client and server and also through the Office Mobile clients that you can take a look at today."

Microsoft also announced an Office 2010 feature called Outlook Social Connector, which enables users to access SharePoint 2010, Windows Live and third-party social networking data from within Outlook. DelBene is encouraging third-party providers and developers to build extensions for the technology and announced that Microsoft is releasing the Outlook Social Connector SDK today. LinkedIn is slated to offer a provider for the connector in 2010, according to Microsoft.

Derek Burney, Microsoft general manager of SharePoint Tools and Platforms, demonstrated Visual Studio 2010 integration with SharePoint 2010 by building a dashboard that tracks performance data from a race car. The SharePoint 2010 application, which had a Silverlight 4 front-end, accessed telemetry data from the Windows Azure cloud. Burney demonstrated how that cloud data could also be accessed from Excel 2010 by adding a SharePoint button to the Excel ribbon. Earlier in the keynote, Scott Guthrie, Microsoft corporate vice president of the .NET Developer Division, announced today's release of the Silverlight 4 beta.

DelBene quantified the ecosystem opportunity for SharePoint developers and partners today at $5.6 billion in revenue, expected to reach $6.1 billion in 2011. He said Microsoft has sold 100 million licenses to roughly 17,000 SharePoint customers.

Microsoft estimates that there are more than 100,000 SharePoint developers. "We anticipate with the next release that that will cross over the million mark," DelBene said.

The company also announced plans to further its Duet partnership with SAP. The companies will work together to provide Duet Enterprise, SharePoint 2010 and Office 2010 templates and prepackaged solutions that help developers integrate SAP data models and processes with SharePoint. It is expected to ship in the second half of 2010.




Preview of Reactive Extensions for .NET Available

Microsoft announced the availability of a Reactive Extensions for .NET (Rx) preview on Wednesday, in conjunction with its Professional Developers Conference 2009, held this week in Los Angeles.

Rx is a .NET Framework extension library that developers can use to build “reactive applications”—applications that can react to changes in the collections in their environment. In the past, this event-based behavior has typically been handled via locks and event handlers. With Rx, developers can write declarative code that leverages changes in the properties of observable collections to fire off push-based, asynchronous behavior in their applications.
Using the Task Parallel Library for concurrency, Rx builds on the standard LINQ sequence operators in the .NET Framework 4 and Silverlight 4 to enable push-based notifications. Other .NET entities can leverage these notifications, greatly simplifying the creation of asynchronous applications.

Any .NET-compliant language can use Rx, from the familiar Visual Basic and C# to newer languages like F#.
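
As a small illustrative sketch of the push-based model (the namespace shown follows later Rx releases; the preview assemblies announced this week are organized a little differently), an observable sequence is composed with the familiar LINQ operators and then consumed by subscribing rather than enumerating:

    using System;
    using System.Reactive.Linq; // namespace in later Rx releases; the 2009 preview bits differ

    class RxSketch
    {
        static void Main()
        {
            // An observable that pushes an incrementing value once per second.
            IObservable<long> ticks = Observable.Interval(TimeSpan.FromSeconds(1));

            // Filter and project declaratively, then subscribe to receive notifications.
            IDisposable subscription = ticks
                .Where(t => t % 2 == 0)
                .Select(t => string.Format("Even tick #{0}", t))
                .Subscribe(Console.WriteLine);

            Console.ReadLine();     // let a few notifications arrive
            subscription.Dispose(); // unsubscribing stops the notifications
        }
    }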

An example of Rx in use is the Silverlight Toolkit. Microsoft used Rx to build the Silverlight Toolkit Drag and Drop Framework, which uses the Windows Presentation Foundation (WPF) to create and react to drag-and-drop operations within a Silverlight application. Rx was also used by Microsoft in the Silverlight unit testing framework to write reliable, readable asynchronous test code. Rx drastically reduced development and testing time for the Silverlight team.

In addition to the observable collection library extensions, there are a number of experimental items included in this release of Rx. These include types and namespaces to extend Rx functionality to enumerable collections, as well as a .NET Framework 3.5 version of PLINQ.

The MSDN DevLabs site provides Rx preview downloads for .NET Framework 3.5 SP1, .NET Framework 4 Beta 2, and Silverlight 3. Both Visual Studio 2008 SP1 and Visual Studio 2010 Beta 2 are supported development environments.




PDC09 Preview: Azure Launch Among Highlights at Microsoft's Developer Conference

Three screens and a cloud: That's the vision that will be put forth this week by Chief Software Architect Ray Ozzie at Microsoft's Professional Developers Conference as attendees gather in Los Angeles to learn more about the company's developer platform strategy and how to take advantage of the Windows Azure cloud computing platform, Windows 7, Visual Studio 2010 and the .NET Framework 4.

PDC09, Nov. 17-19, officially kicks off with an 8:30 a.m. PST keynote by Ozzie and Bob Muglia, Microsoft's president of the Server and Tools Business. Scott Guthrie, corporate vice president of the .NET Developer Platform, and Kurt DelBene, senior vice president of the Office Business Productivity Group, along with a "special guest," are slated to give the keynote on Wednesday. Both keynotes will be streamed live. Workshops are being held today.
This conference comes just one year after the PDC08 confab, held the last week of October in the same location. "This year is about catching up with a knowledge deficit on real product, rather than creating a pipeline of early-release bits," says Andrew Brust, chief of new technology for consultancy twentysix New York and Redmond Review columnist/blogger for Visual Studio Magazine and RedDevNews.com.

Software plus services--the vision espoused at PDC08, which enables companies and their developers to move workloads seamlessly between on-premise infrastructures and the cloud--is beginning to take shape. The Windows Azure Platform, Microsoft's platform-as-a-service, will be officially launched during the Ozzie/Muglia keynote on Tuesday. The Azure Platform, which now consists of the Windows Azure cloud operating system, .NET Services connectivity and SQL Azure relational data storage, is scaled back significantly in version 1 from what was announced last year; several changes stem from developer feedback.

On Friday, Microsoft released the November CTP of the Windows Azure SDK and Tools for Visual Studio 2008 and Visual Studio 2010 Beta 2. Among the new features is support for variable-size virtual machines. "Azure VMs now scale up and out," says OakLeaf Systems' blogger and principal Roger Jennings, in an email. "Something Azure developers have been asking for and needed to compete with Amazon Web Services EC2."

Microsoft is also expected to unveil its business portal for Azure's consumption-based pricing model, which was announced at the company's Worldwide Partners Conference in July. Expect several software vendors to take the stage to show third-party support for the Windows Azure Platform.

Microsoft's cloud will be officially launched, but developers can also expect to learn more about the three screens, a browser front-end (to the back-end cloud) that runs on a phone, PC and TV. In Microsoft's vision, the programming models and tooling across on-premises, mobile and outside infrastructure are based largely on skills already familiar to its Visual Studio developers. Even so, the number of platform, server and tools updates in the last year has been astounding.

"Since last year we’ve had official releases of ASP.NET MVC, Silverlight 3, RIA Services and more. Last month we saw the release of Windows 7 and Server 2008 R2. This month it was Exchange 2010 and Azure’s coming [this] week," says Brust, in an email. "Then we have the impending hit parade of new releases (Visual Studio 2010/.NET Framework 4.0 and all the parallel programming APIs and tools, the .NET Reactive Framework, Office 2010, Office Web Apps. SharePoint 2010, BPOS 2010, SQL Server 2008 R2, and SQL Server Modeling).

"Microsoft has a lot of housekeeping, education-wise and messaging-wise, to do and that’s what this PDC is about," says Brust. "There will also be the more traditional futures-focused content but that looks like it will be secondary this time around."

Among the bits offered up at the show, developers can expect the Windows Azure SDK and tooling, SharePoint 2010 beta 1, Silverlight 4 CTP, and the first SQL Server Modeling CTP, which is the new name for "Oslo." Windows Mobile 7 and Silverlight for Mobile should be in the mix, but a discussion of both technologies was expected at PDC08 and developers were disappointed.

"Developers are all excited about multi-touch interfaces, mobile technology and creating new exciting user experiences," says Infragistics' product manager Andrew Flick, in an email. "I think we can expect to see demos and new products around these things."

Infragistics is sponsoring the Tuesday night reception. The company is also celebrating its 20th anniversary, so "we're gonna party like it's 1989," says Flick.

In addition to the keynotes and sessions, for the first time this year Microsoft's Channel 9 is broadcasting live all day during the conference.

-Michael Desmond contributed to this article.




Tuesday, November 17, 2009

Android, BlackBerry and iPhone Take Share from Windows Mobile

Despite the fact that smart phone sales grew by 12 percent over the past year, the share of new devices shipping with Microsoft's Windows Mobile operating system has plunged, according to third quarter figures released by Gartner last week.

Only 7.9 percent of smart phones sold last quarter were equipped with Windows Mobile, down from 11.1 percent during the same period last year. According to the report, 3.2 million devices were Windows Mobile-based, compared with 4 million last year.

Microsoft's share is eroding as Apple's iPhone continues to grow. The iPhone accounted for 17.1 percent with more than 7 million shipped during the quarter, compared with a 12.9 percent share based on 4.7 million shipped last year.  
It appears Windows Mobile is also losing ground to Google's Android platform, which had no market share last year but accounted for 3.5 percent last quarter, based on 1.4 million units shipped.

"Android has shown a lot of potential in the consumer space and that is where Microsoft has been struggling the most," said Gartner research director for mobile devices Carolina Milanesi, in an email.

Early indications suggest that momentum for Android will continue, with the recent release of the Motorola Droid, the first smart phone based on Android 2. A number of handset vendors once committed to Windows Mobile have started adding Android phones to the mix of their offerings, including Motorola, HTC, LG Electronics and Samsung.

Google's Android offers more favorable license fees than Windows Mobile, as does Nokia's Symbian, which also is moving to an open source model, Milanesi noted. "This leaves Microsoft at a cost disadvantage," she said.

Also gaining a healthy jump in share was Research in Motion, maker of the BlackBerry, which accounted for 20.8 percent of smart phones shipped, compared with 15.9 percent a year earlier. Some 8.7 million BlackBerries shipped in the quarter, up from 5.8 million last year.

Microsoft's share decline should stabilize as more smart phones based on the company's newest OS upgrade, Windows Mobile 6.5, hit the market, Milanesi said. The first phones were released in September.

But Microsoft's prospects for improving its share won't become clear until its next-generation mobile OS, dubbed Windows Mobile 7, ships, she noted. Microsoft Chief Software Architect Ray Ozzie indicated today that the company will start to offer more details on Windows Mobile 7 next spring, making his remarks in passing during his opening keynote address at Microsoft's Professional Developers Conference 2009 (PDC 2009) in Los Angeles.

Microsoft isn't the only mobile player to have lost share. Symbian, which accounted for 49.7 percent of all smart phones shipped worldwide a year ago, saw its share drop to a still-respectable 44.6 percent, with 18.3 million units shipped.

In a positive development for Windows Mobile users, Microsoft's Windows Marketplace for Mobile is now available to users on phones running Windows Mobile 6.0 and 6.1, as promised. Since its release, the marketplace had been available only to those with Windows Mobile 6.5-based phones.




Adobe Releases Betas of Flash 10.1 and AIR 2

Adobe today released the public betas of its namesake Flash Player 10.1 and its next-generation Adobe Integrated Runtime, dubbed AIR 2.

Flash 10.1 is the first of its runtime releases based on Adobe's Open Screen Project, aimed at providing high-definition content across PCs and mobile devices using HTTP streaming. It also supports Adobe's new Flash Access digital rights management solution and offers H.264 hardware decoding on Windows-based PCs, netbooks and mobile devices.

The beta releases come as Microsoft is expected to disclose the future of its Silverlight RIA platform at the Professional Developers Conference in Los Angeles this week. Both companies are in a heated battle to offer developers tools to build rich content. Adobe Flash is nearly ubiquitous on desktop and notebook computers. The company has stepped up its battle with Microsoft with the release of AIR.
The AIR 2 beta extends Adobe's desktop rich Internet application runtime with support for mass storage devices, native application processes, and peer-to-peer and UDP-based networking. Unlike Flash, AIR is not as ubiquitous -- Adobe said it is installed on 200 million systems. Both Flash Player 10.1 and AIR 2 also introduce support for multi-touch and gesturing.

Adobe had announced both last month at its annual Adobe MAX conference in Los Angeles. The company said the new Flash Player 10.1 will support Microsoft's Windows Mobile, Palm webOS, Google Android and Nokia's Symbian OS. Also, Adobe and Research in Motion said they will co-develop a version of Flash that will run on the BlackBerry. None of those features are in this first beta and are not expected to appear until sometime next year.




Ray Ozzie and Bob Muglia Kick Off PDC 09

Microsoft Chief Software Architect Ray Ozzie this morning kicked off the 2009 Microsoft Professional Developers Conference in Los Angeles, with a keynote speech that focused heavily on Windows Azure and cloud computing. Ozzie's presentation provided a high-level overview that focused largely on strategic messaging and providing updates to developers tracking Microsoft's cloud efforts.

Ozzie's high-level tour was followed by a more nuts-and-bolts dive into Windows Azure features and issues by Microsoft President of the Server and Tools Business, Bob Muglia. Muglia's presentation included a number of detailed demos that highlighted newly announced features and spotlighted the opportunity for developers to migrate their on-premise .NET code to public and private Windows Azure cloud environments.
Among the key issues addressed during the two-hour session were the announced production date for Windows Azure, the launch of the beta of Application Fabric middleware and the announced availability of the "Dallas" CTP, a data catalog and marketplace that the company hopes will inspire developers to explore and manipulate available public and private data sources. During his presentation, Muglia also announced work on project "Sydney," expected to go into beta next year, which will enable customers to connect servers inside their data centers with cloud services running inside of Azure.

Ray Ozzie: Hit 'Em High

Ozzie kicked off the PDC 09 event with a high-level pitch for Windows Azure and Microsoft's unified development message, while making an occasional plug for Windows 7 as a superior client for enabling compelling user experience. He revealed that Windows Azure will officially go to production on January 1, with the service available for one month on a no-fee basis while Microsoft works through initial deployment assessment. Microsoft will start billing for Windows Azure access on February 1, Ozzie said.

Ozzie stressed that Microsoft was working to provide a single, cohesive platform for developing apps on premise and on the cloud, and both Ozzie and Muglia talked at length about migrating existing Windows Server applications to Windows Azure.

"We at Microsoft have one simple strategy -- that is to focus on leverage and seamlessness in everything we do," Ozzie said.

Ozzie hinted at a number of new announcements at the show, including a Scott Guthrie presentation for Wednesday that he said would address the next version of Silverlight. He also gave a nod toward Microsoft's struggling mobile platform, noting that it would be addressed at the next MIX Conference in 2010. "You'll hear about developing apps for the next generation of our Windows phone," Ozzie said.

Ozzie also pointed out the long list of improvements to Windows Azure since the initial CTP was released at PDC 08 last year. He noted the reformulation of SQL Azure, the geo-replication of storage across paired Azure data centers, the creation of the container data center model, and the introduction of Azure X drive, which provides mountable NTFS volumes.

Ozzie seemed particularly enthused about a newly announced project, code named "Dallas," which promises to give Windows Azure developers the ability to uniquely explore, manipulate and present data stored on public and private sources.

"Dallas makes the whole world of data better than the sum of its parts by creating a uniform discovery mechanism for data, a uniform binding and access mechanism for data, a uniform way of exposing trial data sets for developers, a uniform licensing model so data from multiple providers can be easily joined and recombined," Ozzie explained. "By delivering data as a service, our aspiration is that Dallas might catalyze a whole new wave of remixing and experimentation by developers."

Microsoft Technical Fellow Dave Campbell provided a demo showing how developers can use the Dallas Service Explorer to look at different classes of data in a data feed, and preview the data to visualize it. In the demo, Campbell showed how Dallas can invoke Microsoft's PowerPivot add-in for Excel to render and manipulate data. He then went on to use drag-and-drop tools to build a quick WPF datagrid app to render a custom dataset.

Bob Muglia: On the Down Low

Bob Muglia announced the beta of Application Fabric, a new application server for Windows Server that Muglia said extends the familiar IIS environment to provide a platform for highly available, scaled out, middle-tier services like Windows Workflow Foundation and Windows Communication Foundation services. A database cache is also provided in the App Fabric beta. A beta version of App Fabric for Windows Azure will be available next year, Muglia said.

"What App Fabric will do is it will take and extend the environment you are very familiar with in IIS and provide you with a platform for building scaled out, highly available, middle tier services such as WCF-based services and Windows Workflow-based services. The idea is that this creates an infrastructure, a very easy to manage infrastructure, where we will do that failover for you, we will keep the system highly available, and we will do the balance between those things, so you can build your applications in a straightforward way," Muglia explained. "In addition to middle-tier services like workflow and WCF we'll also provide a database cache which is an important part of speeding up and providing better performance for the applications you are building."

Among other things, Muglia announced that Windows Identity Foundation has been released to manufacturing and that Go Live licenses are available for Visual Studio 2010 and .NET 4.

Throughout the session, both Muglia and Ozzie talked about helping developers migrate applications from Windows Server to Windows Azure.

"Microsoft is focusing on taking the investments you made in your existing environment in Windows Server and moving that forward," Muglia said.




Monday, November 16, 2009

Windows 7's UAC Slammed by Sophos

Microsoft and software security firm Sophos are at it again, this time arguing about the effectiveness of Windows 7's user account control (UAC) feature.

UAC is an administrative access control that provides security warnings to end users each time a system change is about to be made. It first showed up in Vista and Windows Server 2008, enabling better administrative control of user network privileges. Microsoft refined the UAC in Windows 7 after users complained about seeing annoying pop-ups.

However, the issue for Chester Wisniewski, a Sophos security staffer, was purely about the security protection afforded by UAC in Windows 7.
"UAC's default configuration is not effective at protecting a PC from modern malware," Wisniewski wrote in a Sophos blog post last week.

Sophos came to that conclusion based on its testing of UAC. Those tests involved a clean install of Windows 7, running without antivirus protection. Next, the Sophos team added "10 unique samples" of malware to the PC. UAC failed to block eight of the 10 samples from running, according to the blog.

Paul Cooke, Microsoft's director of Windows enterprise client security, responded to Sophos' claims in a blog post this week.

Cooke pointed out that malware usually gets onto a workstation via Web browsers. Had the malware been encountered via Internet Explorer 7 or IE 8, those browsers would have notified the users of the threat via Microsoft's SmartScreen filter prior to download. Cooke claimed that Sophos' test results are skewed because the samples were apparently just added to the computer and run.

However, Cooke did suggest a common ground for agreement, saying that Windows 7 shouldn't be run without antivirus software. He accused Sophos of grandstanding to sell its software, and inserted a plug for Microsoft Security Essentials, a free consumer antimalware solution that was rolled out in late September.

"While, I'm not a fan of companies sensationalizing findings about Windows 7 in order to sell more of their own software, I nevertheless agree with them that you still need to run anti-virus software on Windows 7," Cooke wrote. "This is why we've made our Security offering available for free to customers."

Sophos makes security solutions for small businesses and the enterprise, so the consumer-oriented Microsoft Security Essentials isn't supposed to be a direct competitor to Sophos' products.

The UAC spat is just the latest disagreement between Sophos and Microsoft. In September, a Sophos official complained about the security of XP Mode, a virtualized Windows XP desktop that runs on Windows 7. In that discussion, the Sophos official called XP Mode "a potential security disaster."




Windows SMB Subject to Denial-of-Service Attack

Microsoft is continuing to investigate holes in its Server Message Block (SMB) file-sharing protocol used in Windows.

Late Friday, Microsoft put out yet another Security Advisory, saying it was looking into "new public reports of a denial-of-service vulnerability" in SMB.

The reported exploits affect SMBv1 and SMBv2 on the Windows 7 and Windows Server 2008 R2 operating systems, according to the software giant.

Vista, Windows Server 2008, XP, Windows Server 2003 and Windows 2000 are not affected.
"Microsoft is aware of public, detailed exploit code that would cause a system to stop functioning or become unreliable," said Dave Forstrom, a spokesman for Microsoft Trustworthy Computing. "If exploited, this DoS vulnerability would not allow an attacker to take control of, or install malware on, the customer's system but could cause the affected system to stop responding until manually restarted."

Friday's advisory is the second SMB-related advisory Redmond has issued in as many months; the first came in September.

Forstrom said the default firewall settings on Windows 7 will help block attempts to exploit this latest DoS issue.

He added that while Microsoft is not currently aware of active attacks, customers should "review and implement the workarounds outlined in the advisory until a comprehensive security update is released."




Windows 7 Download Utility Contains GPL Code

Microsoft confirmed on Friday that code licensed under the open source GNU General Public License landed in a utility it offers for installing Windows 7.

Microsoft's tool, called the "Windows 7 USB/DVD Download Tool," takes a downloaded Windows 7 ISO image and turns it into bootable USB or DVD installation media. It's particularly helpful for installing Windows 7 on netbooks, which typically lack CD or DVD drives.

The tool uses General Public License version 2 (GPLv2) code, confirmed Peter Galli, open source community manager for Microsoft's Platform Strategy Group.
"After looking at the code in question, we are now able to confirm this was indeed the case, although it was not intentional on our part," Galli explained on Microsoft's Port 25 blog. 

Galli said that the tool had been developed by a "third party" and that Microsoft failed to detect the GPLv2 code when it reviewed the project. The tool previously was available through the Microsoft Store, but it now appears to have been removed. At press time, CNet's download.com Web site still offered a free download of the tool.

Windows blogger Rafael Rivera Jr. first noted in his blog that the tool's code had been "lifted" from the ImageMaster project on Microsoft's open source CodePlex site. The ImageMaster project has since been removed from CodePlex.

However, rather than making the tool's code go away, Microsoft plans to make it publicly available, in accordance with the GPLv2 license. Users of GPL code can freely modify it for their own purposes, but if they distribute the modified code, the license requires them to make the corresponding source available as well.

"As a result, we will be making the source code as well as binaries for this tool available next week under the terms of the General Public License v2 as described here, and are also taking measures to apply what we have learned from this experience for future code reviews we perform," Galli wrote.

Violations of the GPL are a daily occurrence, according to Bradley M. Kuhn, policy analyst and technology director at the Software Freedom Law Center. He provided some reporting guidelines for those helping track GPL violations in a recent blog post.

In other apparent software intellectual property news, the Groklaw Web site suggested on Wednesday that Microsoft had received a patent covering sudo, a command-line utility commonly used in open source Linux and Unix-like operating systems to run commands with elevated privileges. Groklaw, which tracks free software and open source legal issues, pointed to Patent No. 7617530, which was recently granted to Microsoft by the U.S. Patent and Trademark Office. The patent's description tracked with that of sudo, according to Groklaw.

However, Seattle-PI reporter Nick Eaton interviewed one of the maintainers of sudo, Todd Miller, about the patent. Miller thought that Microsoft's patented technology differed from sudo.

"I really don't think that that patent applies to sudo," Miller told the Seattle-PI. "I read through it and it seems to be something completely different."




Friday, November 13, 2009

Opera Beats IE in Browser Web Security

Microsoft's efforts to combat server-side Web vulnerabilities, as well as to patch its Internet Explorer client, may be paying off.

A report from application security firm Cenzic, announced this week, looked at various Web security issues in the first half of this year. Vulnerabilities in the top four browsers were one focus of the study, "Web Application Security Trends Report: Q1-Q2, 2009" (PDF download).

The report found IE placing fairly well. It was second only to the Opera browser in protecting against Web vulnerabilities.
"Of the browser vulnerabilities, Firefox had 44 percent of the total, but perhaps the biggest surprise was Safari, which formed 35 percent of the browser vulnerabilities. Internet Explorer was third, with 15 percent, and Opera was at 6 percent," the report noted.

Mozilla's Firefox clocked in as the most vulnerable browser on the Web, according to Cenzic's report -- a disappointing showing for IE's closest rival. Firefox reportedly has an estimated 330 million users and recently passed its fifth anniversary, having been launched on November 9, 2004.

Internet Explorer is still the most used browser, followed by Firefox, Apple Safari, Google Chrome (which Cenzic didn't study) and Opera.

In addition to looking at browser security, the report pointed to other areas of concern. Cenzic found that 78 percent of the total vulnerabilities were due to Web components, a share that has grown since last year's report.

Microsoft at least seems somewhat attuned to the issue. A large theme in Microsoft's September patch cycle had to do with plugging such Web component vulnerabilities.

Cenzic also found bugs in Web servers, browser plug-ins and Microsoft's ActiveX controls. ActiveX has been another priority for Microsoft's security team, which issued a security advisory on the matter in July.

The most striking thing about the report's findings is the broad apathy enterprise pros show toward addressing emerging threats on the Web, according to Mandeep Khera, chief marketing officer at Cenzic.

"In spite of the fact that vulnerabilities are so easily identifiable and widely exploited by hackers -- and there are now low-cost, turnkey SaaS solutions available -- businesses are not focused on securing their Web applications," he said in an e-mail statement. "[The vulnerabilities] are a serious and potentially lethal blind spot for businesses."




Interop Vendor Adds Transaction Bridging

Java/.NET interoperability solutions provider JNBridge is adding features to the next release of its core interop tool that will integrate distributed transactions between enterprise Java and .NET platforms, the company announced this week. JNBridgePro 5.0 will support what the company calls "generalized cross-platform transaction bridging," essentially making transactions between Java- and .NET-based components and applications work whether those transactions are Java-to-.NET or .NET-to-Java.

A transaction in this context is an individual unit of work, an "all or nothing" sequence of operations that must succeed or fail in its entirety, explained Dr. Wayne Citrin, JNBridge's chief technology officer. "Think of an electronic banking transaction," he said. "If all the operations of the transaction are successful, then the transaction can be committed. If any one operation fails, the whole transaction must be aborted, or rolled back."
The industry-standard protocol used by transaction monitors to guarantee that a particular sequence has succeeded as a unit is known as "two-phase commit," because it consists of a commit-request phase and a commit-or-abort phase. JNBridgePro 5.0 is designed to manage the two-phase commit protocol for transactions between distributed Java and .NET platforms, while keeping it mostly transparent to the user, Citrin said. An "abort," also called a rollback, on either side will cause actions on both sides to be rolled back; a commit on both sides will cause both sides to be committed. This capability works with all vendors' JEE implementations, he said.
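
To make the "all or nothing" semantics concrete, here is a minimal sketch of the .NET side of such a unit of work using the standard System.Transactions API -- not JNBridgePro's own interface. In Citrin's description, the bridge's job is to enlist the corresponding Java-side work in the same two-phase commit so that both sides commit or roll back together.

    // A minimal sketch of an all-or-nothing unit of work on the .NET side, using the standard
    // System.Transactions API rather than JNBridgePro's own (the bridge would enlist the Java
    // side in the same two-phase commit). The Debit/Credit helpers are hypothetical.
    using System;
    using System.Transactions;

    class TransferSketch
    {
        static void Transfer(decimal amount)
        {
            using (var scope = new TransactionScope())
            {
                Debit("checking", amount);   // operations that enlist in the ambient
                Credit("savings", amount);   // transaction (e.g., through ADO.NET)

                scope.Complete();            // vote to commit; skipping this (or throwing)
            }                                // causes a rollback when the scope is disposed
        }

        static void Debit(string account, decimal amount) { /* enlisted resource work */ }
        static void Credit(string account, decimal amount) { /* enlisted resource work */ }

        static void Main()
        {
            Transfer(100m);
            Console.WriteLine("Transfer committed.");
        }
    }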

"The idea is that if there are two active transactions, they are going to be unified," he said. "The Java side doesn't know about the .NET side, and the .NET side doesn't know about the Java side."

Citrin, who called transactions "the glue that makes e-commerce work," argued that there have been few successful remedies for the incompatibilities between enterprise Java and .NET transactions and transaction managers. He pointed to TIP (Transaction Internet Protocol), now obsolete, and WS-AT (Web Services Atomic Transaction), which requires cross-platform transactions to be exposed as Web services.

Ovum analyst Tony Baer agrees that incompatibilities between Java- and .NET-based transactions have been an issue for companies involved with e-commerce, and that JNBridge is one of the few vendors attending to this interop niche.

"It's often the case nowadays, with applications that are more distributed -- and especially when you start getting into a lot of service-oriented stuff -- that at some point you're going to need to write some direct transactional applications," Baer said. "For example, you might have a dedicated application with some logic written in VB or C# that's going against a Java back-end, or vice versa."

"This [JNBridge] capability means that you can write transactional applications without having to write transactional logic. They've automated the type of thing you take for granted when you code natively. It's going to bridge the impedance mismatch when it comes to transactionality -- things like rollback, two-phase commit, and all that sort of wonderful stuff. It also means that, as you change your code, you don't have to change your transactional logic, your application will be less brittle."

"Frankly, I don't see anything else handling this whole transactionality issue right now," Baer added. "They've added a useful piece to the interoperability puzzle."

The Boulder, Colo.-based company also unveiled new versions of its two Java Message Service (JMS) adapters that will support cross-platform transactions: the JMS Adapter for .NET 2.0 and the JMS Adapter for BizTalk Server 2.0. The .NET adapter is designed to integrate JMS implementations directly with the .NET Framework; the BizTalk adapter does the same with Microsoft's BizTalk Server.

JMS is an API for enterprise messaging systems which, Citrin said, supports "a very specific, narrow kind of transaction."

"It groups messages together into what is called a local transaction," he said. "We were able to integrate that with .NET transactions from the program that was consuming those messages."

The new BizTalk Server Adapter 2.0 comes with built-in support for request/response and solicit/response messaging patterns, which pair outgoing and incoming messages. Both adapters also provide a more flexible architecture that allows the .NET and Java portions of the JMS client to run in separate processes or on different machines.

The company announced the new products Monday, but plans to release them next week at Microsoft's Professional Developers Conference (PDC) in Los Angeles. More information is available on the company Web site here.




Thursday, November 12, 2009

Altova Updates MissionKit Suite

Altova in October released an update to its popular MissionKit suite of XML, data, modeling and Web services tools. MissionKit 2010 arrives with a broad range of feature improvements intended to make the suite more compelling and efficient for customers, according to Altova CEO Alexander Falk. The approach distinguishes the new suite from MissionKit 2009, which focused on adding support for eXtensible Business Reporting Language (XBRL).

"We compiled a list of what we call the most-wanted features," explained Falk in an interview. "These are not tiny features. They are really 70 features that will be huge time savers and productivity gains for people."
Among the notable improvements in MissionKit 2010 is support for JavaScript Object Notation (JSON), a text-based data interchange format that is a viable alternative to XML in lightweight application scenarios. The XMLSpy 2010 module includes a JSON editor as well as a bi-directional JSON/XML converter for moving files between the two formats. XMLSpy 2010 also adds support for Web Services Description Language (WSDL) 2.0, including a WSDL 1.1 to WSDL 2.0 conversion utility, as well as XBRL document generation in its XBRL Taxonomy Generator.
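
For readers unfamiliar with the JSON-to-XML round-tripping XMLSpy automates, the sketch below illustrates the mapping using the .NET Framework's built-in JSON reader. It is not Altova's converter, and the sample document is made up.

    // Not Altova's converter -- just the .NET Framework's built-in JSON-to-XML mapping
    // (System.Runtime.Serialization.Json, .NET 3.5+), shown to illustrate the kind of
    // conversion between the two formats that XMLSpy 2010 automates. The JSON sample is made up.
    using System;
    using System.Runtime.Serialization.Json;
    using System.Text;
    using System.Xml;

    class JsonToXmlSketch
    {
        static void Main()
        {
            string json = "{\"order\":{\"id\":42,\"customer\":\"Contoso\"}}";

            using (XmlDictionaryReader reader = JsonReaderWriterFactory.CreateJsonReader(
                       Encoding.UTF8.GetBytes(json), XmlDictionaryReaderQuotas.Max))
            {
                var doc = new XmlDocument();
                doc.Load(reader);
                // Prints an XML form such as <root type="object"><order type="object">...</order></root>
                Console.WriteLine(doc.OuterXml);
            }
        }
    }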

The MapForce module adds simultaneous, batch processing of multiple input and output formats, eliminating the need for developers to hand-run conversions to multiple formats. It also adds automatic validation of Electronic Data Interchange (EDI) code directly within generated code, ensuring that invalid EDI messages are not processed.

Falk said that the Altova StyleVision module underwent more change in the 2010 release than any other. He singled out the addition of a forms-based document layout designer, which lets developers create paper-like forms from scratch or from existing templates. StyleVision also adds the ability to render HTML code directly within form documents, for additional flexibility.

"You can now create document layout container elements first and then connect them to data elements later," Falk said.

Other improvements include the addition of Systems Modeling Language (SysML) support in the UModel 2010 module, and an expanded database differencing engine in DatabaseSpy and DiffDog 2010 that makes it possible to compare database structures and schemas and migrate changes between them.

"If you have made changes in your development environment and then migrate those changes to your production server, we can automatically generate the SQL script to alter the tables accordingly to move those changes from one environment to another," Falk explained.

Altova MissionKit 2010 is available in four SKUs. The top-end MissionKit 2010 for Software Architects Enterprise Edition costs $1739 and covers the entire waterfront of Altova modules. The Professional Edition of this product costs $869 and excludes the DiffDog, SchemaAgent and SemanticWorks modules. Altova MissionKit 2010 for XML Developers Enterprise Edition costs $1619 and excludes only the UModel and DatabaseSpy modules. Finally, the Professional Edition of the XML Developers product costs $739 and drops the DiffDog, SchemaAgent and SemanticWorks modules from the XML Developers Enterprise Edition feature set.




Q&A: Google's Enterprise Software Plans

Google recently described its efforts in the enterprise software space, including details about its e-mail service contract win with the city of Los Angeles.

What follows is an edited transcript of a talk I had late last week with Matthew Glotzbach, product management director for Google Enterprise. Google is known for providing free Web-based applications to the general public, but these free consumer apps are also key to its enterprise software development strategy, Glotzbach explained.

In the interview, Glotzbach talks about the ideas behind Google Apps, Google App Engine, Chrome Frame, Google Apps Sync for Microsoft Outlook and the mysterious Chrome OS. Some of these products are starting to gain ground on similar Microsoft solutions, or even to bypass them entirely.
Q: Does Google have a name for its cloud computing platform?
Glotzbach: Where we really started in terms of a commercialized offering, in what I would call "cloud computing," is our product suite that we refer to as Google Apps, which is a bundle of products. Gmail is our e-mail offering. Google Calendar is our calendaring system. Google Talk is our instant messaging and voice-over-IP technology. Google Docs is our office productivity and collaboration technology (word processor, spreadsheet and presentation software). Google Sites is our team collaborative and team site capability. Google Video for Business is a "YouTube for business" [type of application].

So we bundle that group of technologies together under the name of Google Apps and we give that to organizations, small-to-medium businesses, large enterprise, education, public sector, etc. It includes…the types of administrative controls you'd expect and integration and interoperability capabilities -- so things like synchronization with on-premises directory servers, be it Active Directory or user provisioning in groups, single sign-on capabilities, archiving and e-discovery capabilities, APIs, a full administrative console, reporting, etc. That's our primary computing offering on the commercial side today.

We launched Google Apps Premier Edition, which is our for-pay version, back in February of 2007. For businesses, we charge $50 per user per year -- and that's per year, so it ends up being about $4.17 per month. The Google Apps Standard Edition is really geared towards clubs, organizations, affinity groups and families where you still have your own domain name and you have a group of users but you're not a business. The Standard Edition has most of the things that the Premier Edition has, but it doesn't have some of the APIs and more of the large enterprise integration points; it doesn't have things like archiving and e-discovery as you'd expect in a for-business offering. We give that one for free. We also have one that we call Education Edition, which is basically our Premiere Edition product, but we provide it at no cost to educational institutions, both primary K through 12, as well as secondary universities, etc.

We do have App Engine, which falls in the platform-as-a-service or the infrastructure-as-a-service-type category. Basically, it's an application hosting development platform where a developer -- whether they be an enterprise IT developer or an ISV or someone doing hobby development over the weekend -- can build an application on our platform and have it hosted by Google to take advantage of our computer resources and our storage services. We've had that in the market for one-and-a-half to two years now. And we continue to grow and expand that offering, but we don't really have a formal enterprise offering around our App Engine product at this time.

Would calling Google's cloud platform an 'experimental' platform be accurate?
I wouldn't say 'experimental.' We tend to introduce technologies into the consumer world first and use that as a maturation ground. You could say 'testing ground' to some extent, but by that measure, frankly, that's everything that Google does. We are constantly testing and iterating -- that's really the fundamental of our model that we are constantly evolving things. So, we tend to bring things out into the consumer world. And we've had great success. There are hundreds of thousands of applications built on App Engine serving millions of users, or something like that.

The city of Los Angeles last month elected to go with Gmail over a Microsoft e-mail platform. Has the contract been finalized?
It's been voted on and unanimously approved by the City Council and I believe it's in the final contracting stages.

Did Google have to meet an RFP and address security issues with hosted services?
Yes, there was an extensive RFP process conducted by the city. There were over 15 public bids across a number of vendors from which Google was selected as the preferred bid and then entered into the contracting phase. Obviously, Microsoft was one of those vendors -- although I heard at one point that Microsoft was seven of those proposed solutions. They were all scrutinized on a number of dimensions -- security was one of the big ones. I will say that the city's report, based on the findings of that process, stated that the hosted solutions from Google would be significantly more secure than the on-premise solution that the city has today.

And I think that's an important thing that we are seeing more and more. I think that as cloud computing becomes a mainstream way of doing business, what organizations are realizing is that there is a misperception of security. Just because something is in your own basement server room doesn't necessarily make it more secure than something that's hosted by a third party. The analogy that comes to mind is that of a bank. Your money is significantly safer in a bank than under your mattress. And it took everybody a few years to come to that realization a century ago. But now, you don't think about putting your money anywhere else and you don't think about going to an ATM and using technology to extract your money, and you don't care whether your or my money are all mixed together in a bank because we can go and get money out wherever we are.




Microsoft Releases SQL Server 2008 R2 November Preview

Yesterday the Microsoft SQL Server 2008 R2 November Community Technology Preview became available to the general public, having been released to MSDN and TechNet subscribers on Monday.

According to a Microsoft Data Platform Insider blog posting, the November CTP is "feature complete," with new improvements such as PowerPivot for Microsoft Excel and Microsoft SharePoint Server. PowerPivot will enable easier business intelligence applications in Excel, with sharing and collaboration via SharePoint.
Developers will have to wait for Excel and SharePoint PowerPivot downloads, though. The blog said those would be available "soon."

The previous CTP was made available in August. Enhancements since that release include wizards “to help you quickly set up and enroll instances and Data-Tier Application components into central management,” and dashboard viewpoints to monitor utilization of applications and instances.

Read our in-depth coverage for more information on new features, packages, versions and pricing.

The final release of SQL Server 2008 R2 is scheduled for the first half of 2010. Download the November CTP here.




Wednesday, November 11, 2009

Mono Tools Plug-in for Visual Studio

Novell on Tuesday announced the release of Mono Tools for Visual Studio, a Visual Studio 2008 add-in that enables Windows-based developers to create .NET applications for Linux, Unix and Mac OS X. The solution promises to greatly simplify cross-platform .NET development by allowing Visual Studio developers to use the same tooling, interface, libraries and resources for cross-platform work that they use for traditional .NET coding.

"From within Visual Studio you just say "Run on Linux," and a little pop up comes up asking you which Linux machine you want to take [the code] to," said Novell VP of Developer Support Miguel de Icaza, of the Mono Tools interface. "You are still developing against .NET and the .NET libraries. It is completely the .NET tool chain and .NET compiled code right up to the debugger. And then the debugger integrates into the Visual Studio interface."
Mono Tools includes integrated test and debug support, allowing Visual Studio developers to detect incompatibilities as code moves from .NET and Windows to Mono and Linux. Code can be compiled and run on a local, Windows-based Mono stack, as well as on remote Linux servers.

Mono Tools is based on work done by the Mono Project, a Novell-sponsored initiative to create a cross-platform, open source implementation of the Microsoft .NET Framework. The Mono effort leverages the ECMA-approved C# and Common Language Runtime (CLR) industry standards, providing compatibility with .NET Framework 3.5.

Portions of the .NET foundation stack, including Windows Presentation Foundation and Windows Workflow Foundation, are not supported by Mono. De Icaza noted that existing .NET applications containing unsupported code features would need to be adapted before moving cross-platform via Mono Tools or other solutions. Mono does support Windows Communication Foundation, with the notable exception of Microsoft's Windows CardSpace (code named InfoCard) identity system.
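
Where such adaptation is needed, a common idiom -- one widely used in the Mono community rather than anything specific to Mono Tools -- is to detect the Mono runtime at execution time and branch around unsupported features:

    // A common idiom (not part of Mono Tools itself) for detecting the Mono runtime so an
    // application can branch around features Mono does not implement, such as WPF code paths.
    using System;

    class RuntimeCheck
    {
        static bool RunningOnMono()
        {
            // Mono defines the internal type Mono.Runtime; the Microsoft .NET Framework does not.
            return Type.GetType("Mono.Runtime") != null;
        }

        static void Main()
        {
            Console.WriteLine(RunningOnMono()
                ? "Running on Mono -- avoid WPF-dependent code paths."
                : "Running on the Microsoft .NET Framework.");
        }
    }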

By Developer Demand
Joseph Hill, product manager for the Mono Project at Novell, said Mono Tools was developed in response to programmer demand.

"We did a survey a while back. We found a lot of nascent use of Mono across the board -- a lot of interest in it and a lot of interest in cross-platform environments," Hill said. "But the thing holding it back was lack of support from Visual Studio."

De Icaza said Mono Tools for Visual Studio could end up being an important factor in attracting talent to the Mono developer community.

"We don't want people to move away from Visual Studio. They might love Resharper or their favorite add-in for databases," de Icaza said. "We can bring these features to .NET developers who might otherwise find themselves with a steep learning curve."

Mono Tools also attempts to appeal to ISVs and others with its ability to create pre-rolled application appliances. Mono Tools' SUSE Studio interface lets developers package Mono applications with a tuned version of the SUSE Linux operating system, creating a fully packaged app-and-OS combination that can be run with little or no on-site configuration, de Icaza said.

"We call this the appliance model, where developers instead of delivering a piece of the puzzle can deliver the complete solution," de Icaza said.

Mono Tools for Visual Studio is available in three editions. Professional Edition is an individual license for $99, Enterprise Edition is intended for individual developers in an organization and costs $249, and Ultimate Edition provides five enterprise developer licenses and includes a limited commercial license to redistribute Mono across platforms. It costs $2,499.




VS2010 ALM Acquires Teamprise Clients

As the world celebrates the 20th anniversary of the "fall of the wall," Microsoft announced today in Berlin that it has acquired the Teamprise Client Suite, which supports Eclipse and other cross-platform environments for Team Foundation Server.

The announcement was made during the Developer Session keynote by Jason Zander, general manager of Visual Studio for Microsoft's Developer Division, on the first day of Tech Ed Europe 2009, Nov. 9 – 13.

The Teamprise Client Suite consists of an Eclipse plug-in, a standalone explorer client (outside of the IDE) that provides similar functionality, and a command-line client for scripting. According to Microsoft, the technology enables source control, work-item tracking, build and reporting by integrating the Team Provider menu into Eclipse 3.0 and higher, as well as popular Eclipse-based IDEs -- Adobe Flex Builder, BEA Workshop, JBoss and Rational Application Developer. The Teamprise explorer and command-line clients work on Windows, Mac OS X and Unix (Linux and Solaris).
"This is a brilliant move by Microsoft," says Al Hilwa, program director, Application Development Software for IDC. It enables cross-pollination in companies that have Java and .NET environments or ALM standardization in enterprises that want to adopt Microsoft's Visual Studio Team System.

"In a way you are seeing .NET maturing in the enterprise," he says, with more professionalized development and serious testing around it. "This is a milestone in maturation."

It is also important, according to Hilwa, "because Microsoft has been cranking most of its revenue in the team space." According to IDC estimates, Microsoft's application development (Visual Studio, ALM, SCM) and IT portfolio management software (Microsoft Project) represented a $1.271 billion business in the 2008 calendar year. Enterprise sales are the primary driver.

Brian Harry, Microsoft technical fellow and product unit manager for TFS, explained in his blog today how Teamprise will fit into the Visual Studio ALM offerings going forward:

"With the transaction complete, we are turning our attention to creating the first “official” Microsoft version of the Teamprise Client Suite (new brand TBD). We expect to release an update this coming spring that will support a large portion of the TFS 2010 feature set while still being compatible with TFS 2005 and 2008 servers. When the Microsoft branded release is available, we will be providing free upgrades for all customers who own a Teamprise client product and an associated TFS CAL and will begin full Microsoft support for the product."

Teamprise will be included in Visual Studio 2010 Ultimate with MSDN, the successor to Visual Studio 2008 Team System Team Suite. It will also be sold separately for $799 (U.S.).

The Teamprise technology was purchased from the Teamprise division of SourceGear, the company that develops SourceOffSite for Microsoft Visual SourceSafe. Teamprise v3.3 will continue to be sold and supported by the Teamprise division of SourceGear until the first Microsoft version is released. The Microsoft-branded product is expected in the same timeframe as the planned VS2010 launch, which is scheduled for March 22, 2010.

Microsoft has worked closely with the Teamprise developers since TFS originated in 2002, according to Harry. Most of the Teamprise development team has been hired by Microsoft. Former Teamprise lead architect and the Team System MVP of 2009, Martin Woodward, is the Microsoft program manager for Teamprise. Woodward will continue to work out of Northern Ireland.




Security Development Lifecycle Gets an Agile Update

Microsoft on Monday released version 4 of its Security Development Lifecycle (SDL) initiative, a set of software development guidelines intended to improve application security. SDL version 4.1a specifically targets agile development scenarios, which Microsoft says place unique demands on the code security process.

SDL 4.1a documentation is available for download from the Microsoft Download Center Web site.

Established in 2004 as part of Redmond's Trustworthy Computing initiative, SDL was implemented to ensure that Microsoft's internal product teams conformed to rigorous best security practices when planning, writing and testing code improvements to all Microsoft software products. The program was established after a series of damaging, widespread attacks exploited flaws in Microsoft applications, including SQL Server, Internet Information Server, Office and the various Windows operating systems.
In the latest version of the SDL document, Microsoft describes the SDL as "a holistic and practical approach... that introduces security and privacy early and throughout all phases of the development process." SDL 4.1a targets challenges that arise when agile teams, aligned around weekly sprints and a highly iterative dev process, try to implement formal SDL practices.

"As an increasing number of organizations adopt the Agile development process for some or all of their development projects, Microsoft has evolved its SDL process to be effective in this accelerated development model," said David Ladd, principal security program manager of Microsoft’s Security Development Lifecycle team. "A well-managed software security program is a good investment at any time and can help minimize ongoing security-related maintenance costs while providing customers with a better security experience."

SDL 4.1a was on display at the Microsoft Tech-Ed Europe conference in Berlin, Germany, this week. Microsoft SDL Team Senior Security Program Manager Bryan Sullivan on Monday gave a presentation titled SDL-Agile: Microsoft's Approach to Security for Agile Products.

Sullivan in his presentation synopsis wrote that agile dev teams have struggled to implement SDL, because activities like threat modeling and security incident response planning add overhead that can overwhelm agile processes.

The Trustworthy Computing group at Microsoft also updated security guidance around cloud computing, in the form of a whitepaper titled “Security Considerations for Client and Cloud Applications.” That whitepaper is available for download here.


