Thursday, October 30, 2008

PDC: Office Communications Server Gains New Developer Hooks

Developers heard more about the planned upgrade of Microsoft's platform for integrating e-mail, instant messaging, conferencing and telephony at this week's Professional Developers Conference in Los Angeles.

The upgrade is dubbed Office Communications Server (OCS) 2007 R2, and it is currently available only for private beta testing. OCS was dwarfed by many other high-profile launches at PDC this week. However, Microsoft did conduct a handful of technical talks on the planned OCS release.

Earlier this month, Microsoft had announced that OCS R2 would be an incremental upgrade, although the release will offer some noteworthy new capabilities for .NET developers looking to add conferencing-, presence- and telephony-oriented capabilities to business applications.


OCS R2 comes with new APIs and features integration with the latest release of Microsoft's Visual Studio development environment. Microsoft plans to officially launch a new OCS release in February, the company said at the VoiceCon Conference last month in Amsterdam.

Thousands of organizations have begun using OCS 2007 as their primary voice-communications platform since its release one year ago, according to Stephen Elop, president of Microsoft's Business Division, in a video statement released at the VoiceCon show.

Tom Nolle, president of the Voorhees, N.J.-based telecommunications consulting firm CIMI Corp., said that OCS has gotten off to a respectable start in Microsoft-centric shops.

"Those with SharePoint, for example, seem to be adopting it almost completely, at least in my experience," Nolle said in an e-mail.

"There aren't any issues in terms of negatives that I'm aware of with respect to adoption, but it will be important for enterprises to review their overall voice, UC [unified communications] and collaboration strategies given the new release and the directions. [But it looks] promising."

New and Improved
Among the new features in OCS 2007 R2 are built-in audio teleconferencing, desktop sharing and what Microsoft calls "persistent group chat." The new chat feature is designed to let teams participate in specific topic-based chat sessions that persist over time.

On the voice side, the new release will allow for improved call management, including response groups that manage incoming calls based on predefined workflows. It will have call delegation and VoIP-based call trunking.

There are two key new capabilities for the developer experience, said Anand Ramakrishna, principal group manager for Microsoft's Office Communications Server Developer Experience. First, the object model will allow developers to tweak behaviors and handle more advanced scenarios so they "can get down and write code," Ramakrishna said.

The other key developer capability centers on improvements to the API set, the Unified Communications Managed API (UCMA). The new API adds support for presence, teleconferencing and call controls.

Developers will be able to integrate these new and existing features into their applications with Visual Studio 2008 using the latest capabilities in .NET Framework, Ramakrishna said.

One such example is Windows Workflow Foundation.

"We created Windows Workflow activities that you can drag and drop onto your workflow agent or your designer, and then model it using those APIs," Ramakrishna explained. "You can bring up your workflow canvas and drag and drop the UC activities onto your workflow -- in conjunction with your other workflow activities."

For instance, this will let developers build UC into their workflow models in SharePoint, while taking into account human latency in business processes.

"We're bringing [UC] and integrating that so that you can do intelligent routing and intelligent notification from Windows Workflow," Ramakrishna said. Applications can range from the routing and approval of forms to integration with CRM-type apps, he added.

In addition to call management, presence and teleconferencing, Ramakrishna pointed out that the new API set supports speech recognition and notification.

Developer Checklist -- New in OCS 2007 R2:
Visual Studio 2008 integration
APIs for presence, conferencing, call control and speech
Support for automated routing using Windows Workflow Foundation


Elop Touts OCS 2007 R2 as PBX Replacement
Patch Tuesday To Address Four Critical Issues

PDC: Microsoft Previews Windows Server 2008 R2

Microsoft on Tuesday showed off a pre-beta version of Windows Server 2008 R2 at its Professional Developers Conference in Los Angeles. The company is also promoting the upcoming release (also known as "Windows 7 Server") on its various blogs and Web sites.

While Microsoft isn't talking specific dates yet (the live version is expected to debut in late 2009/early 2010), it is talking features -- including confirming that Windows Server 2008 R2 will be 64-bit only.

"32-bit is done," wrote Technical Product Manager Oliver Rist on a Windows Server blog post. "Frankly it was high time. Customers have been unable to purchase a 32-bit server CPU for over two years now, and the advancements in CPU architectures really dictated that we squeeze as much performance out of customers' hardware purchases as possible. The move to 64-bit is a first step."


Redmond is also playing up the new virtualization tools it will include in R2, particularly Live Migration, which it calls a "marquee" feature. "Think physical host migrations of running VMs happening in milliseconds -- no service or user connection interruptions," Rist wrote. "With Live Migration, data centers can truly go virtual and largely divorce management considerations between software and hardware, and all managed from inside a single OS frame."

Other virtualization-related features Redmond expects to include are a new version of Hyper-V -- "think mucho better management [and] beefier resources for VMs," Rist wrote -- plus the addition of a "true" Virtual Desktop Infrastructure (VDI) to Terminal Services. According to Rist, this will allow IT pros to deploy virtual desktops and applications "with such tight integration, most users will be unable to tell the difference between centrally hosted apps and those installed locally."

Another focus of Windows Server 2008 R2 will be what Microsoft is calling "streamlined management" -- with new "role-specific" server management UIs, a new graphical interface for PowerShell, plus an Active Directory Domain Services console and improved Group Policy tools.

IIS 7.0 is also getting tweaked for the upcoming release, offering "new PowerShell management support" and "new failover clustering updates," according to Rist.

Rist called another new feature, DirectAccess (DA), the "sleeper feature" of the release. Working with the upcoming Windows 7 client OS (also previewed at PDC this week), DA will make remote computing "invisible" from the client perspective.

"Using technologies like SSTP and IPv6 combined with way-easy management UIs in Windows Server 2008 R2, admins can build remote computing policies that let users plug into any network, anywhere and see their local network resources -- completely secure, no clunky VPN required," Rist wrote. "As long as there's an outward network connection, DA takes care of everything in the background and automatically."

More information about Windows Server 2008 R2 can be found on Microsoft's Web site.


Windows Essential Business Server Hits RTM Status

Microsoft Joins Standards Effort on Ubiquitous Messaging

Microsoft will join a working group to support an open standard for multiplatform messaging in the enterprise, according to a company statement issued this week.

Redmond joins the Advanced Message Queuing Protocol (AMQP) Working Group, a collaborative effort developing specifications for interoperable messaging middleware at the application layer.

The AMQP's message-oriented middleware specification is being driven by the financial services industry. However, it is expected to benefit other industries as well, including insurance and healthcare, and other organizations where enabling ubiquitous messaging is a priority.


Microsoft's commitment is "another step toward AMQP becoming the preferred connectivity for business messaging on the Internet," according to Adrian Kunzle, head of engineering and architecture at J.P. Morgan, in a prepared statement.

The AMQP Working Group is a consortium of various enterprise technology stakeholders, including some Microsoft partners. AMQP players include companies such as Cisco Systems, Credit Suisse, Deutsche Borse Systems and Goldman Sachs, among others.

A new business messaging protocol developed by the group will "provide businesses with a simple and more powerful way of connecting messaging-dependent applications both within and between firms," according to Microsoft's announcement.

The AMQP model defines server semantics that any server implementation can follow. It will help vendors achieve interoperability via a "wire-level format with modern features," according to the AMQP Working Group's Web site.

Microsoft was asked to join the AMQP Working Group by its current members, including Microsoft's customers in financial services, according to Redmond's statement.


Former Microsoft Exec Joins BizNet Engineering Team
PostPath Deal Adds to Cisco-Microsoft UC Fight

Wednesday, October 29, 2008

PDC: Microsoft's Cloud-Based SQL Services Redefined

Microsoft is evolving its strategy for relational data services in the cloud.

As the company's vision for cloud-based computing is shared with over 6,000 developers this week at its Professional Developers Conference, Microsoft is announcing that SQL Server Data Services is now the storage component of a broader suite of data-related services called "SQL Services." SQL Server Data Services is renamed SQL Data Services (SDS).

SQL Services is part of the Azure Services Platform for building cloud applications intended to run on Windows Azure, Microsoft's new cloud-based operating system, unveiled here on Monday. The Azure Services Platform consists of Live Services, .NET Services, SQL Services, SharePoint Services and Dynamics CRM Services. PDC attendees can register for the Azure Services Platform and request .NET Services, Live Services and SDS provisioning (SDS is still in private beta).


Microsoft's data storage in the cloud is an evolving technology that faces a nascent market despite some early success by Amazon's SimpleDB.

"We've been focused on the back-end infrastructure for several years," said Dave Campbell, Microsoft Technical Fellow and SDS team leader. Microsoft is using its back-end infrastructure for its own internal services, some of which will roll out early next year, according to Campbell.

"The interesting thing about the front-end [is that] we are learning so much," he said. "What are the use cases? What are the optimal design points? So we are still trying to sort that out.

"Now, once we make the call there," he continued, "we will be able to settle that down quite quickly because most of investment and the long tail of settling things out has been in the back-end."

The SDS team is looking at issues such as balance in the data model, approachability versus flexibility, ways to separate physical and logical data administration, the power required to run it in an automated fashion without runaway queries, and so on.

"We are looking at, 'Where do we draw the line?' and that line will shift over time," Campbell said.

SQL Services Labs Portal
A closer look at some of the projects that Microsoft has underway is available at the SQL Services Labs portal, which went live on Monday. Microsoft is working on several incubation projects, many of which are likely to become part of SQL Services, the Microsoft Synchronization Framework or Live Mesh in the future. These projects include work on a data access interface for ADO.NET Data Services (JSON and AtomPub) as well as offline data synchronization using the Microsoft Sync Framework. Other projects focus on reporting against SQL Data Services and Data Mining.

SQL Data Services started as an incubation project about two years ago with participation from both Microsoft Research and the SQL Server team. SDS today is developed on an eight-week sprint, according to Campbell.

"We found that we were able to really innovate quickly because we can put something out there and get feedback immediately...and eight weeks later have something that addresses it," Campbell said. "Where it was two, three or four years with the shrink-wrapped product, it is a matter of weeks here."

The SDS data model, which is based on flexible entities, is inherently multi-tenant, Campbell explained. It supports REST and SOAP wire formats. An authority (a DNS domain name) can have a number of containers and entities within those containers. Each entity has an identity (Id), a distinguished attribute called a Kind and a number of properties associated with it. "You can look at entities within that container as if they were logical tables discriminating on the Kind," Campbell said.
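As a rough illustration of that data model -- and not of the SDS wire format or any official client library -- the following C# classes mirror the authority/container/entity shape Campbell described, with a dictionary standing in for the flexible, per-entity properties:

    using System.Collections.Generic;

    // Illustrative only: these types mirror the described model, nothing more.
    public class SdsEntity
    {
        public SdsEntity() { Properties = new Dictionary<string, object>(); }

        public string Id { get; set; }            // unique within its container
        public string Kind { get; set; }          // distinguished attribute, e.g. "Customer"
        public IDictionary<string, object> Properties { get; private set; } // flexible properties
    }

    public class SdsContainer
    {
        public SdsContainer() { Entities = new List<SdsEntity>(); }

        public string Name { get; set; }
        public IList<SdsEntity> Entities { get; private set; }
        // Filtering Entities on Kind behaves like querying a "logical table",
        // e.g. container.Entities.Where(e => e.Kind == "Customer").
    }

    public class SdsAuthority
    {
        public SdsAuthority() { Containers = new List<SdsContainer>(); }

        public string Name { get; set; }          // maps to a DNS domain name
        public IList<SdsContainer> Containers { get; private set; }
    }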

"At PDC, as we have evolved the service, we have announced support for relational joins across Kinds and containers," he said. "The beauty of this is it's very, very approachable and very, very friendly for people who are designing, sort of, outside-in and really want schema flexibility in the system."

SQL Data Services is described by Microsoft as an open platform. To that end, the SDS team announced support for a Ruby toolkit and released sample source code for the SDS SDK, including a REST library, ResourceProvider and ActiveProvider. The source code is available for download on GitHub and as gems on RubyForge. Support for PHP is planned, according to Campbell.

The SDS public beta is likely to become available in a matter of weeks or months, according to Campbell, with a general availability target in 2009.


PDC: First Look at Live Mesh and Live Framework

Most Enterprises Unprotected Against E-mail Security Risks, Study Says

Given the prevalence of spam and the sensationalism that attends just about any incident of unauthorized data leakage, you'd think most shops would have taken steps to proactively protect against both threats.

According to a new survey from market watcher IDC -- sponsored by gateway security specialist Secure Computing Corp. -- you'd be wrong.

The IDC study collected responses from 100 IT pros (including security decision makers) employed by companies with 500 or more employees. It found that nearly three-quarters (72 percent) of organizations don't have solutions in place to prevent data leakage over e-mail.


What's more, IDC said, only slightly more than 10 percent of organizations have actually deployed effective anti-spam technologies. "Organizations need to increase their efforts in combating e-mail security risks," said Brian Burke, IDC's program director for security products, in a prepared release. "While organizations have expressed concern about inbound and outbound e-mail security, their current solutions are not getting the job done. Only 11 percent of those surveyed had adequate inbound protection, and over 70 percent have nothing in place for data loss prevention on e-mail. Such organizations need to take advantage of new solutions and delivery models."

Secure Computing markets a number of solutions to prevent such problems, but the IDC survey doesn't discuss specific vendors. Instead, it touts general technology prescriptions, such as cutting-edge anti-spam or data leakage prevention tools.

For example, 85 percent of respondents say they're "Very Concerned" or "Extremely Concerned" about data leakage over e-mail. In spite of this, IDC found, just over a quarter (28 percent) of shops have actually implemented technology solutions designed to safeguard against leakage -- although more than half said they planned to do so starting next year.

The scope of the problem is probably much bigger than many IT pros realize. According to IDC, the vast majority -- perhaps as much as 80 or 90 percent -- of data loss incidents are accidental. Most of the companies IDC surveyed seem to agree, rating the risk of accidental data leakage as higher than that of deliberate theft. Just 5 percent of respondents said they were "Extremely Concerned" about intentional theft from within, while 44 percent admitted being "Extremely Concerned" about accidental data loss.

According to IDC, more than a quarter (28 percent) of large shops say their spam complaints have increased significantly this year. The reason, IDC said, is that many of them rely on older technologies that can't keep pace with both the increasing volume and sophistication of spam attacks.

The good news, according to the report, is that today's most sophisticated tools can block almost 100 percent of unsolicited spam. The bad news, conversely, is that just 11 percent of shops have deployed solutions (or implemented policies) that meet this standard -- while 60 percent say they can't even block 95 percent of spam communications. Given the ever-escalating volume and variety of spam, this means that more unsolicited e-mails, IMs and other communications are getting through than ever.

IDC concludes with a technology prescription: "Organizations must accelerate their adoption of next-generation e-mail security solutions. The cost of not doing so is increased malware infection through spam and increased data leakage."


Virginia Supreme Court Rules Against Anti-Spam Law
McAfee To Acquire Secure Computing

Vista SP2 Goes Out in Limited Beta

Microsoft will release the second service pack for Vista "to a small group of Technology Adoption Program customers" today.

SP2 for Vista is being made available to selected testers for beta evaluation on Wednesday, according to Microsoft's Mike Nash, corporate vice president, Windows Product Management, in a statement released last week. The beta release will include updates for both client and server (Windows Server 2008), just as SP1 for Vista did in February.

Nash said that Microsoft is committed to "continually improving Windows," and that the final release date for SP2 will be based on quality ascertained by customer and partner feedback.


New features in SP2 include Windows Search 4.0, a feature pack for Bluetooth 2.1, Blu-ray enhancements, a new Windows Connect Now for Wi-Fi configuration and support for UTC timestamps.

In a statement on Microsoft's Springboard blog, Celine Allee wrote that Windows Vista SP2 is anticipated to go live in the first half of 2009.

"We expect that Windows Vista SP2 will retain compatibility with applications that run on Windows Vista and Windows Vista SP1 and are written using public APIs," stated Allee in her Oct. 24 post. "Consequently, you can (and should) continue your plans for adopting Windows Vista SP1, and roll SP2 into your deployment image when it becomes available." Windows Vista SP1 is a prerequisite for installing SP2.

Windows Vista is widely seen as a stepping stone to Windows 7, previewed this week at Microsoft's Professional Developers Conference in Los Angeles. John Lister noted in an "Infopacket" post that it was "unexpected" that Microsoft would issue a service pack so quickly after the first update in the spring. He noted in the October 28 post that the release of SP2 shows that Microsoft will likely continue to support and improve Vista in an effort to dissuade XP users from skipping Vista in anticipation of Windows 7.

Nash advised in his post last week that he will be sharing more about Windows Vista SP2 and Windows 7 in the near future.


‘Windows 7′ No Longer a Code Name

PDC: Visual Studio 2010 CTP Adds Parallel Computing Tools

Microsoft released a Community Technology Preview (CTP) of Visual Studio 2010 and .NET 4.0 at its Professional Developers Conference (PDC) on Monday, and developers may be pleased with the new parallel computing capabilities now available in the CTP.

To better deal with the problems of concurrency, where threads of code run at the same time and sometimes share resources, Microsoft has added some new .NET Framework libraries to work with Visual Studio 2010.

The point of the new libraries is to provide developers with profiling and debugging tools for parallel computing while using the familiar Visual Studio interface.


The new .NET libraries include "the Task Parallel Library and Parallel LINQ, as well as the Parallel Pattern Library and Concurrency Runtime for developing native applications with C++," according to an Oct. 27 blog post by Soma Somasegar, Microsoft's senior vice president of the Developer Division. The new Concurrency Runtime supports Windows operating systems down to XP.
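As a rough sketch of what Parallel LINQ looks like in practice -- using the AsParallel operator as it later shipped in .NET 4; the CTP surface may differ in details, and the prime-counting query is purely illustrative -- the same LINQ query can be opted into parallel execution with a single call:

    using System;
    using System.Linq;

    class PlinqSample
    {
        static bool IsPrime(int n)
        {
            if (n < 2) return false;
            for (int i = 2; i * i <= n; i++)
                if (n % i == 0) return false;
            return true;
        }

        static void Main()
        {
            int[] numbers = Enumerable.Range(1, 1000000).ToArray();

            // Sequential LINQ to Objects.
            int sequentialCount = numbers.Where(IsPrime).Count();

            // Parallel LINQ: AsParallel() partitions the source across available cores.
            int parallelCount = numbers.AsParallel().Where(IsPrime).Count();

            Console.WriteLine("{0} / {1}", sequentialCount, parallelCount);
        }
    }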

In addition to that bit of news, this week's PDC features several sessions on parallel computing. On Monday, developers got a practical example of how to use Visual Studio Team System (VSTS) 2010 to debug a parallel app. The demo was presented by Hazim Shafi, Microsoft's principal architect on the parallel computing platform team. Shafi presented on the topic, "Microsoft Visual Studio: Bringing out the Best in Multicore Systems."

Multicore is still fairly modest these days, typically referring to systems with up to about eight hardware cores working simultaneously.

Shafi noted that developers are being dragged into parallel computing by the limitations of hardware. Manufacturers can't push CPU clock speeds much higher without chips overheating. Hence, to gain speed in code, developers have to take advantage of the extra processing capacity that multicore systems provide.

"The most important point to remember now and today is that single-thread performance is going to be relatively flat for the foreseeable future," Shafi said. "So we no longer have the free lunch of a doubling in performance every year and a half that we used to have for free, basically. What that means to us as software professionals is that the burden has shifted to us. We have to do more work now in order to exploit the latest performance that exists in these processors."

For developers, the worst thing to deal with is a sequential app that you have to parallelize, he said. In that case, you don't want to rewrite all of your code.

In a multicore world, the new problem for developers is that you have to worry about things like synchronization, he added.

To illustrate that problem, Shafi demonstrated a ray-trace graphics application. The demo, when run, showed two mirrored spheres. One sphere was bouncing and carrying the reflection of the other. The demo showed an out-of-sync bouncing sphere, and Shafi showed how the VSTS 2010 development environment can be used to debug such an application.

The profiler in VSTS 2010 can display a summary page that shows the resource use of the app. In the demo, the summary page showed how much time was spent in the loop computing the luminance of every pixel in the lighted space of the ray-trace animation.

To measure performance issues, VSTS 2010 has a CPU utilization analysis view that shows the varying time spent using the core. There is also a core execution analysis view that shows you how your threads were scheduled on your hardware system.

You can switch to a thread locking view in VSTS 2010 to determine when a thread is running or not running.

VSTS 2010 also has profiler graphs that will show I/O delays in a color-coded display. You can click on the graph where a delay occurs and it will point to the code-line number in VSTS 2010 associated with that delay. Shafi said he used to do this kind of debugging by hand, and while there's still work to do, VSTS 2010 points to the problem and "it's relatively painless," he said.

"What we're trying to do here is release you from the painful aspects of parallelism," Shafi told the developer crowd.

Some coding changes made by Shafi involved substituting "for" loop statements with "parallel_for" statements -- a seemingly easy change to make.
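Shafi's demo used the native C++ parallel_for from the Parallel Pattern Library; the analogous managed construct is Parallel.For in the Task Parallel Library. The sketch below, with a hypothetical RenderRow method standing in for the per-pixel ray-trace work and namespaces shown as they later shipped in .NET 4, illustrates the kind of one-line substitution he described:

    using System;
    using System.Threading.Tasks;

    class RayTraceLoops
    {
        static void RenderRow(int y)
        {
            // Compute the luminance of every pixel in row y (omitted).
        }

        static void RenderSequential(int height)
        {
            for (int y = 0; y < height; y++)
                RenderRow(y);
        }

        static void RenderParallel(int height)
        {
            // Rows are independent, so they can be scheduled across cores.
            Parallel.For(0, height, y => RenderRow(y));
        }

        static void Main()
        {
            RenderSequential(480);
            RenderParallel(480);
            Console.WriteLine("done");
        }
    }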

Shafi didn't present VSTS 2010 as a cure-all for parallel computing woes, but he did offer some advice on how best to avoid problems. He recommended designing applications with future parallelism in mind, but "don't parallelize unless it's absolutely necessary," he said. Identify the areas of applications amenable to parallelism. The basic performance approach is to reduce I/O, improve caching and use efficient algorithms, he added.

At the end of the demo, after some code fixing, Shafi showed a bouncing ray-traced sphere that kept in perfect pace with its reflection. It all looked so easy!


It’s Official: Visual Studio 2010 and .NET Framework 4.0
Cray Unveils CX1 Supercomputer for Office Use

Tuesday, October 28, 2008

First Look: Google Chrome

I've used quite a few of Google's other products (Gmail, Google Earth/Maps, Docs, etc.) so I was more than ready to try its browser, Google Chrome.

Several seconds after navigating to the Chrome Web site, I had the browser package downloaded (the file weighs in at a light 475KB, which is rather impressive these days). After an easy installation, Chrome automatically imported my Firefox bookmarks, which was a nice touch.

At first glance, Chrome has the makings of a nice, lightweight browser comparable to Konqueror or Safari. My average memory footprint was about 28MB on Windows XP, which was considerably lighter than Firefox.


In terms of its looks, Chrome loosely resembles Vista's Aero. From a usability standpoint, Chrome is fairly elegant; it has a minimalist interface with the back, forward and refresh buttons displayed prominently. Like practically all modern browsers, Chrome features tabbed browsing.

Because Chrome is still in beta, I was expecting to see some areas that needed improvement -- and I did. The default start page has a nice feature that displays thumbnails of your most frequently viewed Web sites (though it takes awhile for this list to populate), but aside from that, it's strangely bare. Since the people who are most likely using Chrome probably use other Google tools, I was expecting to see tie-ins for Gmail or Docs. I also missed Firefox's extension capabilities.

I also noticed that there was no way to subscribe to an RSS feed in Chrome. Chrome's configuration interface looked a little bit sparse and didn't have as many options and features as other browsers. Furthermore, I had to test Chrome in a virtualized XP environment, since there is no native Linux version of Chrome at this time.

On the other hand, Chrome did better than Firefox on the Acid3 test, scoring an impressive 79 out of 100 over Firefox's 69 (although the linktest failed). Like Safari, Chrome uses the WebKit rendering engine. Chrome also routinely downloads a blacklist and whitelist meant to protect users from phishing sites.

Chrome has the makings of a good browser, but still lacks a few important features. Google's beta-quality software is usually very good (Gmail is fully usable despite having been in beta for years now) so you may want to consider Chrome if you need something fast and lightweight.


Google’s Chrome Browser Contains Microsoft Code
Google Plunges Into Browser Market

Security Concerns May Slow Cloud Computing Adoption

Microsoft may have reaffirmed its commitment to cloud computing with the launch of its Windows Azure operating system at this week's Professional Developers Conference (PDC), but overall business adoption of the cloud concept may be stymied, in part, by security issues.

Speaking about his company's "Digital Disruptions" report released this week, Computer Sciences Corp.'s Chief Innovation Officer Lem Lasher said questions over data security will be a major impediment to enterprise adoption of cloud computing.

"In time, IT can shift its focus and security can evolve," Lasher said. "But until wide adoptions of 'datacenters in the sky' become a reality, there will continue to be a need for functionally rich, actual, traditional PCs on which people will hold data and perform all sorts of tasks across the businesses."


For this reason, the critical mass in the enterprise realm that Microsoft has captured in the past -- and which it hopes to tap by moving enterprises from the software-in-a-box OS to the Web -- may not be so easily moved. After all, there is an independent security vendor ecosystem to consider.

For its part, Microsoft acknowledged Tuesday at PDC that applications on its Azure Services Platform, including third-party security programs, could drop off if they exceed their allocated storage or processing hours, because cloud computing is designed to be a metered, utility-based computing service.

To that end, if businesses can't keep track of certain applications on a "floating" OS (most security apps are designed to do their jobs invisibly), or if they have to customize a system for the applications that are used most frequently, it will also be difficult to track who is doing what. This is one of the dangers inherent in sending data outside an enterprise firewall onto remote servers -- which also happens to be the hallmark of cloud computing.

This is also problematic for enterprise IT auditing. Many companies across various industries have rigid compliance standards to meet, which means that harnessing, calling up and protecting data both within and without company networks is a tall order.

Still, there are many benefits to cloud computing, such as mobile accessibility for users and improved collaboration. And a survey released this summer from Gartner suggests that the very nature of cloud computing will create a plethora of snap-on security functionality possibilities with the ability to quickly scale or change applications as needed.

In that vein, Lasher said, "There are great possibilities, but there is also great risk. And until security and data issues are satisfactorily resolved, there won't be full-scale adoption. We'll see a more hybrid world, in which cloud computing will be used for certain types of applications, but it will be very application-specific and will have to evolve with the concept."


Coming to Terms With Cloud Computing
Amazon To Host Microsoft Solutions in the Cloud

PDC: First Look at Live Mesh and Live Framework

David Treadwell, Microsoft's corporate VP of the Live Services Platform, announced the new Live Framework during a Tuesday keynote that addressed Windows 7 and other tools and technologies that will comprise the front-end infrastructure of the company's unfolding Software plus Services strategy.

The Live Framework is described as a "uniform" way for developers to program against Live Services, which are integrated with Microsoft's Live Mesh platform. The new framework, released as a community technology preview (CTP) on Monday, is designed to support multiple programming languages, devices and platforms.

Announced in April, Live Mesh is downloadable PC software that extends Windows to the Web to enable synchronization and sharing of data, apps, people and devices. Microsoft is releasing the first open beta of Live Mesh this week. It will offer limited availability to Mac and Windows Mobile 6 clients, according to the company.


"The Live Mesh application is built completely on top of Live Framework," said Ori Amiga, group program manager of the Live Developer Platform at Microsoft.

The Live Framework is composed of the Live operating environment -- similar to the .NET CLR -- and a programming architecture that includes a resource model and managed libraries, specifically, the Silverlight Kit, the .NET Kit and the AtomPub .NET Library. The CTP is supported on Windows XP, Vista, Internet Explorer 7, IE 8 and Firefox.
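Because the resource model is exposed over AtomPub, a raw HTTP client can in principle talk to it directly; the Silverlight and .NET kits wrap this plumbing. The sketch below shows only the generic AtomPub convention -- the URL is a placeholder rather than a real Live Framework endpoint, and authentication is omitted:

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    // Generic AtomPub POST, shown to illustrate the wire convention the
    // Live Framework builds on; not a real Live Framework call.
    class AtomPubSketch
    {
        static void Main()
        {
            string entry =
                "<entry xmlns='http://www.w3.org/2005/Atom'>" +
                "  <title>Sample resource</title>" +
                "  <content type='text'>Hello from AtomPub</content>" +
                "</entry>";

            var request = (HttpWebRequest)WebRequest.Create("https://example.invalid/collection");
            request.Method = "POST";
            request.ContentType = "application/atom+xml;type=entry";

            byte[] body = Encoding.UTF8.GetBytes(entry);
            using (Stream stream = request.GetRequestStream())
                stream.Write(body, 0, body.Length);

            using (var response = (HttpWebResponse)request.GetResponse())
                Console.WriteLine("Created: " + response.Headers["Location"]);
        }
    }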

Live Services, which are designed to handle user data and app resources (Live ID, Live Messenger, etc.), are part of the newly announced Azure Services Platform for Microsoft's Web tier. The Azure Services Platform sits on top of Windows Azure, Microsoft's cloud operating system, which was unveiled at PDC on Monday.

Developers can sign up for Live Mesh provisioning and for the Live Framework CTP, which requires Visual Studio 2008, .NET Framework 3.5 and Silverlight 2.


PDC: Microsoft Releases Geneva Beta
Sync Framework: It’s Still at Version 1.0

PDC: Microsoft Outlines Windows 7, Slew of New Dev Technologies

Coming off yesterday's launch of its Azure cloud-based services, Microsoft today articulated how it will bridge PCs and mobile devices with an extraordinary blitz of announcements that included the first demonstration of Windows 7, its Live Framework, a bevy of new offerings for developers including WPF support for Visual Studio, and the first preview of Office 14.

The goal was to show how Microsoft plans to deliver on its long-stated strategy of bridging native clients with other devices and the Web.

"The PC is adapting once again to be even more relevant and more valuable in this era that is so centered on the Web," said Ray Ozzie, Microsoft's chief software architect, in today's keynote session. "It's our objective to make the combination of the PC, phone and Web of clearly more value to our customers than just the sum of their parts."


Steven Sinofsky, senior vice president of the Windows and Windows Live Engineering Group, gave an extensive demo of Windows 7, showing what promise to be well-received tweaks and enhancements to the user interface, including a cleaned-up Taskbar, streamlined access to applications and system resources, and the addition of the Microsoft Touch interface for touchscreens.

The Touch demo showed how the Windows UI senses and adapts to touch commands, and illustrated application-level support in Word, Internet Explorer and a reworked update of MS Paint.

Sinofsky delivered a few mea culpas regarding Windows Vista, which he said was finally up to snuff with the Service Pack 1 release. He addressed shortcomings in Vista's initial compatibility, device support and even shipping schedules.

PDC attendees will leave with a pre-beta version of Windows 7, which Sinofsky urged developers to play with and provide feedback on. A beta will be made available for general download, though Sinofsky declined to provide a specific timeline.

"We finish the one we're working on and that informs when the next one will be done," Sinofsky said. "We think that three years from the general availability of Windows Vista is the right timeframe for Windows 7."

From a user perspective, Windows 7 caters heavily to networked home users, who have traditionally struggled to get multiple PCs and devices to play nicely with each other. Automated Windows 7 client and device discovery promises to expose available devices and their capabilities. It wasn't clear in the keynote what device manufacturers will have to do to make their device functions fully visible and accessible to Windows 7 clients.

Guthrie Takes Charge
Showing how Microsoft will bridge the traditional client with cloud-based services, Scott Guthrie, corporate vice president of the .NET Developer Division, gave an extensive list of announcements that covered Windows 7, Visual Studio 2010, .NET 4.0, WPF, Silverlight 2, ASP.NET and jQuery, among others.

Guthrie announced that attendees would receive a CTP of Visual Studio 2010, the first public release of the upcoming version of Microsoft's flagship IDE. The new version, Guthrie said, was built using WPF and will enable new features such as multi-monitor support, richer code-editing and refactoring support. Richer code visualization is also promised.

The features are built under a new component model called the Managed Extensibility Framework (MEF), which Guthrie said will provide "the ability for you to very easily assemble applications from extensions and to be able to dynamically add extensions from within applications." According to Guthrie, MEF will enable developers and ISVs to plug their own functionality into Visual Studio, a prospect that will enable "really interesting scenarios and a tremendous amount of agility," Guthrie said.
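As a rough illustration of MEF's attribute-based composition model -- shown here with the System.ComponentModel.Composition types MEF later shipped with in .NET 4; the PDC-era preview differed in places, and ILogger/ConsoleLogger are made-up names -- an extension marks itself with Export and a host pulls it in with Import:

    using System;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;
    using System.Reflection;

    public interface ILogger { void Log(string message); }

    // An extension advertises the contract it fulfills.
    [Export(typeof(ILogger))]
    public class ConsoleLogger : ILogger
    {
        public void Log(string message) { Console.WriteLine(message); }
    }

    public class Host
    {
        // The host declares what it needs; MEF wires it up at composition time.
        [Import]
        public ILogger Logger { get; set; }

        public static void Main()
        {
            // Discover exports in this assembly and satisfy the host's imports.
            var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
            var container = new CompositionContainer(catalog);

            var host = new Host();
            container.ComposeParts(host);
            host.Logger.Log("Composed via MEF");
        }
    }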

The .NET Framework got ample attention in Guthrie's presentation, as well. He emphasized performance improvements delivered by the recent .NET 3.5 SP1 release, which Guthrie said provides existing .NET applications a 40 percent boost in cold-startup performance. He also highlighted streamlined installations, both of the .NET Framework itself on unequipped machines, and of individual .NET applications. .NET 3.5 SP1 will be built into Windows 7, Guthrie said.

Looking forward to .NET 4.0, Guthrie promised enhancements to WPF, including the addition of multi-touch and Deep Zoom support. He also called out the addition of dynamic language runtime (DLR) support for C# and other historically static programming languages.

Also announced was the release of the new WPF toolkit that works with .NET 3.5 SP1 across all supported versions of Windows. The toolkit includes new data grid, date picker and calendar controls, as well as support for Visual State Manager, which allows "the exact same control templates that you built with Silverlight for WPF, including support within Expression Blend as well as our developer tools." Visual State Manager enables developers to easily define and tune the behavior of controls in response to specific states.

Guthrie also announced the final release of the open source jQuery JavaScript library for ASP.NET. jQuery is being released under the permissive Ms-PL license.


Sync Framework: It’s Still at Version 1.0
Silverlight 2 Now Available

Monday, October 27, 2008

Spam Attacks on the Rise in Q3

Identity thieves and hackers appear to be coming at Windows users from all fronts, most commonly with spam. Attackers were particularly active during the third quarter of 2008, sending malicious e-mail eight times more frequently than in the previous quarter, according to a report released Monday by Sophos.

The study found that one out of every 416 e-mails in Q3 contained bogus or malicious attachments. This is up significantly from the period between March and June, which tallied just one such e-mail for every 3,333.

"For Apple Mac and Unix lovers, these major spam attacks just mean a clogged-up inbox, not an infected operating system. But organized criminals are causing havoc for Windows users in the hunt for cold, hard cash," said Graham Cluley, senior technology consultant at Sophos, in an e-mail to Redmondmag.com.


It's hard to get a clear read on the number of successful attacks because enterprises don't want to tip their hand and embarrassed users aren't exactly chomping at the bit to report these occurrences. But the Sophos report can't simply be dismissed as an aberration or a marketing tool for anti-virus software; a separate report, also released Monday, appears to confirm the pattern of spam growth.

Enterprise gateway security firm Secure Computing Corp.'s Q3 2008 Internet Threats Report found that spam volumes returned to record highs in the quarter with "fairly steady monthly increases throughout the summer." Over 5,000 new zombies were created every hour, according to Secure Computing's report. Among the most common attacks was the Agent-HNY Trojan, which was responsible for more than 25 percent of all e-mail attachment malware in the quarter.

The EncPk-CZ Trojan is another example of the type of spam that seeps into inboxes (one such malicious virus came disguised as a Microsoft security patch). The Sophos report stated that EncPK strains accounted for 12 percent of all the disclosed e-mail-borne bugs over the past three months.

As for Secure Computing's study, the company said the increase in the spam it tracked was due to malicious e-mails with the subject line "Your bank has failed," "Breaking News," "Delivery Status Notification" or "Election scoop" -- with the latter invariably mentioning the name "Obama." It turns out that 80 percent of election-related spam currently bears the Democratic presidential nominee's name. Secure Computing's TrustedSource Labs estimates the number of worldwide U.S. election-related spam e-mails at approximately 100 million messages per day.

Meanwhile, Sophos' Cluley said that hackers count on end users to "click without thinking, thus exposing themselves to hackers hell-bent on gaining access to confidential information and raiding bank accounts."


Fake Microsoft Security Update Makes Rounds
SQL Injection Attack Hits BusinessWeek Site

Organizations Cutting Back on IT Security Staff

The IT security function, as a percentage of total IT staff at enterprise organizations, appears to be in decline, according to an addendum on security in a research report released this week by Computer Economics.

The findings of the report, titled "IT Staffing Ratios and Trends," were gathered from respondents at 200 IT organizations and covered what the Irvine, Calif.-based research shop identified as 14 key IT functions, including security.

The study found that even as overall IT staffing across other functions -- such as database administration and application development -- is still increasing, staffing levels for security pros have been in steady decline over the last three years, coming in at just 1.5 percent in 2008 (compared to 1.8 percent in 2007 and 2 percent in 2006).


This leads to the question: Is security no longer a top priority in lean times?

"Clearly, there are a variety of reasons for this decline but the lack of a focus on security isn't one of them," said Frank Scavo, president of Computer Economics. "Qualified security personnel are in short supply and security as a mechanism of an IT department inside an organization is becoming less specialized. Plus, you have people in other areas whose duties also include a security element."

Scavo added that if an enterprise does have a staff member or a small group of workers whose sole purpose is the design, implementation and monitoring of a comprehensive security program, then "you don't need a lot of people for that."

For instance, a systems administrator and network administrator who can configure security parameters for the processing environment can, in tandem with an outside consultant or third-party security software, eliminate the need for a full-fledged security staffer.

In that vein, the reality that specialized security personnel must take into account is that even though they are still in high demand, the high cost involved in deploying in-house security professionals in a real-time, 24/7 environment can be prohibitive.

To that end, security software and outsourced security functions are becoming an ever-increasing alternative to hiring actual people -- which may also account for the decline in in-house security staffers.

In the end, though, there's still another level of security involved in, well, deploying security.

"Needs in the enterprise security space are definitely changing," said Kelly Kavanagh, a senior analyst for information security strategy at Gartner. "Yet and still, the IT executive should so some footwork himself. This means logging on and making sure who you're hiring is not a couple of guys with beepers in an office suite. If you can, get references. We are talking about security, after all."


Survey Indicates Strong Demand for Networking Talent
Small Increase Predicted in Demand for IT Workers

PDC: Microsoft Releases Geneva Beta

Microsoft today released the first beta of its federated identity services framework aimed at simplifying the way enterprises deploy authentication services.

The company's Geneva framework is aimed at bringing claims-based federated identity management that extends to individuals, enterprises and online services. The framework allows organizations to deploy various standards-based identity protocols -- including SAML, WS-Federation, WS-Trust and OpenID -- and provides a common exchange across various gateways and security token services (STSs).

Geneva will bring a write-once, run-anywhere model that should relieve developers of the burden of building identity handling into their applications, said Kim Cameron, Microsoft's chief architect of identity and a Microsoft distinguished engineer, speaking at Microsoft's Professional Developers Conference in Los Angeles.


"The question became how we could develop a framework where we could insulate [developers] from this turbulence that was consuming their time with things that were not core to them," Cameron said. "The goal was, you would be able to have a framework where you can write your application once and [have] it work automatically in all scenarios."

Geneva consists of three key components: a server edition, a framework consisting of .NET classes and the Windows CardSpace Geneva client. It was released to beta today, though it's not feature-complete. A full beta release is planned for the first half of 2009, while the company intends to release it to manufacturing by the end of next year.

The company also unveiled the .NET Access Control Service, part of the .NET Services in the newly launched Azure Services Platform, which sits atop Microsoft's cloud-based operating system Windows Azure (until today known as Project Red Dog). It is a service that allows developers to define access controls, Cameron said. The first CTP was released today, and a refresh is due out by year's end.


Microsoft Unleashes Tool For Web Developers

PDC: Microsoft Calls New Cloud Computing OS a 'Turning Point' for Company

At its Professional Developers Conference in Los Angeles this week, Microsoft is unveiling the end-to-end vision for its Software plus Services platform.

On Monday, key executives and partners gave a two-and-a-half-hour keynote on what Chief Software Architect Ray Ozzie termed the "back-end" infrastructure for the company's emerging cloud platform. The biggest news: Microsoft is running a new operating system, formerly code-named Project Red Dog, on the connected servers in its datacenters. The new service-based operating system is called Windows Azure.

"It's the transformation of our software, it's the transformation of our strategy and our offerings across the board to fundamentally embrace services," said Ozzie, who described the new Azure platform as a turning point for Microsoft.


Azure is Microsoft's operating system for the Web tier, he explained, joining Windows Server in the enterprise and Vista and Windows Mobile in what he called the "experience" tier.

"Windows Azure is our lowest-level foundation for building and deploying a high-scale service," Ozzie said, "providing core capabilities such as virtualized computation, scalable storage in the form of blobs, tables and streams and, perhaps most importantly, an automated service management system, a fabric controller that handles provisioning, geo-distribution and the entire lifecycle of a cloud-based service."

Services Platform
Microsoft also announced the Azure Services Platform for developers, which sits on top of Azure. It comprises several components, including Live Services, .NET Services, SQL Services (reporting and data analysis in addition to the renamed SQL Data Services), SharePoint Online and Dynamics CRM Online. The .NET Services, for now, consist of a service bus for connecting on-premises apps to the cloud, access control that enables federation across existing identity providers into the cloud, and workflow that will be extended to cloud services.

Developers will build applications based on service models and patterns using the functionality in the Azure Service Platform and familiar tooling such as ASP.NET and Visual Studio 2008, as well as the new "Oslo" modeling platform.

The Azure platform manages apps separately from the operating system. The platform offers a simulation of the cloud for debugging and testing on the local development desktop. Developers provide Microsoft with the code for the service and the architecture of the service model, and the company provides automated service management.

Early adopters that showed Silverlight applications running on the Azure platform during the keynote and a follow-up session included Bluehoo, which highlighted a social networking mobile app that uses Bluetooth to find people in close proximity with similar interests; and FullArmor, a startup that created a Policy Portal already in use by the Ethiopian government.

First Look at Windows Azure
PDC attendees are getting a first look at the Azure SDK and key platform components, specifically Live Services, .NET Services and SQL Services. The Azure Services Platform was launched at noon PST on Monday.

The first CTP will showcase a "fraction of the features," according to Microsoft. For example, the Azure storage features in the CTP support blobs, queues and simple tables, but more advanced features such as file streams, caches and locks are not exposed in the preview.
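As a rough sketch of the compute-plus-storage pattern those services support -- a worker pulls work items from a queue and writes durable results to blob storage -- the code below uses made-up IQueue and IBlobStore interfaces rather than the Azure SDK; in the CTP those services are reached over REST, so a real implementation would call the blob, queue and table interfaces directly.

    using System;
    using System.Threading;

    // Hypothetical abstractions, not the Azure SDK.
    public interface IQueue { string Dequeue(); }
    public interface IBlobStore { void Put(string name, string content); }

    public class Worker
    {
        private readonly IQueue queue;
        private readonly IBlobStore blobs;

        public Worker(IQueue queue, IBlobStore blobs)
        {
            this.queue = queue;
            this.blobs = blobs;
        }

        public void Run()
        {
            while (true)
            {
                string message = queue.Dequeue();              // work item from a queue
                if (message == null) { Thread.Sleep(1000); continue; }

                string result = message.ToUpperInvariant();    // stand-in for real work
                blobs.Put("results/" + Guid.NewGuid(), result); // durable output as a blob
            }
        }
    }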

Attendees can sign up and request service activation free of charge, and Microsoft will respond with a token within two weeks. After two weeks, the CTP, which supports ASP.NET and .NET languages, will become open to all MSDN subscribers, according to Microsoft. Support for native code and PHP are planned but not in the initial CTP.

"It was a surprise to me that they labeled it Windows," said one developer who works for a Fortune 100 tech company. "It wasn't Microsoft Azure, it was Windows Azure. I see this as a direct competitor to Google Code to some extent and obviously Amazon Elastic Compute Cloud.

"When you think of it, it is physical but I can't see how people can shift their trend reports from the service architecture to the cloud," he continued. "[Microsoft has] shown a Web site that says here's how you run this thing, but we haven't seen a lot of the application lifecycle. So there is a promise and it is very easy to get in, but what happens since then?"

Many developers had questions about more advanced functionality and Azure's usage in production environments today despite its status as an early preview. "Will I be able to run Windows Azure in my datacenter?" one attendee asked. The answer is no, according to Manuvir Das, director of the Azure platform at Microsoft, who said he could not comment on whether that will change.

Microsoft will be unlocking access to new capabilities in the coming months. The Azure and Azure Services Platform roadmap will be determined in part by developer feedback, the company said. Microsoft will not charge developers during the technical previews, although there is a quota for service usage. The business model for the commercial product, expected in the 2009 calendar year, will be based on applications' resource consumption and service-level agreements.


Windows 7 Bits To Be Released at PDC’08
Amazon To Host Microsoft Solutions in the Cloud

Saturday, October 25, 2008

IT Pros List Coolest Security Jobs

Looking for the coolest jobs in IT? A new survey of information technology and network security specialists suggests that the place to look is on the front lines of cyberspace -- and that the variety of work is greater than many might suspect.

The survey was conducted by the SANS Institute, the IT security training and research organization. The results will be published later this year in the form of an information booklet aimed at encouraging students to consider a career in IT security.

The institute asked information security practitioners in government and non-government positions to describe their jobs and the most interesting aspects about working in them. It also asked security specialists to select which jobs they thought were "very cool."


The top-ranking "coolest" IT security jobs according to government security employees:

1. Information security crime investigator/forensics expert.
Why it's cool: "The thrill of the hunt! You never encounter the same crime twice!"

2. System, network and/or Web penetration tester.
Why it's cool: "You can be a hacker, but do it legally and get paid a lot of money!" "The power to understand how systems can be penetrated and misused is something less than 1 percent of people in the entire security industry know, let alone the average citizen."

3. Forensics analyst
Why it's cool: "It's CSI for cyber geeks!" "It's like being one of the good spies on James Bond." "Trying to find evidence without altering the system and maintaining the chain of evidence is challenging."

4. (Tie) Incident response, incident handler
Why it's cool: "This may be the top of the 'top gun' jobs because it lets you move into a cooler, analytical environment where you can go deep with your knowledge." "You get visibility with your organization when they happen." "Like the secret agent of tech geekdom."

4. (Tie) Security architect
Why it's cool: "You get to design the solution, and not just for the perimeter." "You get to work with all the tech experts as a team, to plan the technology directions." "You get to research and play with new 'toys' all the time."

6. Vulnerability researcher
Why it's cool: "You get to tear apart malware and find out how it ticks." "Reverse engineers take a deep look into code segments to determine what is really happening under the hood." "It's a very exclusive club."

7. (Tie) Network security engineer
Why it's cool: "If there's one person indispensable, it's the network person." "This is where the action is and where everything is in a state of flux with newer and newer technology."

7. (Tie) Security analyst
Why it's cool: "This job has influence at the top of the organization." "If you want to make a difference but don't necessarily want all the managerial BS, this is the job for you." "It is the only clear path to the real top gun of security: chief information security officer."

7. (Tie) Sworn law enforcement officer specializing in information security crime
Why it's cool: "Ability to catch the bad guys ... the end result is a rush." "This is where the geeks among us can really show up the jocks." "Security specialist and you get to carry a gun!"

10. (Tie) CISO/ISO or director of security
Why it's cool: "I can get a lot done with little to no push back." "You get to decide where to build the 'watch towers,' how many rangers are stationed in the park, where fires can be safely built, and the rules of engagement."

10. (Tie) Application penetration tester
Why it's cool: "You're an 'ethical hacker.'" "It takes equal parts technical ability and creativity," "Combines applying different thought processes to system analysis with exploration tools, and a sort of dangerous level of knowledge."

The next highest ranking positions:

Security operations center analyst
Prosecutor specializing in information security crime
Technical director and deputy CISO
Firewall/IPS administrator
Security evangelist
Vulnerability assessment analyst
Security auditor
Security assessment consultant
Technical security teacher
Security savvy software developer
Security maven in the application developer organization
Disaster recovery/business continuity analyst/manager

"Of particular interest to me," observed Alan Paller, the SANS Institute's Director of Research, "are the low rankings that government people give the CISOs." The chief information security officer position "ranked much higher in the non-government world."

He also noted the high ranking that non-government people give application penetration testing, "illuminating the fact that the government hasn't yet focused [as much as commercial organizations] on the critical new attack vector of application-based attacks," he said.

The top-ranking "coolest" IT security jobs according to non-government security employees:

1. (Tie) System, Network, and/or Web penetration tester
1. (Tie) Information security crime investigator/forensics expert
3. Forensics analyst
4. Vulnerability researcher
5. Application penetration tester
6. Security architect
7. CISO/ISO or director of security
8. (Tie) Incident response, incident handler
8. (Tie) Sworn law enforcement officer specializing in information security crime
10. Security evangelist





Friday, October 24, 2008

Microsoft Weathers Bad Economy in Its First Quarter

Microsoft yesterday reported a nine percent increase in overall revenue for its first quarter of fiscal-year 2009, which ended on Sept. 30, 2008. Revenue was $15.06 billion compared with $13.762 billion in the previous first quarter.

With securitized bad U.S. housing debt dragging down the world economy, Microsoft's results offered a modest bright spot, but not enough to avoid belt-tightening: the company still plans cost-cutting measures because of lowered IT spending prospects.

The Windows operating system wasn't the growth driver for the quarter, according to Microsoft's Form 10-Q.


"Revenue growth was driven primarily by increased licensing of the 2007 Microsoft Office system and increased revenue associated with Windows Server and SQL Server," the form stated.

Overall operating income increased three percent year over year, reaching $5.999 billion vs. $5.849 billion in the previous first quarter. Diluted earnings per share for the first quarter of fiscal-year 2009 were up seven percent, at $0.48 compared with $0.45 in the year-ago quarter.
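Those growth figures are easy to sanity-check against the dollar amounts reported above. The short computation below is just a sketch, in Python, using the article's own rounded figures; it reproduces the nine, three and seven percent changes.

# Sanity check of the reported growth rates, using the figures quoted in this article.
figures = {
    "revenue ($B)":          (13.762, 15.06),   # year-ago first quarter vs. Q1 FY2009
    "operating income ($B)": (5.849, 5.999),
    "diluted EPS ($)":       (0.45, 0.48),
}

for name, (prior, current) in figures.items():
    growth = (current - prior) / prior * 100
    print(f"{name}: {growth:.1f}% growth")

# Prints roughly 9.4%, 2.6% and 6.7% -- matching the rounded nine, three and
# seven percent figures Microsoft reported.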

Chris Liddell, Microsoft's chief financial officer, said in a Web broadcast on Thursday that "revenue, operating income and earnings per share [were] all meeting or exceeding the high end of our guidance, despite what is obviously a very challenging environment."

Liddell noted a deterioration in spending in the last few months of Microsoft's first quarter and predicted a continued general economic slowdown. In response, Microsoft has adjusted its guidance downward.

In the best-case scenario, company officials are assuming a mild recession and modest growth rate for IT products. At the bottom end, a deeper recession is predicted.

For the second quarter of fiscal-year 2009, Microsoft's stated revenue guidance is $17.3 billion to $17.8 billion. Operating income guidance for the quarter is $6.1 billion to $6.4 billion. Diluted earnings per share guidance is pegged in the $0.51 to $0.53 range.

Liddell predicted that PC hardware demand growth in Microsoft's fiscal second quarter would be around 10 percent to 12 percent. However, growth will be higher in emerging markets, reaching the "mid-to-high teens," he said.

Bill Koefoed, general manager of investor relations at Microsoft, noted that it is too soon to say whether the "netbook" market is cannibalizing the traditional PC market. Netbooks are typically lower-cost portable computers, often running Linux or Windows XP instead of Microsoft's current Vista operating system.

Liddell said that Microsoft was planning to reduce its operating expenses in reaction to the slowing economy. Measures would include lowering Microsoft's head count, marketing costs, capital expenditures and travel costs.

In response to a question from a financial analyst, Liddell clarified that Microsoft "won't have as many people" in the near future but that the job cuts would be "an FY '10 issue." Microsoft's 2010 fiscal year begins in July 2009, so his response suggests that any job cuts would not come before then.

Microsoft continues to incur costs in its Online Services Business Division -- the only part of its business that racked up operating losses for the quarter. The division had an operating loss of $480 million for the first quarter of fiscal-year 2009. In the prior first quarter, the Online Services Business Division showed an operating loss of $267 million. The division is Microsoft's main force aiming to wrest online advertising share from Google via the Microsoft Live Search product.

Microsoft's CEO Steve Ballmer has said that Microsoft would "be willing to lose between five and ten percent of our total operating income for several years" to catch up with its online search advertising competitors, which include both Google and Yahoo.

Microsoft's Client offerings, which include the Windows operating systems (but not the Windows servers), showed only a two percent revenue increase year over year. The Windows revenue gains were partly offset by higher marketing costs ($123 million) and research and development spending ($45 million).

Microsoft's first-quarter financial reports can be accessed here.

Thursday, October 23, 2008

Off-Cycle Microsoft Patch Targets Worm-Like Windows Bug

Redmond on Thursday released a critical out-of-cycle security patch affecting Windows 2000, Windows XP and Windows Server 2003 systems.

The software giant said weaknesses in server service mechanisms within these OSes could allow for remote code execution (RCE) exploits through the use of a "specially crafted" remote procedure call (RPC) request.

RPC technology, first adopted by Microsoft in the mid-1980s, allows a program to execute subroutine code on other computers on a shared network. Because the remote interaction happens almost automatically, without user intervention, a flaw in the mechanism gives attackers a path onto machines across a shared processing environment.
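To make the underlying idea concrete, here is a minimal sketch of a remote procedure call using Python's standard-library XML-RPC modules. It illustrates the general RPC concept only; it is not Microsoft's MSRPC implementation, and it has nothing to do with the flaw being patched.

# Minimal RPC illustration: a client invokes a subroutine that actually runs on a server.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def add(a, b):
    # This subroutine executes in the "server" process.
    return a + b

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(add, "add")
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" calls the subroutine as if it were local; the RPC layer
# marshals the request over the network and returns the result.
proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
print(proxy.add(2, 3))  # prints 5, computed by the server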


An attacker could exploit this vulnerability in the affected Windows OSes and run arbitrary code without authentication. Redmond rushed out the off-cycle patch because the vulnerability could enable self-replicating malware, a "wormable exploit," as Microsoft calls it.

"Based on the number of Windows systems that are potentially exposed to a massive attack, it was in Microsoft's best interest to just go ahead and patch it," said Jon Oltsik, an analyst at Milfort, Mass.-based IT research firm Enterprise Strategy Group. "This exploit that applies to this fix is not in the wild to a great degree but the thinking behind the bulletin was probably, 'why wait.'"

Security experts say that for users running newer versions of Windows, such as Vista and Windows Server 2008, the potential attack associated with this bulletin cannot be anonymous and must use authenticated user credentials to exploit the vulnerability. However, they do warn that this does not mean it's impossible to exploit the vulnerability in a newer Windows OS. It just won't be as easy.

Nevertheless, the common consensus among observers is that IT pros should install the patch now.

"In normal situations, administrators could typically test the patch against their production network to ensure the patch does not break functionality," said Jason Miller, security data team manager at St. Paul, Minn.-based Shavlik Technologies. "But in this situation, enterprise IT workers should patch this vulnerability immediately to their servers and workstations."

It's not often that Redmond issues off-cycle or out-of-band patches. It's done so just a handful of times since 2006. Coincidentally, 2006 was the year a similar patch pertaining to this issue was released. Thursday's patch replaces that September 2006 hotfix.

Because the fix is critical and will require a restart, security pros say IT managers and staff should collaborate to ensure seamless installation and testing. They recommend coordinating with desktop or end-point support personnel, as well as with network administrators and off-site consultants, where applicable.



IBM Sends SaaS With Hosted Messaging Rollout

Big Blue upped its ante in the software as a service (SaaS) space this week with the release of a hosted version of its venerable corporate e-mail software. The new Lotus Notes Hosted Messaging service, announced on Wednesday, is the latest SaaS application from IBM.

Priced at $10 or less per seat, the hosted version of Lotus Notes is expected to compete with Google Gmail and Microsoft's Exchange Online in the medium-to-large e-mail business segment (1,000 to 10,000 users).

IBM's SaaS offerings to date include Sametime Unyte, a Web conferencing service, and Bluehouse, a collaboration service for extranets.


The hosted service uses the Lotus Domino messaging infrastructure to manage Lotus Notes e-mail. Both applications are at the forefront of IBM's foray into "new markets such as iPhone users, social networking and unified communications clients," according to IBM's announcement.

Non-hosted versions of Lotus Notes and Domino have more than 140 million users and are used by "more than half of the largest corporations in the world," the announcement added.

IBM's Vice President Kevin Cavanaugh stressed the benefits of SaaS.

"Software delivery models such as cloud computing, software-as-a-service and hosting are evolving because companies want flexibility," Cavanaugh stated in the release.

Sparks may yet fly in the SaaS market, especially as larger software firms get involved.

For instance, in June, Google's Gmail messaging service displaced an existing premises-installed Microsoft Outlook and Exchange e-mail solution at one of Australia's largest IT networks -- that of the New South Wales Department of Education. A three-year, $9.1 million Google hosted e-mail deal supplanted a three-year, $33 million deal struck with Microsoft.



Wednesday, October 22, 2008

Yahoo To Cut Jobs After Flat 3Q Performance

Yahoo's revenue was up by one percent, year-over-year, according to its third-quarter results published on Wednesday, but its workforce will still get cut by 10 percent.

The company posted $1,786 million in revenue for the quarter, up from $1,768 million in 3Q 2007.

About 1,500 jobs will be eliminated by the Sunnyvale, Calif.-based online advertising and Internet search company. Yahoo's stock finished trading on Wednesday at $12.39 per share, a near low point for the year. The company's stock hasn't been the same since May, when Microsoft called off an unsolicited takeover bid for all of Yahoo.


Yahoo employees getting pink slips must be wondering if it wouldn't have been better for the company to have succumbed to Microsoft's offer, which was announced in late January. After that announcement, Yahoo's stock rode high at $28.38 per share. Microsoft eventually had offered to pay $33 per share for Yahoo before backing out.

While Yahoo resisted Microsoft's overtures, it paid a penalty. Buried in Yahoo's 3Q announcement is a $37 million cost for advisor fees to deal with Microsoft's offer and an alternative deal with Google.

The Google talks are still ongoing, with a plan to share online ad-search revenues that is being reviewed by the U.S. Department of Justice.

For some, Yahoo's proposed 1,500-employee cutback isn't enough. Tech stock pundit Henry Blodget of the Silicon Alley Insider Web site suggested that the company should "cut its workforce by at least 3,000 employees," which would return the company to its second-quarter 2007 size, he wrote.

"Yahoo is fat. It will still be fat after it lays off 1,500 employees," wrote Blodget. "We hope the company does not fall into the death-by-a-thousand-cuts trap that has plagued AOL for years, but it certainly seems headed that way."



VMware Earnings Beat Expectations In a Difficult Economy

Virtualization bellwether VMware announced earnings for the first full quarter under CEO Paul Maritz, the Microsoft veteran who replaced Diane Greene as president and CEO back in July. One of the most closely tracked companies on Wall Street, VMware did not disappoint, reporting third-quarter revenue of $472 million, a 32 percent increase achieved under increasingly difficult market conditions.

"Our revenue for the first nine months of 2008 increased 50 percent compared to the same period last year and now exceeds our revenue for all of 2007," noted Mark Peek, the company's Chief Financial Officer during an earnings call with analysts.

Despite unprecedented and severe recessionary pressures in the U.S. economy, VMware's U.S. revenues grew 24 percent. International revenues grew 42 percent, driven largely by European sales.


VMware's stock had been a high flyer after the company staged what was widely described as the most successful IPO since Google's when it went public last year. But, along with other tech issues, the stock has lost significant value since reaching a high of $125 per share in October 2007, and more recently it has been caught in the downdraft of a protracted stock market plunge, falling below $20.

VMware will face continued challenges into 2009 given Microsoft's entry into the market with Hyper-V, a product that significantly undercuts VMware's ESX hypervisor offering on price. Redmond sharpened the competitive knife further yesterday by releasing System Center Virtual Machine Manager (VMM) 2008, a management offering designed to compete with VMware's vCenter.

Addressing these issues on the earnings call, Maritz said "I can also report to you that during this period, we did not see any major losses to Microsoft. Currently we take Microsoft very seriously and keep our eyes very closely focused on them, [but] Microsoft is still behind in terms of their product roadmap and we do not see them catching up to us until the next 12 to 24 months." Nevertheless, going forward, in addition to competition from Microsoft, Maritz and VMware will also have to contend with the double whammy of severe recessionary pressures and reduced IT budgets.



Bad Times Call for a Security Check, Experts Say

"Disaster breeds opportunity," so the old slogan goes. But when it comes to uncertain economic times, companies should double-check their internal policies and procedures, security experts say.

A key question is whether layoffs, or the prospect of them, will increase the likelihood of information theft by employees. Slavik Markovich, CTO and founder of security firm Sentrigo Inc., advocates data monitoring as a preemptive measure.

"Now would be good time to refresh policy procedures with employees, and monitor their activities, especially those of privileged users," Markovich said. "It is equally important to monitor all access to data that is commonly targeted by insiders, even if seemingly it is not central to the company's business. Because data can be stolen without disappearing, monitoring is the only way to track events in a way that allows a quick response and mitigation."


Security experts pointed to scenarios where lax internal security can be a problem, leading to data theft, sabotage and security chaos. The threat rises with economic instability, according to Tyler Reguly, nCircle's head security engineer, in an e-mail comment.

"Now, however, in the face of deteriorating economic conditions as well as acquisitions and layoffs, the relative importance of insider threats is rising," Reguly wrote.

Data Theft
Unscrupulous people fearing job loss may "take insurance" in the form of stolen data they can sell later. The data could be used for identity theft or poaching customers, or they could bring sensitive information to competitors.

A recent survey by security firm Cyberark quantified the potential threat. Nearly 90 percent of IT administrator respondents admitted they would take sensitive information with them if their jobs were on the chopping block. At-risk information ranged from customer databases to the CEO's personal information. More than 33 percent said they would take the password list granting access to sensitive information, such as financial reports, accounts and HR records.

Sabotage
One tactic seen in the IT community is sabotage carried out to demonstrate an employee's indispensability. Planting logic bombs is one example; it sends the message, "You're the only one who understands this; we need you back at the office."

A different case of sabotage happened this past July. Terry Childs, a disgruntled system administrator working for the city of San Francisco, suspected that he was about to be laid off. In response, he blocked access to the city's IT network infrastructure using a password only he knew, menacing the city's multimillion-dollar wide area network.

Security Chaos
Company mergers often signal impending layoffs, with the aim of cutting "redundant employees" in the name of market efficiency. That makes them fertile ground for security problems.

Cumbersome, drawn-out mergers with long transitional periods can create occasions for data to be mishandled, lost or stolen. Security protocols can be breached under the guise of helplessness or collaboration; for instance, someone may call and ask for help logging into a system.

"There is always risk and while you need to move fast, you also need to consider IT controls and scope very carefully and make sure you have conversion and security controls built into your plan," said Michael Cangemi, a past president of both Financial Executives International and the Information Systems Audit and Control Association.



Tuesday, October 21, 2008

Microsoft Gets Patent on Broadcast Censorship Solution

On-line gamers using profanity in audio chat sessions may have their act cleaned up if Microsoft has anything to say about it.

On Oct. 14, Microsoft was issued a patent for "Automatic Censorship of Audio Data for Broadcast," according to the United States Patent and Trademark Office. The new audio filter goes beyond the "bleep" by actually replacing objectionable words with generally acceptable words or reducing the "undesired speech" so that it is "no longer perceivable by a listening audience," according to the patent.
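The patent itself operates on audio data, but the substitution idea is easy to picture with a toy text-based analogy. The sketch below, with an invented word list, simply swaps flagged words for acceptable ones; it is not the patented audio technique.

# Toy text analogy of word substitution; the real invention processes audio.
REPLACEMENTS = {"darn": "gosh", "heck": "hooray"}   # invented word pairs

def censor(transcript: str) -> str:
    # Replace flagged words; everything else passes through unchanged.
    return " ".join(REPLACEMENTS.get(word.lower(), word) for word in transcript.split())

print(censor("well darn that was a close call"))   # -> "well gosh that was a close call"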

A statement in the patent described the point of the invention, which promises to cut the human censor out of the loop in broadcast media.


"Humans are remarkably adept at identifying words and phrases that are considered unacceptable. However, for certain types of broadcasts, it would be preferable to employ a more automated approach that avoids the need to pay for or provide a human censor to monitor the broadcast…."

The "Big Brother" ramifications of this invention have the on-line reader community up in arms. One comment noted that there's no beep to tell listeners that the censorship occurred.

The patent's description points to audio chat in online games as a possible use for the censorship solution. It notes the difficulty of censoring chats "in massive multiplayer games."

Whether Microsoft will use the new audio censorship technology in any specific applications is not yet clear.

Matt Rosoff, lead analyst for consumer products at the independent research firm Directions on Microsoft, noted in an e-mail that Damon Danieli, the patent applicant, works for Microsoft Research and that the patent was part of a general research program and had nothing to do with a specific product.

"I imagine it could be used during live broadcasts to censor language that the broadcaster doesn't want to be heard," he stated. "The most obvious application would be a Webcast, so I could imagine Microsoft incorporating this technology into the Windows Media platform."

For more information on the censorship patent, go here.



Alcatel-Lucent Beefs Up Enterprise Laptop Security

Enterprise IT professionals concerned about laptops holding proprietary corporate data should be heartened by a new 3G mobile security technology update announced on Monday by Alcatel-Lucent.

The company expanded the capabilities of its OmniAccess 3500 Nonstop Laptop Guardian (OA3500 NLG) product to run on devices that use GSM (Global System for Mobile Communications) high-speed packet access networks. Previously, the product was available in a CDMA (Code Division Multiple Access) version that Sprint Nextel sells to corporate customers.

The expansion will permit the OA3500 NLG to run on more networks internationally.


Both versions of the product use the 3G network to remotely control and secure mobile laptops from the enterprise IT data center. Doing so requires security that's a step above what is available in most other products today, said Dor Skuler, general manager of mobile security for Alcatel-Lucent.

"The types of mobile security solutions today on wireless encrypt the traffic when you're on 3G but don't address the laptop security as a whole and, of course, don't do anything if you're using other interfaces," he said. "Our solution has a server and a card that allows IT to control the laptop 24-7, even when it's turned off or offline. IT can remotely wake up the (PCMCIA) card that's stored on the laptop and wipe it clear, thereby protecting the unit in case it's lost or stolen."

Additionally, the card has GPS capabilities so IT can track and recover stolen or lost devices.

The OA3500 NLG is "a corporate solution that requires a server to be installed in the data center," Skuler said.

The OA3500 NLG solution uses mobile VPN (virtual private network) technology "so when you use 3G or Wi-Fi or a LAN, VPN is automatically established through the card so there is no end user configuration," Skuler added. "A VPN secure tunnel is created from the laptop and terminates on our server."



Sync Framework: It's Still at Version 1.0

Microsoft claimed on Tuesday that it did not release the next version of its Sync Framework on Oct. 13, as reported by some news outlets last week.

Mary Jo Foley's blog noted last week that an earlier version of the framework appeared "to have been superseded by the October 13 ones on the Microsoft Download Center." That change didn't constitute a new version, Microsoft contends.

According to Microsoft's blog, some files were renamed on the Microsoft Download Center to reflect additional hardware support for the framework. The renaming caused the confusion, but the Sync Framework remains at its Version 1.0 release, the blog explained.


Quibbling about the Sync Framework comes amidst heightened expectations about what Microsoft may announce next week at its Professional Developers Conference (PDC).

Microsoft officials have been pushing a "cloud computing" theme for this year's PDC, which begins on Oct. 27. The Sync Framework could play a role in Microsoft's overall cloud computing vision.

The main pronouncements expected at the PDC will be the release of Windows 7 bits and something called a "Cloud OS" by Microsoft's CEO Steve Ballmer, who has not elaborated on what the term means. Others have described the cloud OS as an Internet-based stack that goes by the Microsoft code name "Windows Strata."

Microsoft is planning three sessions at the PDC related to the Sync Framework. In essence, the framework uses metadata to track changes and update data between applications, for both online and premises-based apps.

The Sync Framework supports offline synchronization via Sync Services, which updates ADO.NET-enabled databases. Developers can also use the framework to enable peer-to-peer collaborations using two or more SQL Server databases. If SQL Server 2008 is used, the framework enables change tracking in that database.
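To give a feel for what metadata-based change tracking means in practice, here is a conceptual sketch in Python. It mimics the spirit of the framework described above, not its actual .NET API: each row carries a version number (the metadata), and a sync pass copies any row the other replica has not yet seen at that version.

_clock = 0
def next_version():
    # Simplified global version counter standing in for real sync metadata.
    global _clock
    _clock += 1
    return _clock

class Replica:
    def __init__(self):
        self.rows = {}       # key -> value
        self.versions = {}   # key -> version used for change tracking

    def write(self, key, value):
        self.rows[key] = value
        self.versions[key] = next_version()

def sync(source, destination):
    # Push rows whose tracked version is newer than the destination's copy.
    for key, version in source.versions.items():
        if version > destination.versions.get(key, 0):
            destination.rows[key] = source.rows[key]
            destination.versions[key] = version

office, laptop = Replica(), Replica()
office.write("customer:1", "Contoso")
laptop.write("customer:2", "Fabrikam")
sync(office, laptop)
sync(laptop, office)
print(office.rows == laptop.rows)   # True: both replicas converge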

The Sync Framework sounds a lot like Live Mesh, a device- and platform-neutral data connectivity technology championed by Microsoft's Chief Software Architect Ray Ozzie. Live Mesh, which was unveiled to developers in April, will be the topic of 10 PDC sessions this year, according to a search on the event's Web site.

The relationship between the Sync Framework and Live Mesh isn't altogether apparent, and a question posted on a Microsoft forum to that effect went unanswered on Tuesday. However, a Microsoft Developer Network Channel 9 video describing the Sync Framework noted that it is built on C# and is "the basis" of Live Mesh.

Developers can use the Sync Framework on the Windows platform for free. It can also be used with "other platforms through commercial licensing and porting kits," according to an MSDN page.

Some observers have suggested that app developers have already created their own sync protocols for the most part, and that the Sync Framework simply represents a late addition to the game.



Data Breaches at State, Local Agencies Expose Data about Millions

Data breaches at state and local government agencies exposed the personal information of nearly 3.8 million Americans in the first three quarters of this year, according to the Privacy Rights Clearinghouse.

Most of the exposures came from a single incident in July at the Colorado Division of Motor Vehicles that compromised information on 3.4 million people. But even discounting that incident, the number of records exposed in breaches at state and local agencies outstripped those reported at federal agencies in the same period.

The figures underscore the need for standardized and improved data security at state and local government agencies, said Abe Kleinfeld, president and chief executive officer of nCircle Network Security Inc., of San Francisco.


"I don't think we are seeing an unusual amount of data breaches" at the state and local levels, Kleinfeld said. "The danger is the kind of data they have. It is becoming increasingly important that states begin developing some kind of program."

The company compiled the data on state and local breaches from the Privacy Rights Clearinghouse, which documented 20 breaches through September.

In the same period, the clearinghouse reported five incidents of breaches at federal agencies that exposed the records of 23,024 people. The largest was in May at the Marine Corps Reserve Center in San Antonio, where a contractor improperly accessed and stole 17,000 records. Another incident at the International Visa Service in Atlanta involved an employee's theft of data on 1,000 people.

The lower federal numbers illustrate improvements in the government's data security, which Kleinfeld said can be attributed largely to the standardized processes and controls required under the Federal Information Security Management Act.

"There are a lot of complaints about FISMA, but I think it is hard to argue that security has not improved in the federal government," he said. "It has improved."

States remain vulnerable because there is no similar overarching standard for data or information system security, he said. "We need some kind of program like FISMA that extends across state and local governments," he added.

Imposing a nationwide standard for government data security would be difficult, and it is unlikely to happen in the short run, Kleinfeld said. But FISMA-like requirements could eventually be extended to state and local agencies that administer federal programs or share data with federal agencies.

In the meantime, Sen. Norm Coleman (R-Minn.) introduced a bill last month that could help. S. 3460, the State Cyber Security Protection Act of 2008, would give the Homeland Security Department $25 million to fund a pilot program to support cybersecurity efforts at the state level. The bill has been referred to the Homeland Security and Governmental Affairs Committee.

A program to share best practices among agencies at all levels of government and create cybersecurity templates, even if they are not mandated, would be a big step forward in data security, Kleinfeld said.

Security breaches with exposure of personal data at the state level, as reported by the Privacy Rights Clearinghouse through September, include:

Florida Department of Children and Families -- 1,200 records exposed.
Maryland Department of Assessments and Taxation -- 900 records exposed.
Wisconsin Department of Health and Family Services -- 260,000 records exposed.
Virginia Department of Social Services -- 1,500 records exposed.
Wisconsin Department of Revenue -- 5,000 records exposed.
South Carolina Department of Health and Environmental Control -- 400 records exposed.
Nevada Department of Public Safety -- 109 records exposed.
Utah Division of Finance -- 500 records exposed.
Pennsylvania Department of State -- 30,000 records exposed.
Rhode Island Department of Administration -- 1,400 records exposed.
Oklahoma Department of Corrections -- 10,597 records exposed.
Baltimore Highway Administration -- 1,800 records exposed.
Oklahoma Corporation Commission -- 5,000 records exposed.
Connecticut Department of Labor -- 2,100 records exposed.
California Department of Consumer Affairs -- 5,000 records exposed.
Texas Department of Public Safety -- 826 records exposed.
Florida Agency for Health Care Administration -- 55,000 records exposed.
Colorado Division of Motor Vehicles -- 3.4 million records exposed.
California Department of Consumer Affairs -- 5,000 records exposed.
Pennsylvania Department of Public Welfare -- 2,845 records exposed.
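Summing those figures (with the Colorado incident counted at its rounded 3.4 million) comes to roughly 3.79 million records, consistent with the "nearly 3.8 million" total cited at the top of the article. The short tally below reproduces that check.

# Tally of the breach figures listed above; Colorado is counted at the rounded 3.4 million.
records_exposed = [
    1_200, 900, 260_000, 1_500, 5_000, 400, 109, 500, 30_000, 1_400,
    10_597, 1_800, 5_000, 2_100, 5_000, 826, 55_000, 3_400_000, 5_000, 2_845,
]
print(f"{sum(records_exposed):,} records exposed")   # 3,789,177 -- nearly 3.8 million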

