Tuesday, October 7, 2008

Fun with openSUSE 11.0

Not a single post I make about Ubuntu goes by without at least one of you making some comment about my distro of choice and suggesting that I try some other distro. Well, never let it be said that I don’t listen to you - so this week I decided to take openSUSE 11.0 for a spin.
I started with the 64-bit KDE4 LiveCD version of openSUSE. Unfortunately, I couldn’t get the LiveCD to load on either a physical PC or a virtual PC - each time KWin crashed at startup, threw up an error message, and the system locked up. Rather than getting caught up trying to figure out what was wrong, I abandoned the 64-bit KDE4 version and sent for the 32-bit GNOME openSUSE LiveCD instead.
Check out the openSUSE 11.0 gallery
After a less than promising start, I was expecting more problems with openSUSE, but the 32-bit GNOME version seemed to play well on both physical and virtual systems, so I stuck with this.
After running Ubuntu for a few months it’s hard not to compare openSUSE to it, and the first thing that struck me was how sluggish the openSUSE LiveCD felt compared to every Ubuntu LiveCD I’ve tried. I even went back to an Ubuntu LiveCD to check whether it was just my memory, and no - openSUSE really did feel sluggish. No idea why. That said, a LiveCD is a temporary thing, so I didn’t dwell on its performance too much and just hoped that an actual installation of openSUSE wouldn’t feel as kludgy.
On to the installation.
Installing openSUSE is a snap and I didn’t have any problems. Compared to installing Vista or Ubuntu, the process is no more complicated, although I would go as far as to say that the setup process isn’t as friendly as Ubuntu’s, and it consists of more steps than Vista’s. Given this I’d say that Ubuntu is more friendly to the newbie, but getting openSUSE onto a system shouldn’t be a problem for anyone who has previously installed an OS or a major suite of applications.
Once installed, I was pleased to find that openSUSE had picked up the pace quite considerably and the sluggish kludginess I’d experienced with the LiveCD was gone. openSUSE 11.0 was both snappy and responsive, apart from the first time I ran OpenOffice, which caused things to enter that “swimming through molasses” phase and made me wonder more than once whether I’d locked up the system. I hadn’t, and the feeling passed after a few minutes.
So, what do I think of openSUSE?
Overall, I like openSUSE 11.0. After deciding to ignore my troubles with the 64-bit LiveCD KDE4 version of openSUSE and the slowness of the LiveCD, openSUSE certainly seems like a nice, well-rounded OS. Also, while overall I feel that Ubuntu is more newbie-friendly, openSUSE starts off by being more pleasing to the eye - the green look (to me at any rate) seems more elegant and less scary.
Then there are the GNOME menus. I have to say that after months of using Ubuntu, I prefer the GNOME menu as seen on openSUSE. Maybe as I use the two distros side by side this feeling will wear off, but right now I prefer openSUSE. Come to that, I think I prefer the entire default openSUSE theme over the Ubuntu one.
I’m told that because of Novell/Microsoft ties, OpenOffice as shipped with openSUSE has more features than the stock OO.o shipped with Ubuntu. I need to investigate this further to have an opinion on the matter (although I can say right away that I don’t have an issue with the politics of this deal …).
I still have a lot of investigating to do, however, in the interim I think that if I had to choose between Ubuntu and openSUSE, Ubuntu would be the winner - familiarity is a key factor.
Things I’ve learned …
A few things I’ve learned, in no particular order …
There are other worthwhile distros apart from Ubuntu.
The more OSes you add to the mix, the harder it becomes to be OS agnostic.
The perfect OS is probably mythical …
(I refers to Adrian Kingsley-Hughes)

Wednesday, October 1, 2008

Demo exploits posted for unpatched MS Word vulnerability


A security researcher has released demo exploits for what appears to be a critical, unpatched memory corruption vulnerability affecting the ubiquitous Microsoft Word software program.
The proof-of-concept exploits accompany a warning that the flaw affects Microsoft Office 2000 and Microsoft Office 2003. In addition to the rigged .docs, there are two videos demonstrating an attack scenario that crashes the program.
From the advisory:
An attacker could exploit this issue by enticing a victim to open and interact with malicious Word files.
Successfully exploiting this issue will corrupt memory and crash the application. Given the nature of this issue, attackers may also be able to execute arbitrary code in the context of the currently logged-in user.
Here are the proof-of-concept documents (download and run at your own risk!):

crash-word-1.doc
crash-word-2.doc
crash-word-3.doc
crash-word-4.doc
[ ALSO SEE: Free Sourcefire tool pinpoints hostile MS Office files ]
The SANS Institute issued a warning in its @Risk newsletter, noting that the issue occurs in the way Microsoft Word handles unordered (bulleted) lists.
Successfully exploiting this vulnerability would allow an attacker to execute arbitrary code with the privileges of the current user. Note that, on recent versions of Microsoft Office, Word documents are not opened upon receipt without first prompting the user.
I’ve asked Microsoft for confirmation of this issue and will update this post when I hear from them.
UPDATE: Microsoft e-mailed the following statement on this issue:
Microsoft is investigating new public claims of a possible vulnerability in Microsoft Office. We’re currently unaware of any attacks trying to use the claimed vulnerability or of customer impact. We will take steps to determine how customers can protect themselves should we confirm the vulnerability.
Once we’re done investigating, we will take appropriate action to help protect customers. This may include providing a security update through the monthly release process, an out-of-cycle update or additional guidance to help customers protect themselves.

Gauging the ThinkPad: Before (IBM) and after (Lenovo)


Has Lenovo lost whatever mojo the ThinkPad had? That simple question raised a lot of discussion at TechRepublic and it’s worth pondering. The problem: Gauging Lenovo’s performance depends on a lot of anecdotes with few concrete answers.
As background, John Sheesley asked a simple question: Has Lenovo ruined the ThinkPad? John outlined the history–IBM unloaded its PC unit to Lenovo in 2005–and noted that the latest ThinkPads just don’t seem to have the fit and finish they had before. The questions about Lenovo have popped up before, but are quite current today since I smoked (literally) three T42s–older ThinkPads–on Thursday. The guts of the laptop started smoking, so I have a loaner that will be upgraded to another Lenovo in the next few weeks.

Review: Lenovo ThinkPad X301
The talkbacks were lively and John called shenanigans after a bunch of responses spoke glowingly about Lenovo. He thought he was surrounded by a bunch of Lenovo plants.
Lenovo spokesman Ray Gorman replied:
I believe ThinkPads are just as good as ever and voted accordingly. Although I suspect there are other Lenovo employees who have voted in this forum, I would wager that you have also attracted voters who are employees of our competitors. While in either case, it’s fairly predictable how each would vote, the interesting fact is that there are really only two notebook PC brands that have their own fan forums and passionate enthusiasts. I’m pretty confident declaring that ThinkPad is one of those two.
In many respects, Lenovo isn’t different from any other PC maker–perception is reality. For instance, a lot of folks have had trouble with Dell’s customer service in the last five years. I haven’t had any problems. Obviously, if I get hit with a survey Dell will fare better than someone who was burned.
But the question about ThinkPad quality in the Lenovo era isn’t easy to answer. The data is inconclusive and what you really have is a bunch of folks opining about the ThinkPad when it was part of the IBM empire against the latest from Lenovo.
To settle this score I went to the place that has the most objective historical data I could find: Consumerreports.org, a service I highly recommend. A Consumerreports.org subscription at $26 a year–$19 if you get the magazine–pays for itself many times over.
I perused the Consumer Reports ratings on laptops in articles published in 2004 and 2008. In 2004, the then-IBM ThinkPad did seem to rate higher in the niches covered by Consumer Reports. In an August 2008 story, Lenovo’s ThinkPad X300 scored second to last out of eight laptops in the 13.3-inch model category. Lenovo had second place and fourth place finishes out of five 14.1-inch models evaluated. Among 17 15.4-inch laptops rated, Lenovo had a sixth place finisher (ThinkPad T61), 11th place (ThinkPad R61) and last (IdeaPad).
Bottom line: Your feelings about Lenovo are a crapshoot. Lenovo’s X300 was the second most pricey 13.3-inch laptop behind the MacBook Air, but Consumer Reports dinged it in many areas. But corporate workhorses like the T61 did pretty well in the rankings.
Here’s a look at the 2003 repair history for laptops from Consumer Reports, which published the ratings in September 2004.

And then there’s the repair history for laptops tracked via a Consumer Reports survey from 2003 to 2007. Consumer Reports published it in August 2008.

One notable point: These repair figures are all within a 3 percent margin of error. Is it possible that’s because every PC vendor is outsourcing manufacturing to the same contractor? Nevertheless, Lenovo seems to be carrying the ThinkPad torch at least as well as IBM did.
Thoughts?

Tuesday, September 30, 2008

Nokia to unveil touchscreen phone next week


Reuters is reporting that Nokia will unveil its first touchscreen phone at a media event next week in London.
The phone, whose leaked pictures were posted on this blog earlier this week, is the result of plenty of market pressure since the iPhone’s debut more than a year ago. According to a Gartner analyst in the article, Nokia’s position as the leader of the mobile market has the world waiting for its next move.
That move is reported to happen Oct. 2.
Nokia announced back in July that it would introduce its first touch-screen phone this year, and that it would be sold for a cheaper price than rival touch-screen models in order to tap into a higher-volume market.
Of course, the pressure’s on ever since HTC and T-Mobile introduced the Google Android-powered G1 this week, which retails for $179, slightly cheaper than the iPhone.
Would you buy a touchscreen Nokia phone? And for that matter, will touchscreen smartphones replace standard cell phones altogether? Tell us in TalkBack.

Sunday, September 28, 2008

Highest-Earning Pirates


High seas piracy was the colonial era's version of investment banking. Through good positioning, aggressive go-getters could make millions from global trade and commerce in diverse sectors. They were frequently chased off shore to the Caribbean by angry governments. And in the end, they were sometimes sunk without a trace.
Soon after government-hired pillagers like Hernando Cortes started plundering the new world in 1503, an entire class of sailor realized he could profit by stalking the ships carrying the spoils.

Those riches couldn't be sent using wire transfers. So when Cortes wanted to send a bounty of Aztec gold to Charles V, he had to load it onto ships and sail it across the sea, where men like Jean Fleury were waiting. In 1523, the French privateer fell upon a Spanish treasure fleet--a score that helped him net $31.5 million in present-value dollars over his career, making him the sixth highest-earning pirate of all time.
The highest-earning pirate ever was Samuel "Black Sam" Bellamy, an Englishman who made his bones patrolling the New England coast in the 18th century. By our calculations, "Black Sam" plundered an estimated $120 million over the course of his career. His greatest windfall occurred in February of 1717, when he captured a slave ship called the Whydah, which reportedly held more than four and a half tons of gold and silver. Bellamy, known for his relative generosity, took the Whydah as his new flagship and gave one of his old vessels to the defeated crew.
In second place, with lifetime earnings of $115 million: Sir Francis Drake, a 16th century British privateer who saved England from the Spanish Armada and went on to a profitable life of plunder at the behest of Her Majesty's Government. Fellow Englishman Thomas Tew places third with earnings of $102 million. His biggest score came in 1693, when he pilfered a ship full of gold en route to the Ottoman Empire from India.
Our wealth estimates are based on information gathered from historical records and accounts from 17th and 18th century sources like Daniel Defoe, as well as contemporary historians like David Cordingly. Whenever possible, we used official records of pirates' claims. So when a 1718 North Carolina ledger says wares seized from Edward "Blackbeard" Teach sold at market for 2,500 pounds following his death, that source was trusted above Blackbeard's claims to a magistrate that a great treasure lay in a location known only to him and the devil. By our count, he amassed a total of $12.5 million in loot over his career.
Depletion of fortune due to rum and wenches was not assessed, nor were divisions of treasure among the crew. Plunders were often split in equal shares, with the captain receiving double--not much of a premium for leadership. A good lesson to modern shareholders: The best way to achieve fair compensation and rule out golden parachutes is to have your leaders expecting murderous revolts if they hoard profits.
All money and goods were converted into present value U.S. dollars. Present values were determined using the retail price index developed by the British House of Commons and MeasuringWorth, a research project founded by University of Illinois Chicago economics professor Lawrence H. Officer.
For the most part, pirates didn't make much money, and they certainly didn't save it. The amount of cash they needed to keep on hand to cover their liabilities cut into their fortunes. Crew members who lost limbs in battle could be compensated at a rate of 1,500 pounds per limb. Since infection could easily cause death, the remedy for gunshot wounds was often amputation. With 100 men on a ship, if 10 limbs were lost during battle, that was a 15,000-pound loss, or about $3 million in today's dollars. It's easy to see why pirates tried to take ships without firing a shot. A few drunken sailors getting themselves nicked could negate the entire profit.
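The limb-liability arithmetic above can be sketched in a few lines of Python. Note that the dollars-per-pound factor is back-calculated from this article's own figures (15,000 pounds to roughly $3 million), not an official MeasuringWorth rate:

```python
# Back-of-the-envelope version of the limb-liability math.
# The ~$200-per-pound present-value factor is inferred from the
# article's own numbers, not an official index.

GBP_TO_PRESENT_USD = 200   # assumed conversion factor, back-calculated
RATE_PER_LIMB_GBP = 1_500  # compensation per lost limb

def limb_liability(limbs_lost):
    """Return (loss in period pounds, loss in present-value dollars)."""
    loss_gbp = limbs_lost * RATE_PER_LIMB_GBP
    return loss_gbp, loss_gbp * GBP_TO_PRESENT_USD

pounds, dollars = limb_liability(10)  # ten limbs lost in one battle
print(pounds, dollars)                # 15000 pounds, $3,000,000
```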
Pirates didn't have 401(k) plans, so burying a pile of gold was sometimes the smartest way to save for the future--that is, when they had one. Samuel Bellamy's treasure sank with him off Cape Cod; most of Bartholomew Roberts' fortune ($32 million) was taken after he died in battle in 1722; and Stede Bonnet's wealth ($4.4 million) was absorbed into the South Carolina treasury after his 1718 execution. Jean Fleury's Aztec gold wasn't recovered and was probably spread thin over brothels and saloons from Cuba to France; it's likely been melted down over the last 500 years into gold bars lining national treasuries and formed into wedding rings the world over.
Most pirates died without honor or coin. It was an existence filled with murder, treachery, disease (both tropical and venereal), and it ensured a short life, even by the standards of the day. But for the chance to be rich and unbound from a life of farming or military service, it was an easy choice for many--even if it did come with scurvy.
The 20 Highest-Earning Pirates

Apple as the incumbent against open source G1


Apple’s reaction to the G1 so far, even its more extreme manifestations, is that of an incumbent politician facing a no-hope reformer. (Picture from our fabulous Apple blog.)
Despite all the hullaballoo, Google’s got a phone that doesn’t ship until next month, on America’s modern equivalent of the Dumont network.
Apple’s response to its channel is, in essence, who’s your daddy? The support from potential G1 partners has been tepid, just the usual suspects. Most have been silent.
In a political race, when an underfunded underdog challenges an entrenched incumbent, this is the right move. Don’t pay any attention, don’t engage in debate.
Trouble is this is not politics.
This is business, where you can change your financial vote at any time. The trailing candidate goes away after the election, but a big-pockets developer with real allies does not.
It’s true that the financial meltdown gives Apple an even stronger position. But Google’s stock remains strong enough to finance its planned network build-out.
And if the race gets closer, Apple’s current actions will come back to haunt it.

Is Chrome a security risk?


My lovely bride of 30 years worked from home yesterday, hoping to save our city some gas.
An e-mail came in from her administrator around mid-day which she decided to share with me.
It told all users to shut down Chrome.
The e-mail called Chrome a security risk. It told all users within the company to use Firefox or Internet Explorer, to shut Chrome down.
I don’t know how serious those concerns are. Without identifying my wife’s employer I will say it’s a conservative company, very security conscious, and often proactive.
But this is a good time to ask how well Chrome is doing. Google Analytics says 1 in 40 visits to ZDNet Open Source are now done with Chrome. It’s currently on build 2200, Version 0.2.149.30. (Click the wrench, then the About tab.)
Personally I have noticed that Chrome often crashes Shockwave and Flash pages. Thanks to its redundant tab-based design, whole browser sessions don’t die, but these plug-in crashes are more common than with Firefox.
I have also found that, despite its promise, it pays to shut Chrome down every once in a while and re-start it. The lack of add-ons can be annoying, as when I’m asked for personal information or want to search a page for a word or phrase.
Other reviewers have not been so kind. Some bloggers are already calling it a failure, and the criticism is global in scope.
On the other hand, this open source browser is already being forked, as with a German version dubbed Iron.
This, to me, is good news. It may be the most important news.
It is wrong to evaluate Chrome as you would a new TV show. It is wrong to consider it solely in terms of Google because, like Firefox, this is an open source product subject to the open source process.
But what I think or what any other reporter thinks really does not matter. What do you think? Are you using Google Chrome now? Do you plan to? When? And if not, why not?
(I refers to Dana Blankenhorn)

Slingbox PRO-HD now shipping with HD streaming capability


The Slingbox PRO-HD is now shipping for US$299.99, and you can now enjoy full HD-quality video anywhere you may be traveling. Our own Josh Taylor posted a review of the new unit this morning. Dave Zatz also posted some pics, offers some thoughts on the new Slingbox PRO-HD, and links to a full review on Sling Community. The Slingbox PRO-HD offers HD streaming, multiple inputs, and a built-in digital tuner that allows you to watch TV independently of your cable box, in case someone else is watching your TV at home and you still want to connect to your home system.

My Slingbox Classic is looking quite dated now and I may soon have to pass it along to a family member and get myself an updated Slingbox. I wonder what the video quality would be on one of the new high-resolution Windows Mobile devices like the Touch Diamond or the Touch Pro HD? The difference between the Classic and PRO-HD is incredible and I hope the mobile clients support this improvement too. I just read more details in Josh’s post and see that the upstream won’t support HD outside your home network, so it is really designed to work with the upcoming SlingCatcher more than for remote viewing.
I also was just sent a HAVA Platinum HD unit to test out and am considering the HAVA Wireless HD that looks to have most of the same specs as this new Slingbox PRO-HD. One thing I can’t wait to test out on the HAVA unit is the free mobile clients for S60, Windows Mobile, and the Nokia Internet Tablet. HAVA doesn’t have a Mac client, but there is no Mac HD client yet for the Slingbox either.
These both look like great solutions for placeshifting your video content and now that the new season of shows has started up and I have some fall travel coming I need to get my system up and running soon.
(I refers to Matthew Miller)

Run, don’t walk, and pick up a REDFLY Mobile Companion for $199.95


I wrote up my first thoughts of the Celio Corp REDFLY Mobile Companion back in March, then I bought my own in May and then the price dropped from US$499 to US$399.95 in August. Well, now you can pick this device up for only US$199.95 and IMHO that is a steal for anyone with a Windows Mobile device looking to be productive on the go. I understand this is a “seeding” price that is only good until 31 October and I guess the intent must be to get the device out there and have people talk it up.
Celio keeps working on and releasing device drivers so development is continuing with the REDFLY and I sure hope this major price drop is not any indication of trouble in the near future. I like using my REDFLY on my commute and on business trips and want to see driver support continue for years, along with expanded drivers for S60 and maybe even the Android OS.
I think this is a perfect enterprise device since you can send employees out on the road with it and their phone to give presentations and work on Office documents without worrying about security issues or even damage to the device.
I keep reading this price and am just amazed it dropped down this far. I was happy to pay what I paid for mine and really hope this lower price takes price out of the equation for potential buyers as I want to see continued development of drivers and support for this excellent product. I think at just under US$200 the purchase is a “no brainer” for any Windows Mobile enthusiast or enterprise user.

(I refers to Matthew Miller)

Former Google product manager ‘disappointed’ by T-Mobile G1


Ulf Waschbusch, a former Google Mobile Product Manager and current MySpace mobile employee, says, in so many words, that the HTC-made, Google Android-powered T-Mobile G1 is far from an iPhone killer — in fact, it’s just downright disappointing:
The reason many people see the G1 as ugly and old-fashioned is simply… because it IS! It’s a design unchanged for a while (it’s now available in Zune-brown along with white and black). The hardware itself though went through many iterations I am sure, as it’s top-notch (3G on AWS, GPS, 3MP autofocus camera etc.).
Waschbusch writes that he’s a fan of how the hardware works and Android OS, but that “the G1 Hardware is somewhat…well…dated” in looks, paling to HTC’s own Touch or Touch HD. “I just don’t like the design/looks of the device,” he writes.
Which, for a mass-market product aimed at consumers, might be a big problem. After all, what family truly cares about the ins and outs of Android? They just care about making calls, checking e-mail, taking photos — the typical package. Doesn’t matter who’s behind it or how “groundbreaking” we all say it is.
Gizmodo reports that Waschbusch also expressed his frustrations with other aspects of the G1 in his Facebook status:
Ulf is disappointed but not surprised about the ‘G1′. Where’s the cheap data plan? Where do I plug in my headphones? No video player? How do I get contacts in it?
Precisely my concerns, too, when I played with the device first-hand at the launch event (same goes for Josh). And if consumers don’t really take to it, who cares what us tech-inclined people think?

Monday, June 23, 2008

Green building gear in Gotham


Bright lights in the big city. LEDTronics makes LED lights in all different shapes for commercial clients like casinos and high-end homes.

LEDs are still very expensive for regular household use, but good options are expected to be available in the next several years.

LEDs are very energy-efficient; a string of Christmas tree lights only uses a few watts. And they last a long time. LEDTronics representatives said that their LEDs will last 5.7 years with lights on 24 hours a day, or 17 years if they are on 8 hours a day. In addition, the bulbs are made of plastic and so are less fragile than glass lamps.
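The two lifetimes LEDTronics quotes both work out to the same underlying rated bulb life of roughly 50,000 hours. A quick sketch (the 50,000-hour figure is inferred here from the quoted years, not an official LEDTronics spec):

```python
# Both quoted lifetimes imply a rated life of roughly 50,000 hours
# (back-calculated from the article's figures; not an official spec).

RATED_LIFE_HOURS = 50_000  # assumed rated bulb life

def years_of_life(rated_hours, hours_on_per_day):
    """Years a bulb lasts, given its rated life and daily hours of use."""
    return rated_hours / (hours_on_per_day * 365)

print(round(years_of_life(RATED_LIFE_HOURS, 24), 1))  # -> 5.7 years
print(round(years_of_life(RATED_LIFE_HOURS, 8), 1))   # -> 17.1 years
```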

There are green roofs and "living walls" like this one. Both provide insulation to a building and, in the case of walls, another aesthetic option for building designers.

They can also help manage water, either by cutting down on run-off or by recycling gray water, the water from sinks and baths.

This green wall system from G-Sky is mounted on a frame with a drip-feed water system that can be remotely controlled. A moisture sensor is there to prevent overwatering.

Representatives from G-Sky said they normally grow plants for six months before installing them in restaurants or commercial buildings. These poor plants don't look so happy because they were stuck in there a day before for the trade show.

The gray water filtration system for the G-Sky green walls. The gray box on the bottom right provides an Internet connection for remotely controlling the water-feeding system. Normally, there is a fertilizer feed in here as well.

A new way to heat your home. Radiant heat, where a heating element is placed under floors, has been around for some time and is usually done by sending hot water through tubes placed under the floor.

This is an electric radiant heating system called the Step Warmfloor.

These plastic strips are stapled under the floor and a low-voltage current runs through them to provide heat. The installation is done by an electrician. The company says the system is more efficient than hot water radiant heat and easier to install.

This is a solar panel that's designed to generate electricity from a city balcony rather than a rooftop.

The SolaRail from EPV Solar is encased in glass, weighs 90 pounds, and can generate 42 watts under peak conditions. It's used in a building in the Tribeca part of Manhattan where each balcony has 14 modules. It's a good example of how thin-film solar cells open up possibilities for building-integrated photovoltaics (BIPV).

Another form of distributed generation is combined heat and power systems. This micro-turbine from Capstone Turbine runs on natural gas to produce both electricity and heat. It's 82 percent efficient and has far lower levels of nitric oxide and sulfur oxide compared with diesel generators. The heat can be used for space heating or, with an absorption chiller, for cooling. Several are already installed in New York City buildings, and a few have been tested in buses. It can also be configured to run off of methane from landfills or waste water treatment plants, according to the company.

Why is a carpet company at a green building conference? Because Shaw Floors has designed a carpet to be completely recyclable. The backing under the nylon carpet has a toll-free number that customers can call to get the distributor to take back unwanted carpet. The backing is separated and recycled by grinding it into reusable pellets. The nylon is also broken up into smaller pieces (represented in these vials) for recycling.

All the major components of a solar hot water heating system, which typically has a quicker pay-back period than solar electric panels. The evacuated tubes heat a liquid that is piped to a heat exchanger that transfers the heat to water. That hot water is stored in a tank.

Nanogel from Duo-Guard is a translucent insulator that can be used for walls or skylights.

The Nanogel can be used on its own, or incorporated with Duo-Guard's Illumall, a modular wall system where each panel changes color on a timer.

Cardboard furniture is displayed by the U.S. Green Building Council.

Pfister Energy specializes in building-integrated wind turbines. This vertical axis turbine can be mounted on the ground or a rooftop, or placed on its side.

Free Sourcefire tool pinpoints hostile MS Office files

Sourcefire, the company behind the popular Snort intrusion detection system, has released a freeware utility to help identify potentially threatening Microsoft Office files.

The tool, called OfficeCat, can be used to process Microsoft Office documents (Word, PowerPoint, Excel and Publisher) and determine whether possible exploit conditions exist.

Unlike products that detect attempts to exploit known Microsoft vulnerabilities, Sourcefire said OfficeCat can determine if a file contains hostile content before it is opened.

From the Sourcefire announcement:

OfficeCat provides reference information on discovered vulnerabilities so users can remediate risks. By detecting these hostile files before they are opened, OfficeCat enables users to proactively increase the effectiveness of their security efforts.

…To create effective rules, the VRT conducts ongoing research into Microsoft Office vulnerabilities and will regularly update OfficeCat with the latest vulnerability information.

The command-line utility ships with rules for a total of six Microsoft Office bulletins and about 45 CVE entries related to Microsoft Office vulnerabilities.

There has been a noticeable surge in attacks exploiting critical security vulnerabilities in the Microsoft Office software suite.

In addition to using Sourcefire’s OfficeCat, I strongly recommend that Microsoft Office users run Microsoft Office Update to ensure installations are fully patched.

Windows Mobile 7 phones coming in Q1 2009?

Windows Mobile 7 may be closer than many think.

According to a report from at least one major handset maker, Microsoft is planning to make available the final bits of its next mobile operating-system release in time for them to start selling Windows Mobile 7 phones in the first quarter of 2009. If true, that would seem to imply that Microsoft will release the final Windows Mobile 7 by the end of 2008, in order to give phone makers time to test and preload.

As is the case with Windows 7, Windows Mobile 7 is a forbidden topic. Microsoft won’t talk about planned features, beta dates or how/when/if Windows Mobile phones will become more head-to-head competitors with the iPhone.

(I am wondering whether Microsoft might finally share some Windows Mobile 7 info at its Worldwide Partner Conference in early July, given that Andy Lees, the newly appointed Senior VP of Microsoft’s Mobile Communications business, is on the keynote line-up. If Microsoft really is going to deliver the final Windows Mobile 7 bits later this year, one would think it needs to be evangelizing about it now.)

There have been a few leaks about what Microsoft is planning for Windows Mobile 7 and Windows Mobile 8. Not too surprisingly, multi-touch and gesture-recognition support are on the docket. The user interface for Windows Mobile phones is slated to get an overhaul, making it more consumer-friendly. And, at some point, consumer-focused services beyond Windows Live (things like music and photo management) will find their way onto Windows Mobile devices via Microsoft’s Project Pink and its Danger acquisition.

Until now, the only target date for Windows Mobile 7 I had seen leak was “some time in 2009.” But the Phone Report earlier this week quoted an HTC official saying the company planned to deliver a Windows Mobile 7 phone in Q1 2009 (and, by the way, an Android-based HTC phone in Q4 2008).

From recent executive remarks, it sounds like Microsoft is trying to get Windows and Windows Mobile to be more in sync. Might this mean with Windows Mobile 8 — which Microsoft has told certain folks will be built from scratch — Microsoft might make Windows Mobile a “real” version of Windows, with the same core as Windows client?

Friday, June 20, 2008

Microsoft blames ‘human issues’ for Bluetooth patch hiccup


Microsoft has re-released its critical MS08-030 bulletin for Windows XP SP2 and SP3 users, warning that “two separate human issues” caused a major hiccup with the critical security patch.

The original version of the patch, which corrects a remote code execution flaw in the Windows Bluetooth stack, failed to properly fix the vulnerability for Windows XP users, according to Christopher Budd, a program manager in the MSRC (Microsoft Security Response Center).

[ SEE: Critical IE, Bluetooth, DirectX flaws highlight MS Patch Tuesday ]

Budd said an initial investigation into the hiccup identified “human issues” but he did not elaborate.

After we released MS08-030 we learned that the security updates for Windows XP SP2 and SP3 might not have been fully protecting against the issues discussed in that bulletin. As soon as we learned of that possibility, we mobilized our Software Security Incident Response Process (SSIRP) to investigate the issue.

Our investigation found that while the other security updates were providing protections for the issues discussed in the bulletin, the Windows XP SP2 and SP3 updates were not.

Our engineering teams immediately set to work to address the issue and release new versions of the security updates for Windows XP SP2 and SP3. These are available now and are being delivered through the same detection and deployment tools as the original update.

It’s important to note that this re-release only applies to users running Windows XP SP2 or SP3. “If you’ve deployed security updates for MS08-030 for other versions of Windows, you don’t need to take any action for those systems,” Budd said.

Microsoft has had trouble in the past with faulty security updates, but it's somewhat rare to see a bulletin re-released because the patch missed an entire OS version. The very reason we have a Patch Tuesday release cycle is to avoid situations where IT admins cannot properly prepare for testing and deploying updates.

Having two Patch Days in a month is borderline unacceptable, especially when it involves the “human issues” excuse.

Local root escalation vulnerability in Mac OS X 10.4 and 10.5 discovered


Yesterday, an anonymous reader released details on a local root escalation vulnerability in Mac OS X 10.4 and 10.5, which works by running a local AppleScript that sets the user ID to root through ARDAgent's default setuid root state. Here's how it's done:

“Half the Mac OS X boxes in the world (confirmed on Mac OS X 10.4 Tiger and 10.5 Leopard) can be rooted through AppleScript: osascript -e 'tell app "ARDAgent" to do shell script "whoami"'; Works for normal users and admins, provided the normal user wasn't switched to via fast user switching. Secure? I think not.”

Find out how to fix it.


You've got several possible workarounds: you can remove the ARDAgent application located in /System/Library/CoreServices/RemoteManagement/, or you can follow the visual workaround for the ARDAgent 'setuid root' problem.
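A third option is simply to clear the setuid bit on the ARDAgent binary so it no longer runs as root when launched locally. A minimal sketch, assuming the stock bundle path on a 10.4/10.5 install; the runnable part below applies the same bit-clearing to a scratch file so it can be tried safely on any machine:

```shell
# Real-world mitigation (path assumed from a stock 10.4/10.5 install):
#   sudo chmod u-s /System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/MacOS/ARDAgent
# Safe demonstration of the same bit-clearing on a throwaway file:
tmp=$(mktemp)
chmod 4755 "$tmp"    # 4xxx = setuid bit set, as on the vulnerable ARDAgent
chmod u-s "$tmp"     # clear the setuid bit -- the actual workaround
stat -c '%a' "$tmp"  # prints 755: setuid gone (use stat -f '%Lp' on macOS)
rm -f "$tmp"
```

Note that Apple's own updates may restore the original permissions, so this is a stopgap rather than a permanent fix.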

Moreover, AppleInsider speculates on the potential for abuse:

The effects of malicious code run as root may range from deleting all the files on the Mac to more pernicious attacks such as changing system settings, and even setting up periodic tasks to perform them repeatedly. Not all Macs are vulnerable, however. If a user has turned on Remote Management in the Sharing pane of System Preferences under Mac OS X 10.5, or if a user has installed Apple Remote Desktop client under Mac OS X 10.4 or earlier and has activated this setting in the Sharing preferences, the exploit will not function. Mac OS X 10.5’s Screen Sharing function has no effect on this vulnerability.

And even though the vulnerability can also be exploited via a remote connection under certain configurations, physical security to prevent unauthorized local access is as important as it has always been.

About-face: Apple patches Safari ‘carpet bombing’ bug


In what amounts to a major about-face, Apple has patched the Safari “carpet bombing” vulnerability that led to a Safari-to-Internet Explorer remote code execution combo threat.

After insisting for weeks that the issue was more of an irritant than a security risk, Apple today released Safari v3.1.2 for Windows, warning that saving untrusted files to the Windows desktop may lead to the "execution of arbitrary code."

From Apple’s advisory:

An issue exists in how the Windows desktop handles executables. Saving an untrusted file to the Windows desktop may trigger the issue, and lead to the execution of arbitrary code. Web browsers are a means by which files may be saved to the desktop. To help mitigate this issue, the Safari browser has been updated to prompt the user prior to saving a download file. Also, the default download location is changed to the user’s Downloads folder on Windows Vista, and to the user’s Documents folder on Windows XP. This issue does not exist on systems running Mac OS X.

The bulletin cites Microsoft’s security advisory on the combo-threat discovered by researcher Aviv Raff.

Safari v3.1.2 for Windows, available for Windows XP and Vista, also fixes at least three additional vulnerabilities that could lead to information disclosure and code execution attacks.

One of the other three bugs also describes a combo threat that goes the other way – Internet Explorer to Safari:

Visiting a malicious website which is in a trusted Internet Explorer zone may lead to the automatic execution of arbitrary code
Description: If a website is in an Internet Explorer 7 zone with the “Launching applications and unsafe files” setting set to “Enable”, or if a website is in the Internet Explorer 6 “Local intranet” or “Trusted sites” zone, Safari will automatically launch executable files that are downloaded from the site. This update addresses the issue by not automatically launching downloaded executable files, and by prompting the user before downloading a file if the “always prompt” setting is enabled.

The IE-to-Safari threat was reported by Will Dormann of CERT/CC.

The browser refresh also plugs a memory corruption issue in WebKit’s handling of JavaScript arrays. “Visiting a maliciously crafted website may lead to an unexpected application termination or arbitrary code execution,” Apple warned.

The fourth vulnerability is an out-of-bounds memory read that may occur in the handling of BMP and GIF images.

Finjan uncovers half a gigabyte of stolen data on crimeware servers

Finjan's Malicious Code Research Center has uncovered half a gigabyte of data stolen from U.S. healthcare organizations and from a major airline, hosted on crimeware servers in Argentina and Malaysia.

A representative of Finjan stated:

“Hackers incorporated sophisticated attacks using crimeware toolkits, Trojans, and Command and Control servers to drive traffic from a specific region with specific characteristics. The increase in web attacks is skyrocketing with industry figures that include a growth of more than 200% of web-based malware with an increase of more than 800% in backdoor and password-stealing malware.”

This is obviously a major case of data theft, and I feel it might be the knockout blow that forces our government to start imposing strict compliance laws around security. It would be nice to see something like HIPAA compliance drive organizations that house medical data toward requirements for attack-and-penetration assessments, code reviews, data at-rest/in-transit analysis, etc.

Every time I go to the doctor, I look at their wireless devices, their Citrix connections into legacy apps, etc., and just shudder. Apparently it's even easier than I would have thought, as half a gigabyte of medical records is a large amount. I think you've all seen my numerous comments on airline security, so I won't even broach that this early in the morning.

For the full press release, read below.

-Nate


News Release
Finjan Discovers more than 500 Mb of Stolen Medical, Business and Airline Data on Crimeware Servers in Argentina and Malaysia

In its latest Malicious Page of the Month report, Finjan unveils medical, business and airline data stolen and traded by cybercriminals using targeted campaigns.

San Jose, CA, USA, June 18th, 2008 - Finjan Inc., a leader in secure web gateway products, today announced its discovery of a server controlled by hackers (Crimeserver) containing more than 500MB of premium data. The data included healthcare and business related data, as well as personally identifiable information (stolen Social Security Numbers). This data is part of the premium offering that the cybercriminals operating the Crimeservers were selling to the highest bidder online.

The compromised data came from all around the world and contained information from individuals, businesses, airlines and healthcare providers. The report contains examples of compromised data that Finjan found on the Crimeserver, such as:
Compromised medical related data of hospitals and publicly owned healthcare providers
Compromised business related data of a U.S. airline carrier
Identity theft (stolen Social Security Numbers)

Some of the implications of stolen medical and patient data include: illegal and/or bogus treatments; obtaining prescription drugs for the purpose of selling them; loss of health coverage for the victimized patient; inaccurate records of victimized patients, which could result in incorrect and potentially harmful treatments. Healthcare providers could also face potential HIPAA violations or breach of general data protection legislation.

Finjan’s Malicious Code Research Center (MCRC) detected a Crimeserver operated by cybercriminals who used campaigns to steal data. These campaigns consisted of highly sophisticated attacks, incorporating Crimeware toolkits, Trojans and Command and Control (C&C) servers to drive traffic from a specific region, with specific characteristics.

"This report illustrates the latest development in cybercrime. It shows the business cycle of data collecting and trading by today's cybercriminals. Crimeware infecting PCs is a serious business problem that has far-reaching consequences, such as impacting the security of businesses and patients around the world," said Yuval Ben-Itzhak, CTO of Finjan. "We see that cybercriminals go after premium data that they can trade for substantial profit. The increase in Web-based attacks is staggering. Industry figures include a growth of more than 200% of Web-based malware, with an increase of over 800% in backdoor and password-stealing malware, illustrating that sensitive corporate and medical data are at risk."

According to Finjan, the fact that sensitive business, patient and personal data were compromised in a timeframe of less than one calendar month underscores the necessity for enterprises and organizations to have a comprehensive security technology in place that provides effective protection against these sophisticated threats.

The compromised data and the Crimeserver applications were detected using Finjan’s patented active real-time code inspection technology while diagnosing users’ Web traffic.

The research is described in detail in Finjan’s latest “Malicious Page of the Month” report released today. To download the report, please visit http://www.finjan.com/mpom

Code execution vulnerability found in Firefox 3.0


It’s not all about world records for Firefox 3.0.

Just hours after the official release of the latest refresh of Mozilla's flagship browser, an unnamed researcher has sold a critical code execution vulnerability that puts millions of Firefox 3.0 users at risk of PC takeover attacks.

According to a note from TippingPoint's Zero Day Initiative (ZDI), a company that buys exclusive rights to software vulnerability data, the bug affects Firefox 3.0 as well as earlier 2.0.x versions.

Technical details are being kept under wraps until Mozilla’s security team ships a patch.

According to ZDI’s alert, it should be considered a high-severity risk:

Successful exploitation of this vulnerability could allow an attacker to execute arbitrary code, permitting the attacker to completely take over the vulnerable process, potentially allowing the machine running the process to be completely controlled by the attacker. TippingPoint researchers continue to see these types of "user-interaction required" browser-based vulnerabilities - such as clicking on a link in email or inadvertently visiting a malicious web page.

It looks very much like the vulnerability researcher was hoarding this vulnerability and saving it for Firefox 3.0 final release to make the sale.

In the absence of a fix, Firefox users should practice safe browsing habits and avoid clicking on strange links that arrive via e-mail or IM messages.

There are no reports of this issue being exploited but, if you are worried about being at risk of drive-by attacks, consider using a different browser.

Xohm WiMAX service to launch in Baltimore in September 08

We have been hearing the on-again, off-again story of Sprint, Clearwire, and Xohm for quite some time now, and I had pretty much given up hope of seeing a commercial WiMAX release, even with 4G supposedly right around the corner. According to Sprint CTO Barry West, Xohm will launch in September of this year. Apparently, the first rollout will start in Baltimore, Maryland and then move to DC and Chicago later in 2008.

There are 575 Xohm WiMAX base station sites up and running, with different devices being tested. The devices you will be able to use with Xohm at launch include a Samsung AirCard, a modem from ZyXEL, a ZTE USB dongle, the Nokia N810 Internet Tablet, and selected laptops. I think it would be great to use WiMAX with a device like the Nokia N810, and I look forward to trying it out when Xohm hits the Puget Sound area.

The pricing plans have not yet been revealed, but it may now be only a few months until we get a chance to try out this high-speed network. Then again, I'll hold off getting too excited until I actually hear reports of people in Baltimore using the network.

Thursday, June 19, 2008

Roadrunner: World's fastest supercomputer


IBM once again tops the Top500 supercomputer list, but this time it's with Roadrunner, the first supercomputer to sustain 1 petaflop, or 1 quadrillion calculations per second. Roadrunner connects 6,562 dual-core AMD Opteron chips as well as 12,240 Cell chips and runs open-source Linux software from Red Hat.

Roadrunner resides at the Department of Energy's Los Alamos National Laboratory, where its primary task will be to ensure the safety and reliability of the U.S.'s nuclear weapons stockpile.

Some of the cable used to wire Roadrunner.

Technicians crawling through the floors to hook up Roadrunner.

Wiring the rack from the back.

The First CU Compute Racks (front).

Two IBM QS22 blade servers and one IBM LS21 blade server are combined into a specialized "Triblade" configuration for Roadrunner. A production Triblade.

A schematic view of the Triblade, which consists of two dual-core Opterons with 16 GB of RAM and four PowerXCell 8i CPUs with 16 GB of Cell RAM.

Los Alamos and IBM researchers tested Roadrunner with "Petavision" which models the human visual system--mimicking more than 1 billion visual neurons and trillions of synapses.

Another view from Los Alamos.

Roadrunner at Los Alamos.

A schematic overview of the Roadrunner supercomputer, created for Wikipedia.

Wednesday, June 18, 2008

NVIDIA launches the ‘best performing GPU on the planet’


This morning NVIDIA announced the release of its latest line of graphics cards. The GTX 260 and GTX 280 mark the debut of the new 200-series GPUs. The GTX 280 packs 240 processing cores and 1GB of RAM under its hood, while the 260 claims 192 cores and 896MB of memory. Either way, we're talking serious graphics and physics horsepower.

Where there are eyeball-popping graphics, there's always an eyeball-popping price tag, and these two chips don't disappoint: the 280 runs for $650 and the 260 for $399. Both require two PCI-E power connections, and a huge power supply (perhaps 1,000W?) if you're even thinking about daisy-chaining them a la Scalable Link Interface, or SLI.

Good news, though: this series actually draws less idle power than the company’s last generation of ultra high-performance cards.

What do you think, readers? ‘Best performing GPU on the planet,’ or a lotta horse for too much cash?

How to recover GPcode encrypted files?


Got backups? In response to the security community's comments on the futile attempt to directly attack the 1024-bit RSA key using distributed computing, Kaspersky Lab is now reasonably recommending that affected end users lacking backups of their encrypted data take advantage of data recovery tools:

Currently, it's not possible to decrypt files encrypted by Gpcode.ak without the private key. However, there is a way in which encrypted files can be restored to their original condition. When encrypting files, Gpcode.ak creates a new file next to the file that it intends to encrypt. Gpcode writes the encrypted data from the original file to this new file, and then deletes the original file.

It's known that it is possible to restore a deleted file as long as the data on disk has not been significantly modified. This is why, right from the beginning, we recommended users not to reboot their computers, but to contact us instead. We told users who contacted us to use a range of utilities to restore deleted files from disk. Unfortunately, nearly all the available utilities are shareware – we wanted to offer an effective, accessible utility that could help restore files that had been deleted by Gpcode. What did we settle on? An excellent free utility called PhotoRec, which was created by Christophe Grenier and which is distributed under the General Public License (GPL).
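PhotoRec belongs to the family of file-carving tools: rather than trusting the now-stale file system metadata, it scans raw disk sectors for known file signatures, which is why it can still find files whose directory entries Gpcode deleted. A toy illustration of the idea, using only JPEG's start-of-image (FF D8 FF) and end-of-image (FF D9) markers; real carvers handle dozens of formats plus fragmentation, so this is a sketch of the concept, not how PhotoRec is implemented:

```python
def carve_jpegs(raw: bytes) -> list[bytes]:
    """Toy signature-based carver: find JPEG start/end markers in a raw
    disk image and return the byte ranges between them."""
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"  # start/end-of-image signatures
    carved, pos = [], 0
    while True:
        start = raw.find(SOI, pos)
        if start == -1:
            break
        end = raw.find(EOI, start + len(SOI))
        if end == -1:
            break  # truncated file; a real carver might still salvage part of it
        carved.append(raw[start:end + len(EOI)])
        pos = end + len(EOI)
    return carved

# A deleted file survives as raw bytes until overwritten, so the carver
# finds it even though the directory entry is gone.
image = b"\x00" * 16 + b"\xff\xd8\xff\xe0JFIF-data\xff\xd9" + b"\x00" * 16
print(len(carve_jpegs(image)))  # 1
```

This also makes clear why the advice not to reboot matters: every write to the disk risks overwriting the very sectors the carver needs.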

Find out how to restore files encrypted by the GPcode ransomware by exploiting a weakness in the way the malware deletes the original files, why directly attacking the encryption algorithm was futile from the very beginning, how the malware authors might adapt in the future, and what you can do about it.

As I've already pointed out in a previous post, "Who's behind the GPcode ransomware?", even though they've successfully implemented the encryption algorithm this time, the one weakness in the process remains the fact that the malware authors are not securely deleting the original files, leaving them susceptible to recovery using data carving techniques or plain point-and-click forensics software. If backups are not present, you have to apply some marginal thinking, given that not all of your affected files can be recovered; recovering 500 out of 1,000 is better than recovering none, isn't it? Whatever approach you take, try to adapt to the situation, and don't pay. More info on the StopGpcode utility released by Kaspersky:

To complete the recovery process, we’ve created a free utility called StopGpcode that will sort and rename your restored files. The utility will process the entire disk and compare the sizes of encrypted and recovered files. The program will use the file size as a basis for determining the original location and name of each recovered file. The utility will try to determine the correct name and location for each file, recreating your original folders and file names within a folder called “sorted”. If the utility cannot determine the original file name, the file will be saved to a folder called “conflicted”.

Next to the step-by-step tutorial on using PhotoRec, a data recovery utility, you can also watch a video of the process, or consider using third-party data recovery utilities next to their web based alternatives.

Why was the distributed cracking effort futile in the first place?

Mostly because of the lack of an easily measurable return on investment and of applicability to a real-life situation: the malware authors could simply have started using GPcode variants with new, stronger keys on a per-variant basis. They were also smart enough not to release a universal decryptor containing the private key for all of their campaigns. Instead, before providing a custom-built decryptor to an affected party, they first request the public key used in the encryption process, and only then ship a tailored decryptor that works solely for the files encrypted with that public key. Compared to the majority of malware variants attempting to infect as many hosts as possible, GPcode's currently targeted approach sacrifices some reach to emphasize quality.

How would the malware authors adapt in the future?

According to the author of Gpcode, or at least the person responsible for processing decryptor requests, new versions with stronger encryption are already in the works, including commodity malware features such as anti-sandboxing, polymorphism and self-propagation. This would result in an awkward situation: as it stands, two of the four email addresses used by the authors of GPcode aren't even responding to infected parties, so you can imagine the delays once GPcode starts self-propagating. They would basically end up in a situation where the number of affected people outpaces their capability to provide a custom-built decryptor in a timely manner, even for those willing to pay the ransom.

With the entire GPcode ransomware fiasco slowly becoming a tool in the marketing arsenal of any backup company that can now use GPcode as a fear-mongering tactic, malware-free backups are once again reminding us of their usefulness.