
iTnews: Forgetful mobile devices for careless users

Thursday, October 16, 2008


As a journalist at iTnews:

“Blessed are the forgetful: for they shall have done with their stupidities too,” German philosopher Friedrich Nietzsche once wrote.

The 19th century philosophy seems to have struck a chord with modern-day manufacturers, who tout mobile devices with remote memory erasure capabilities, or that store no user data at all.

Securing mobile data has become an increasingly prevalent issue for businesses in recent times, as notebooks and smartphones gain ubiquity in the corporate world.

Employees who access corporate data remotely and out of office hours may threaten the security of sensitive information in cases of accidental loss or theft of their mobile devices.

IT contractor EDS recently was reported to have lost a portable hard drive containing data on as many as 1.7 million prospective U.K. armed forces recruits.

The incident follows the U.K. Ministry of Defence’s recent admissions to the loss of three hard drives containing details of more than 50,000 members of the Royal Air Force, and the loss of 658 laptops and 121 USB data drives since 2004.

Other storage devices such as CDs and flash memory cards also have resulted in data leakage in the past.

Sensitive MI6 photos found their way to the public last month via a camera that was sold on eBay. Meanwhile, in March, HSBC lost a CD containing the names, life insurance cover levels and dates of birth of 370,000 of its customers, prompting industry criticism of ‘basic stupidity’.

Microsoft has partnered with Nokia, Apple and Sony Ericsson to safeguard sensitive information through technology that wipes clean the memory of missing smartphones.

Dubbed Microsoft System Center Mobile Device Manager 2008, the technology is a function of Windows Mobile 6.1, and allows system administrators to deliver a ‘kill command’ to a lost or stolen mobile device via Microsoft Exchange.

The kill command is executed as soon as the missing device is connected to a cellular or Internet-enabled network, after which a confirmation notice is delivered to the administrator.

Users also are able to encrypt external memory cards, such as SD cards, so that they can only be read by the smartphone. Because the encryption key is stored on the smartphone, an executed kill command would render data on encrypted memory cards inaccessible.
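The effect described here is sometimes called ‘crypto-erase’: destroying the key, rather than the data, renders the ciphertext unreadable. The sketch below is a toy illustration of that principle using a hash-based stream cipher -- it is not Microsoft's actual implementation, which would use a vetted cipher such as AES.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustrative only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data against the keystream; the same call also decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The phone holds the only copy of the key; the SD card holds ciphertext.
phone_key = b"key-stored-only-on-the-smartphone"
sd_card = xor_encrypt(phone_key, b"quarterly sales figures")

# While the key exists on the phone, the card is readable.
assert xor_encrypt(phone_key, sd_card) == b"quarterly sales figures"

# A kill command erases the key; the ciphertext on the card survives,
# but without the key it is computationally unrecoverable.
phone_key = None
```

The point of the design is that the card itself never needs to be in the administrator's possession: wiping a few bytes of key material on the handset is enough to neutralise gigabytes of removable storage.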

Personal data that is not backed up will be lost forever with the execution of a kill command. However, as most corporate data is stored on the Exchange server, it can easily be restored on a replacement device.

Rick Anderson, who is Microsoft’s Enterprise Mobile Solution Specialist, said that the technology is focussed on reducing the chance of corporate data being compromised by reducing the window of opportunity.

He noted that Windows Mobile 6 has attained the Common Criteria EAL2+ Assurance Level after assessment by the Defence Signals Directorate (DSD), and has been accepted for use in the Australian Government's official communications and information systems.

“You certainly have your mobile phone everywhere, so you have to be conscious of the fact that you can lose it, by leaving it on the train for example,” he said.

“Blackberry and us [Microsoft] would probably be the ones to look at in terms of corporate mobile solutions,” he told iTnews.

Meanwhile, Hewlett-Packard (HP) has added to its notebook range the HP Compaq 6720t, which features an embedded, write-protected operating system.

The 6720t is a mobile thin client which, like conventional desk-based thin clients, is designed to provide access to virtual computing solutions, such as blade PCs or virtual desktop infrastructure.

It is a solid-state system with flash memory and no moving parts. No data is stored on the device, so sensitive data is not compromised in case of loss or theft.

“From a thin client point of view, it’s [corporate data] all secured in a data centre,” said Fiona Wright, who is HP Australia’s Market Development Manager for Thin Clients.

“For example, if an agent is on the road and left the device somewhere, their sensitive information would not be lost or compromised,” she told iTnews.

The 6720t was released in July 2008 as the commercial world’s first thin client built for mobility.

However, the fact that it relies on server-based data and applications may somewhat limit users’ movements.

IDC research manager of IT spending, Jean-Marc Annonier, expects a broadband connection to be the minimum requirement for the remote use of thin clients.

“Thin clients are not really good for mobility,” he said. “What you need is a live network connection for the thin client to connect to the server.”

“You can’t do this on a train, definitely, even if you have a 3G connection,” he told iTnews.

“The problem is latency,” he said. “With 3G, you may have one second of latency, [during which time] you can’t see what you’re doing.”

But for businesses whose employees work predominantly from the office, home, or a client’s broadband-enabled premises, mobile thin clients could deliver not only security, but economic benefits as well.

A recent report by IDC found thin clients to reduce hardware and software costs by 87 percent, IT costs by 61 percent, and worker downtime by 49 percent when compared with traditional PCs.

Through reducing the need for hardware upgrades and enabling software updates to be rolled-out centrally on the server, thin clients were found to produce a 466 percent return on investment for businesses that were studied.
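The return-on-investment arithmetic behind such a figure is simple: net gain over cost, expressed as a percentage. The dollar amounts below are hypothetical, since the IDC report's underlying inputs are not given here.

```python
def roi_percent(total_benefit: float, total_cost: float) -> float:
    """Return on investment: net gain over cost, as a percentage."""
    return round((total_benefit - total_cost) / total_cost * 100.0)

# Hypothetical figures for illustration only: a deployment costing
# $100,000 that yields $566,000 in savings returns 466 percent.
assert roi_percent(566_000.0, 100_000.0) == 466
```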

Large organisations are expected to reap the greatest return on investment for thin client deployments. Annonier expects organisations with between 500 and 1,000 workers, using between 300 and 700 devices, to have the most to gain.

IDC estimates that one quarter of Australian organisations already have deployed thin clients, and deployments are roughly evenly split between production and pilot phases.

A further 13 percent of organisations were found to be evaluating the technology, while six percent claimed to have no plans to deploy thin clients. The remaining 56 percent of organisations surveyed were either unable to respond, or unaware of the technology.

“Computer virtualisation, although not a new concept, has come of age,” Annonier wrote in the report.

“The days of the traditional PC and the distributed computing model as we know them are numbered and the trend towards centralised computing is already becoming increasingly evident,” he wrote.

According to HP’s Wright, customer feedback for the 6720t has been positive so far.

She expects mobile thin clients to appeal especially to mobile workers who deal with sensitive information, such as those in the health and finance industries.

“Being a new product, there’s a lot of conversation and a lot of hype,” she told iTnews. “From my point of view, it [the technology’s suitability] really does depend on what the IT manager wants to use the thin client for.”

“I do hear from customers that security is on their minds, and I do believe that mobile thin clients will offer what they are after,” she said.


iTnews: Economic crisis a 'return to normality', ANZ CEO says

Wednesday, October 08, 2008


As a journalist at iTnews:

Global economic uncertainty may be curbing jobs and investments, but according to Mike Smith, CEO of the ANZ bank, ‘good businesses’ should have nothing to fear.

Addressing 850 business delegates at a luncheon organised by the Australia-Israel Chamber of Commerce (AICC) last month, Smith described ‘one of the greatest economic crises since the Great Depression’.

“I’ve lived through -- and survived -- at least seven economic crises,” he said. “Everyone is very skittish, very nervous, and the least bit of information is often taken out of context.”

“I believe we have 18 months to run of this extreme volatility,” he said.

Following the U.S. subprime crisis and the subsequent collapse of major financial institutions, Australian businesses now face a tightening of credit availability and a softening local economy.

According to Smith, the credit crunch could re-establish a divide between ‘good’ and ‘bad’ businesses -- a divide that he said has blurred due to Australia’s strong economic growth in recent times.

“If you think about Australia, for 15 years, we’ve had very good credit growth and access to equity has been extraordinary,” he said. “That has meant that good businesses have been doing well, but so have bad businesses.”

“What we’re seeing is a return to normality. Good businesses will continue to do well, but bad businesses may not,” he said.

“Access to credit should not be a problem for a good project -- if it stacks up, it will work,” he said. “If you’re a good customer of the bank, there should not be an issue. You should have access to credit.”

Event sponsor IBM echoed Smith’s optimism, citing a global expansion strategy that has resulted in 63 percent of its revenue coming from outside the U.S., including 21 percent from the Asia Pacific region, during the past year.

Glen Boreham, managing director for IBM Australia and New Zealand, highlighted recent Gartner research that predicts regional spending to increase by 8 percent from last year, despite the U.S. downturn.

As well as striking recent deals with BHP Billiton, Customs, and QLD Motorways, IBM also has invested $10.8 million in a new IT Services Centre at the University of Ballarat Technology Park (UBTP).

“In the last 12 months, IBM has hired over 1,000 people through acquisitions, contracts, graduate and professional recruitment,” Boreham told iTnews. “Currently IBM employs nearly 15,000 people in Australia.”

“IBM has a proven record of providing high value to clients during turbulent times, which reflects the market's confidence in our strategy and business direction,” he said.

“For IBM in Australia, we are continuing to win major deals and our clients are continuing to look to IT to help transform their organisations by improving productivity through innovation and automation.”


iTnews: Poor IT performance surprises job market survey

As a journalist at iTnews:

Advertisements for IT jobs have fallen 19 percent during the past twelve months, signalling the beginning of a buyers’ market for jobs in Australia, the Olivier Job Index claims.

Of the 385,488 online job vacancy advertisements surveyed in September, those in the IT sector fell 3.27 percent for the month.

September was the fourth consecutive month of decline, and saw the overall Olivier Job Index fall 1.17 percent in the month and 2.03 percent in the year to reach a five-year low.

A year ago, IT was the second largest job sector after Sales and Marketing. It has fallen behind Engineering, Trades and Services, and Building and Construction to be ranked fifth.

“I've been surprised at how poorly IT has performed,” said Olivier Group’s Director, Robert Olivier, noting that IT has experienced the second largest fall during the past year, after the Banking and Finance sector.

“In a year it’s gone from a sellers’ market to a buyers’ market. It’s been a slow, painful decline -- a death by a thousand cuts.”

“I think this is because banking and the public sector are big IT users and demand has fallen. Also, many Australian operations are part of US global businesses and the squeeze is coming from head office,” he speculated.

Although the ‘innovative spirit and drive’ is unlikely to be lost, a decline in the availability of venture capital cash and monetary investment could curb commercial development, Olivier said.

Meanwhile, he expects employers to be gambling on how long the economic slowdown will last by cutting back on temporary staff and contractors, and graduate hires.

Job advertisements for temporary staff and contractors fell 4.7 percent in September and 14.3 percent during the past year. Meanwhile, advertisements for graduate positions fell 2.2 percent in the month and 16.03 percent in the year.

“There’s no point letting people go if you’re going to have to rehire in 12 months or less,” Olivier said, highlighting the security of permanent positions so far.

“But if employers see the problems lasting longer, then we’ll see a rise in the jobless rate,” he said.

For the younger generation of employees and job hunters, there may be tough times ahead, as the falling value of superannuation and investment funds may lead mature workers to defer retirement and, in some cases, return from it.

“When employers become more careful about whom they employ and supply exceeds demand, job tenure becomes a critical factor,” Olivier said. “In the 2001-2 downturn, we saw employers penalising the job hoppers.”

“They [Gen Y] have never experienced a contracting labour market and with all the talk of an aging population and skills shortages perhaps were not expecting it either!”

Olivier said workers should consolidate with one employer, and make sure that they are considered core staff: high-performance and low-maintenance.

He said the global economic slowdown could put an end to part of the widely-touted skills shortage. However, he noted that demand for staff in some sectors will be stronger than in others.

“If the global meltdown gets worse or stays for longer then overall supply will exceed demand and some of the shortage will be over,” he said.

“But it’s also important not to generalise, and to consider matters on an occupational basis,” he said.

“Healthcare, mining and engineering will continue to have shortages, but there will be a lot of pain in financial services, IT and a wide range of white collar professions.”


iTnews: iPhone startup brings fuel price app to Australia

Friday, October 03, 2008


As a journalist at iTnews:

A team of developers from Australia and the U.S. is monetising consumers’ petrol price concerns with the launch of a free-to-use iPhone application.

Dubbed ‘GasBag’, the application is designed to locate the cheapest local petrol station using user-submitted price information and a rich mapping framework.

Since its launch in the U.S. last month, GasBag has attracted about 75,000 unique users across the country. The application is expected to launch in Australia, Canada and the U.K. by the end of this month.

GasBag was conceptualised in May, and its parent company, jamcode LLC, was incorporated one month later.

Although three of the four jamcode founders are Australian, the team chose to focus its initial development efforts in the U.S. due to the popularity of the iPhone and consumers’ sensitivity to fuel prices.

“Everyone here [in California] talks about fuel all the time,” said Mick Johnson, an Australian GasBag developer who currently resides in California.

“We saw that need, and realised that if we could cater to it, we could do very well,” he told iTnews.

The application currently is offered as a free download from Apple’s iTunes App Store and is supported by targeted advertisements through a partnership between jamcode and mobile advertising company Mobclix.

As GasBag develops, jamcode expects to provide improved targeting according to information such as car model, trip destination, how often the user fills up, and users’ advertising preferences.

“Our current modelling shows GasBag to be monetisable to the tune of roughly one to five dollars per user per year, once you take into account how often someone fills up, how many ads they'll see, and how much each ad is worth,” Johnson told iTnews.

“We are currently targeting to hit 400,000 to 500,000 users by the end of 2008 across four countries,” he said.
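Multiplying those two targets together gives a rough sense of the revenue range jamcode is chasing. This is a back-of-envelope sketch; the function name is mine, not jamcode's.

```python
def annual_revenue_range(users: int, low_per_user: float,
                         high_per_user: float) -> tuple:
    """Back-of-envelope annual ad revenue from a per-user yearly value."""
    return users * low_per_user, users * high_per_user

# Johnson's figures: US$1-5 per user per year, at 400,000 users.
lo, hi = annual_revenue_range(400_000, 1.0, 5.0)
assert (lo, hi) == (400_000.0, 2_000_000.0)
```

At the upper target of 500,000 users, the same model would span roughly US$500,000 to US$2.5 million a year.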

Prior to launching in Australia, GasBag developers will be working on locating petrol stations using street directories and public maps, and converting units from gallons to litres and miles to kilometres.
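The unit conversions themselves are straightforward. A minimal sketch using the standard definitions of the US liquid gallon and the international mile:

```python
US_GALLON_IN_LITRES = 3.785411784   # exact definition of the US liquid gallon
MILE_IN_KM = 1.609344               # exact definition of the international mile

def price_per_gallon_to_per_litre(price_per_gallon: float) -> float:
    """Convert a fuel price per US gallon to a price per litre."""
    return price_per_gallon / US_GALLON_IN_LITRES

def miles_to_km(miles: float) -> float:
    """Convert a distance in miles to kilometres."""
    return miles * MILE_IN_KM

# e.g. US$3.40 per gallon is roughly US$0.90 per litre
assert round(price_per_gallon_to_per_litre(3.40), 2) == 0.90
# and a 5-mile trip to the pumps is about 8.047 km
assert round(miles_to_km(5.0), 3) == 8.047
```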

Another challenge could be competition from Google, which recently partnered with petrol price monitoring company Motormouth to provide a similar Web-based application.

However, Johnson views competition from Google in a positive light, noting that GasBag has become the most popular application of its kind on the App Store despite competition from ‘four or five’ similar applications in the U.S.

“I'm very happy to see the Google-Motormouth collaboration,” he said. “For starters, it's a direct validation of the Australian market for GasBag.”

“Secondly, I think their [Google’s] model of receiving data from their own fleet of sources is a good reliable pulse, but I also think GasBag's inherent advantage is the extremely strong user community it's built on. As such, we can update our list of stations, our prices, and our addresses much more rapidly,” he said.

Johnson said the iPhone provides developers with opportunities for innovation through its extra features and the coherence of the iPhone SDK code base.

He expects GasBag to be just the beginning of products and opportunities for jamcode.

“It [GasBag] looked like a good application to start with, as we can now leverage this user base to bring out a suite of applications with similar functionality but for different use cases, such as finding parking lots, ATMs, hotels, etc,” he told iTnews.

“The iTunes store has been great for GasBag -- it really let us provide our application to an extremely wide market very quickly,” he said.


iTnews: Non-equilibrium chips could avoid overheating laptops

Thursday, October 02, 2008


As a journalist at iTnews:

Researchers are re-examining the Second Law of Thermodynamics in a bid to manage heat from laptops and other miniaturised electronics.

As chip manufacturers cram increasing numbers of transistor switches in small areas to make faster, cheaper chips, heat dissipation has become a growing concern.

The Pentium II processor, introduced in 1997, was said to generate more heat than a hot plate, and Intel in 1999 predicted that the amount of heat generated by computer chips would increase linearly as chip sizes decrease.

Should the trend continue, future electronic devices could generate more heat than nuclear reactors and be ‘as hot as the Sun’ within two decades, researchers say.

“Laptops are very hot now, so hot that they are not 'lap' tops anymore,” said Avik Ghosh, who is researching new heat dissipation methods at the University of Virginia's School of Engineering and Applied Science.

“It [heat] is probably the most critical impediment to continued miniaturisation of transistors, which carry out logic operations in computers,” he told iTnews.

Ghosh and his colleague Mircea Stan are investigating a concept known as ‘non-equilibrium Brownian ratchets’ which could revolutionise how heat is dissipated between computer components.

Currently, devices are engineered to operate near thermal equilibrium, in accordance with the Second Law of Thermodynamics, which states that heat tends to transfer from a hotter body to a cooler one.

However, using the concept of Brownian ratchets, which are systems that convert non-equilibrium energy to do useful work, the researchers hope to allow computers to operate at low power levels, and harness power dissipated by other functions.

“The main quest we have is to see if by departing from near-equilibrium operation, we can perform computation more efficiently,” Ghosh told iTnews.

“We aren't breaking the Second Law -- that's not what we are claiming,” he said. “We are simply re-examining its implications, as much of the established understanding of power dissipation is based on near-equilibrium operation.”

But while the physics of non-equilibrium Brownian ratchets has been studied extensively for some time, the concept’s application in a technology context has not.

Ghosh expects to face challenges ranging from proof-of-concept demonstration, to going beyond models to experimental testing, and analysing the practicality, robustness and cost-effectiveness of these schemes.

“Until we do a proper study, we can't be sure whether this method would suffice to address the considerable challenges of heat generation and removal,” he said.

“Our short-term plan is to study this over the next three to five years to see where we end up with non-equilibrium switching, and whether it could offer a solution.”


iTnews: GPS spoofing device developed to thwart spoofing

Wednesday, October 01, 2008


As a journalist at iTnews:

Researchers have developed a portable Global Positioning System (GPS) spoofing device in a bid to explore aspects of civilian spoofing and how such attacks may be thwarted.

By providing researchers with information about which aspects of spoofing are most easily implemented, the project could allow device manufacturers to prioritise their spoofing defences.

Spoofing is a term used to describe the transmission of fake signals that navigation devices accept as authentic signals from GPS satellites.

As GPS devices become more pervasive in home and business use, spoofing could have a serious impact -- for example, through large enterprises such as utility companies that use GPS in their core operations.

“A spoofed GPS device can be tricked into computing the wrong position or time,” said Brent Ledvina, an assistant professor of electrical and computer engineering at Virginia Tech who conducted the research in collaboration with Cornell University.

“This can be devastating for some receivers and a nuisance for others,” he said.

The research was presented at the Institute of Navigation GNSS conference in Georgia last month, after more than a year of equipment building and experimentation.

While GPS spoofing has attracted some U.S. government attention during the past half-decade, manufacturers have yet to address these security concerns, Ledvina said.

“Currently, GPS receiver equipment manufacturers do not have spoofing on their radar,” Ledvina told iTnews.

“[Cornell researcher] Todd Humphreys and I talked with the leading receiver manufacturers at the Institute of Navigation conference … and not a single individual we spoke with was knowledgeable of the topic.”

“That doesn't mean these companies are not interested or concerned, per se, but their representatives at this conference certainly were not,” he said.

Ledvina expects most current GPS receivers to be vulnerable to spoofing attacks, with the yet-to-be verified exception of multi-antenna receivers that use special signal processing.

The researchers have proposed two software modifications that they expect to make receivers less vulnerable to spoofing by changing how they react to signals.

They have filed a provisional patent application for the technology and are in discussion with ‘a couple’ of companies about commercialisation.

However, Ledvina expects the only long-term solution to come from the GPS satellite constellation itself: adding an encrypted authentication signal to its broadcasts.

“The truth is there is no perfect solution for a stand-alone, single-antenna receiver that does not use any type of authentication,” he told iTnews.

“It would most likely take a long time [for authentication signals to be added] because the signal specifications for the broadcast GPS signals are difficult to modify, and satellites on orbit today have a long lifetime.”

“Note that adding an authenticating signal is not a technical issue; it's more of a political issue,” he said.


iTnews: HP collaborates to identify Wi-Fi dead zones cheaply

Tuesday, September 30, 2008


As a journalist at iTnews:

A collaborative effort by HP Labs and Rice University has produced a technique that could lower the cost of identifying ‘dead zones’ in large wireless networks.

The technique enables Wi-Fi architects to test and refine their layouts before a network is deployed.

According to Joshua Robinson, a graduate student at Rice University, there currently is no standard industry practice to identify Wi-Fi dead zones.

“The frequency of dead zones have actually been a huge obstacle to deploying city-wide wireless networks,” he told iTnews.

“Since companies don't advertise how they find dead zones, it's hard to say authoritatively what happens.”

Some providers employ expensive, exhaustive measurement studies that require the network to be tested from every location from which potential users may wish to connect.

Other approaches involve taking a few measurements in an ad hoc fashion and fixing any remaining dead zones after the network is deployed.

According to Robinson, the goal of the new technique is to focus measurement efforts on ‘trouble areas’ that potentially could be dead zones.

The technique identifies locations at which the network should be tested by combining wireless signal models with publicly-available information about basic topography, street locations and land use.

“We develop accurate predictions and use these predictions to avoid spending a lot of measurements in areas that have clearly very good or very poor performance,” he explained.

“This is how we are able to use a small number of measurements to more accurately find a network's performance and identify all the dead zones.”

The research won best-paper honours at the annual MobiCom ’08 wireless conference in San Francisco this month.

By requiring five times fewer measurements when compared with a grid sampling strategy, and ‘far fewer’ measurements than needed for an exhaustive measurement strategy, the new technique could reduce labour and equipment costs, researchers say.

Robinson expects municipalities, companies, and non-profit organisations looking to deploy city-wide wireless mesh networks to benefit most from the new technique.

He cited as an example the Technology-for-All (TFA) network that is being built by Rice University in partnership with a local non-profit organisation to wirelessly provide free Internet access to an under-served neighbourhood in Houston, Texas.

“We currently serve around 4,000 people,” he told iTnews. “Since we do not have a big budget to test the network, techniques to reduce the cost and time involved in finding our dead zones are very helpful.”

Besides the TFA deployment, the new technique also has been tested on Google’s wireless network in Mountain View, California.

When compared with exhaustive measurement studies of both networks, the technique was found to achieve approximately 90 percent accuracy while requiring less than two percent of the number of measurements performed.

In the short term, Rice University researchers will be focusing on extending their research for use in the network planning process.

HP Labs has a definite commercial interest in the project and has been involved in prior deployments in Taipei. However, no plans for commercialisation of the technique have been announced as yet.


iTnews: Beware smartphone data leakage, Marshal warns

Friday, September 26, 2008


As a journalist at iTnews:

The increasing use of Blackberry, iPhone and other smartphone devices in the enterprise could put corporate data at risk, content security vendor Marshal warns.

According to Marshal’s Asia-Pacific Vice President, Jeremy Hulse, companies need to govern the use of smartphones, which enable a greater number of people to access company data from anywhere.

While notebook computers have enabled similar data mobility in the past, Hulse expects the burgeoning smartphone culture to introduce new risks to enterprise security.

“You don’t really pull a notebook out as much as a smartphone,” he noted. “The risk is pulling a smartphone out with friends at a bar, leaving it around, or losing it in a public place.”

Highlighting the importance of financial and strategic data to a corporation, Hulse said businesses should pay more attention to defining and protecting their critical information.

“The level of risk [posed by smartphones] depends on the type of information that people are pushing down to mobile devices, and the locations they are accessing this information from,” he told iTnews.

“They [businesses] have to ask themselves, ‘Do people need to access corporate information on mobile devices?’”

Market pervasiveness and a diminutive size have contributed to the fact that mobile phones and PDAs currently are far more commonly lost, or left behind, than notebook computers.

According to a recent survey by privacy vendor Credant Technologies, a total of 62,000 mobile devices have been left in London cabs during the past six months.

While personal data and identity fraud have been the main worries with lost mobile devices in the past, Hulse expects corporate data loss soon to steal the spotlight.

“It’s only a matter of time, especially with the amount of storage available in new devices,” he said.

Besides instilling a corporate culture of greater care when accessing company data on a smartphone, Hulse suggests the use of technologies such as content filtering, hardware and software locks.

While he could not identify manufacturers of mobile devices that offer particularly good or bad security, Hulse noted that some vendors have collaborated with Microsoft to install technology that wipes clean a device’s memory in case of loss or theft.

Other vendors offer software that provides a standard operating environment across mobile devices and enterprise desktop computers, which could enable organisations to monitor and filter the transfer of sensitive data.

Noting that smartphone technology could benefit employees’ productivity, Hulse said security should not be seen as a barrier to mobility, but an enabler to maximise the benefit from mobile technology investments.

“I think smartphones can actually be really productive, but I think they need to be looked at in terms of security,” he told iTnews.

“The capability [for increased productivity] is there, but training for staff needs to be there too,” he said.


iTnews: In-the-cloud security to grow in economic crisis, Webroot predicts

Thursday, September 25, 2008


As a journalist at iTnews:

The economic downturn could give the software as a service (SaaS) model an edge over traditional security software provision, Webroot believes.

In Sydney this week to speak at Gartner’s IT Security Summit, Webroot CTO Gerhard Eschelbeck highlighted the difficulty of managing an evolving threat landscape amid staff shortages and tightening budgets.

While e-mail security has dominated the spotlight previously, Webroot research indicates that the Web has been a greater source of threats in recent times.

The research also found that more than half of Web security decision makers feel that keeping security products up-to-date is challenging, and almost 40 percent believe their companies devote insufficient resources to Web security.

“Nowadays, the threat landscape is changing in so far that the variants of malware is exploding compared to five years ago,” Eschelbeck said, noting that malware variants in circulation today are five times as numerous as those five years ago.

“It is clear that the traditional applications are reaching some of their physical limits,” he told iTnews.

Compared to traditional, on-premise security applications, Webroot’s SaaS offerings provide businesses with the ability to outsource the burden of security.

Companies’ e-mail and Web traffic is filtered through Webroot’s data centres in Australia to detect and remove any malware.

Webroot has invested approximately $1m in its Australian operations, including a newly-launched data centre in Sydney and eight support staff across Sydney and Melbourne.

According to Charles Heunemann, Webroot’s Managing Director of Asia Pacific Operations, the company has provisioned for ‘significant growth’ in the region.

The Sydney data centre currently is used to only a ‘single digit percentage’ of its capabilities, he said.

“What we’ve got is a situation where we’ve come from an early adopter stage to a wider use of SaaS,” Heunemann said of the growing market for in-the-cloud security applications.

The rise of SaaS was said to be a result of economic pressure to deliver more value through IT and a trend towards outsourcing ‘non-core’ applications.

In the Asia-Pacific region, the market for SaaS is experiencing a Compound Annual Growth Rate (CAGR) of 44 percent, Heunemann said, compared to a CAGR of 13 percent for on-premise software.
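To put those growth rates in perspective, a quick back-of-the-envelope projection shows how sharply a 44 percent CAGR outpaces 13 percent over a few years. The base market sizes below are hypothetical placeholders for illustration, not figures from Webroot:

```python
# Illustrative only: compound two hypothetical markets of equal size today
# at the CAGRs quoted in the article (44% for SaaS, 13% for on-premise).
def compound(base, cagr, years):
    """Project a value forward at a constant compound annual growth rate."""
    return base * (1 + cagr) ** years

saas = compound(100.0, 0.44, 5)       # hypothetical base of 100 units
on_premise = compound(100.0, 0.13, 5)

print(round(saas, 1))        # ~619.2 after five years
print(round(on_premise, 1))  # ~184.2
```

On these assumed numbers, the SaaS market more than sextuples in five years while the on-premise market does not quite double.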

Webroot estimates that in-the-cloud security applications currently have a market penetration of 8 percent in Australia, 4 percent in Asia and 25 percent in the U.K.

“The genesis of security technology tends to be in the U.K.,” Eschelbeck said, expecting Australian adoption to reach similar figures ‘before long’.

According to Eschelbeck’s research into the ‘Laws of Vulnerabilities’, the SaaS model could reduce the threat posed by current Web-based malware by narrowing organisations’ window of exposure.

“The Laws of Vulnerabilities is to do with how quickly organisations are patching their systems,” he explained. “There is a physical limit to how quickly companies can patch, so there is always a window of exposure of five to nine days.”

“The huge advantage that it [SaaS] brings is moving the responsibility [of patching] from the organisation to the provider,” he said.

By providing a specialised security service to multiple organisations, Heunemann noted, service providers such as Webroot also gain economies of scale and greater visibility of the overall threat landscape.

Webroot is also able to provide its customers with some level of insurance, by guaranteeing a minimum malware capture and detection rate.

“Compared to the traditional model where you’re looking at deploying an on-site application to protect against malware, the SaaS model essentially is a pay-as-you-go option that scales very well linearly, as you grow,” Eschelbeck said.

“Typically, IT departments are not overstaffed -- I’ve never heard of overstaffing being an issue for these departments -- so the advantage [of SaaS] is taking the burden [of security] away from end users,” he said.


iTnews: Ruxcon hacker conference opens arms to security pros

Wednesday, September 24, 2008


As a journalist at iTnews:

Community-organised hacker conference Ruxcon is aiming to attract a ‘more diverse field’ of attendees to its annual event in November.

Now in its sixth year, Ruxcon is expected to bring together some 350 vulnerability enthusiasts from across Australia.

Since its launch in 2003, the conference has evolved from a technical, specialist event to one with a broader security focus, according to conference organiser Chris Spencer.

“The focus is going to be a little more professional this time around,” Spencer told iTnews. “We want to start attracting a more diverse field of security professionals.”

“I don’t think there is much of an Australian hacking community anymore; the security industry has commercialised vulnerability research, so that there just isn’t a vibrant hacking community,” he said.

Spencer compared Ruxcon to high-profile hacker conferences such as Defcon and Black Hat in the U.S., describing Ruxcon as a hobbyist, community-driven event.

Since its inception, the conference has been organised by the same group of volunteers, all of whom have day jobs in the Australian security industry.

When not organising Ruxcon, Spencer works as a vulnerability researcher. Other organisers include consultants and security administrators.

Ruxcon 2008 will be held at the University of Technology, Sydney from 29 to 30 November, and costs $60 to attend.

Presenters hailing from Australia, New Zealand, Italy and France will discuss topics such as how Graphics Processing Units (GPUs) can be used by malware, and heap exploitation theory in Windows Vista.

Despite the vulnerabilities and methods that will be discussed at the conference, Spencer notes that Ruxcon has not encountered any resistance from vendors to date.

Nor has previous years’ sponsorship by vendors such as Google and VeriSign’s iDefense affected presentations such as ‘Google Hacking’, which will be discussed this year.

“When they [Ruxcon presenters] present on these topics, they are doing it from a research background and not a malicious standpoint,” Spencer said of the risks of revealing vulnerabilities.

“As a whole, it’s [Ruxcon] just about putting on a demonstration of the talent we have in Australia,” he said.
