Tuesday, July 7, 2015

Computer Weekly European User Awards 2015 winners revealed

Ten innovative IT projects were crowned winners of the Computer Weekly European User Awards 2015 at a ceremony during Cloud World Forum.
The Computer Weekly European User Awards honour IT professionals across Europe who have excelled in their approach to using technology. Some 130 entries were narrowed down to a shortlist of 50 finalists.
Brian McKenna, business applications editor at Computer Weekly, presented the winners with their trophies at the conference, which took place at the Olympia National in London on 24 June 2015.
The Computer Weekly European User Awards announcement was part of a joint ceremony in which McKenna also announced the winners of the Cloud World Series Awards and Navispace awards.
The winners for 2015 are:
Best private sector project of the year
MTGx, entered by Load Impact.
Best datacentre project of the year
European Bioinformatics Institute, entered by Delphix.
Best mobile computing project of the year
Natural Resources Wales (NRW).
Best public sector project of the year
Home Office, entered by Equal Experts.
Best networking project of the year
Osborne.
Best storage project of the year
Amatis Networks, entered by Tintri.
Best security project of the year
University College London Hospitals NHS Foundation Trust, entered by Nexthink.
Best cloud project of the year
The Open University.
Best big data, BI and analytics project of the year
British Gas Smart Metering (BGSM), entered by QlikView.
Best business applications project of the year
Dwr Cymru Welsh Water.
MTGx, entered by Load Impact
In 2013, MTGx acquired the rights to broadcast the 2014 Winter Olympics throughout Sweden. As Sweden is highly competitive in winter sports, MTGx expected a significant amount of traffic to its various sites, as well as to the live stream offered on os.viaplay.se.
Load Impact was commissioned to help execute a series of load and performance tests prior to the start of the Olympics.
Judges said: “MTGx tested several worst-case scenarios for the Winter Olympics, including massive usage spikes, and identified bugs that could have severely impacted the performance of the site. The entry demonstrates the importance of testing way beyond normal thresholds.
“The nature of internet usage today means traffic spikes cannot always be predicted and so everything needs to be tested, re-tested and optimised to ensure sites remain running.”
European Bioinformatics Institute, entered by Delphix
Genomic research is one of the most exciting areas in science, and the UK is proudly playing a leading role with the European Bioinformatics Institute (EBI) in Cambridge.
Set to improve efficiency of treatments and cut costs, genomics will usher in a new age of personalised medicine that focuses on treating patients not just by their current illness, but their specific genetic makeup. However, the technological implications of this data-intensive research on datacentre infrastructures are only just being understood.
When the first human genome was sequenced, it took 10 years and cost almost £10bn. Now it can be done in days, if not hours, and costs under £650. However, as this area of life science develops, the amount of data it generates and demands is growing exponentially.
The EBI is part of the European Molecular Biology Laboratory (EMBL) and operates as a non-profit organisation providing freely available data from life science experiments to support researchers in academia and industry.
EBI already holds more than 50 petabytes of data in three datacentres in the UK, and this volume is doubling every year. At that rate, in just five years' time it could hold well over 1.5 exabytes of data – or 1.5 million terabytes.
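That projection is easy to verify: doubling annually for five years multiplies the current volume by $2^5 = 32$:
\[
50\,\text{PB} \times 2^{5} = 1{,}600\,\text{PB} = 1.6\,\text{EB}
\]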
To take control of this explosion in data and the accompanying infrastructure, EBI employed Delphix’s data-as-a-service (DaaS) platform. Having already modernised its computing through server virtualisation, revolutionising the data architecture in its datacentres was the next step to cope with the data onslaught.
Using the Delphix DaaS platform, EBI began a project to virtualise the 500 databases that hold the metadata for genome datasets. This data can be provided as a service to its various teams, enabling each developer to create their own environment on demand without affecting the datacentre processing power or increasing storage requirements.
Judges said: “This is a really interesting project. Not only has a non-profit organisation had to tackle the costly issue of big data analytics and storage, but the project is innovative and forward-thinking.”
Natural Resources Wales (NRW)
The Natural Resources Wales (NRW) ICT team has delivered a complex programme of change and unification of IT systems, merging three government bodies into a single entity by consolidating the functions of the Countryside Council for Wales, Environment Agency for Wales and Forestry Commission for Wales, along with certain Welsh Government functions.
Judges said: “This mobile computing project stood out for its ability to save costs and drive the business forward. A well thought-out project and well-presented entry, which is a reflection of a pioneering team.”
Home Office, entered by Equal Experts
Equal Experts supports the Home Office in delivering the visa exemplar system, called the UKVI Online Application Service.
The Immigration Platform Technologies (IPT) programme is delivering the technology and information to support the immigration service now and in the future. It is delivering three digital technologies:
UKVI Online Application Service: A single online application process for all visa and immigration services.
Case-working Tool: A modern, resilient case-working tool for the Home Office.
Immigration Identity Assurance System: An integrated platform providing a single view of customers and their history.
Judges said: “The Home Office is a deserving winner for such a forward-thinking and creative project. Equal Experts shows originality and freshness in its ideas, and the entry makes clear what the organisation saves as a result.”
Osborne
With an annual turnover in excess of £320m, Osborne employs around 1,000 staff and several thousand sub-contractors working on up to 90 sites across the UK. Osborne's clients span both the private and public sector, including critical national infrastructure such as rail, road, health and education projects. The company has been an early adopter of technologies such as building information modelling (BIM), thin client and virtualisation to help streamline the construction process.
IT access for construction sites offers many benefits, but has in the past proved problematic to deliver. Osborne originally covered the site connectivity gap by providing 3G dongles, 3G routers or wired broadband. Osborne found a great deal of inconsistency in this approach and turned to Onwave.
Onwave’s managed service takes advantage of any available access method to improve the performance, reliability and flexibility of IP connectivity. As an internet service provider, Onwave delivers a complete managed service comprising hardware, software and maintenance, and will deliver traffic from each site through its own network, directly into the clients’ datacentre.
Judges said: “A well thought-out project that displays clear savings. The entry is well put together and shows Osborne to be innovative, original in its ideas and committed to getting it done right.”
Amatis Networks, entered by Tintri
Amatis Networks, a UK-based end-to-end network systems provider, was in need of more suitable storage technology. It determined that a scalable private cloud deployment was the best approach for its customers' environments.
One of its customers experienced operational problems on a daily basis. The customer's virtual machines (VMs) were crashing during peak times, and the existing infrastructure was unable to deliver the availability needed.
Amatis wanted the ability to easily deploy virtual applications and VMs through a web interface and provide continuity of configuration across all sites. Customers were also asking Amatis to take responsibility for the day-to-day management, security and alerting for new environments.
Taking a simple, appliance-based approach, Amatis identified the Tintri system as the best choice for the customer's environment. The system would give Amatis an exceptional level of support and engagement from the smart storage.
Judges said: “The savings this project makes reveal what an imaginative and resourceful team has been created through the Amatis Networks and Tintri partnership.”
University College London Hospitals NHS Foundation Trust, entered by Nexthink
Nexthink’s user IT analytics helped deliver tactical and strategic benefits to University College London Hospitals (UCLH) in the first year. UCLH was looking for greater intelligence on the performance and utilisation of IT assets across its managed technical estate to help improve efficiency and plan for longer-term strategic investments.
In its first year, Nexthink helped the ICT department become more proactive in its approach to the management of incidents, security and rationalisation of the estate, and enabled it to become a more intelligent customer.
Judges said: “This security project is an excellent example of security supporting the business through providing strategic and tactical benefits at the same time as improving information security.
Many information security experts are encouraging organisations to build a capability to monitor, manage and proactively respond to security issues before they become a threat to the business, which is what Nexthink has achieved in this project.”
The Open University
The Open University Production Portal is a Microsoft Azure cloud-based application used to manage the production of the thousands of hours of audio/video (AV) content the Open University (OU) creates each year for its students and the general public. Every year, more than 10,000 pieces of AV content are commissioned and produced for the university by a large number of external production companies.
Prior to the project, the production process in the OU was ineffective: it promoted duplication of effort, resulting in lost man-hours, and could not adequately track which third-party assets were used in the AV content.
The answer to these problems was the brainchild of the OU's Licensing and Acquisitions department. The Production Portal project was launched in late 2013, with the aim of being up and running in time to manage the production of the 2014-15 academic year's content.
Judges said: “The Open University had a well-written entry. The project it embarked on was more comprehensive than some of the others. Rather than simply signing up for a software-as-a-service offering, the OU's project showed innovation and creativity.”
British Gas Smart Metering (BGSM), entered by QlikView
British Gas Smart Metering (BGSM) was founded in 2010 to roll out smart meters to British Gas customers across the UK. So far, the company has installed more than one million meters in homes across Britain, and it plans to install 16 million by 2020.
The logistics behind scheduling engineers to travel and access customer homes and fit the smart metering equipment is a complex process, especially with so many customers to serve. The company decided to invest in its business intelligence and data to help streamline the process and deploy meters more effectively. Smart meters are designed to help consumers monitor and use energy more efficiently and will revolutionise this industry.
Following three months of proof-of-concept activity, QlikView was chosen as the platform to deliver the performance view and an element of self-service analysis, while a Hadoop data lake platform was used for data storage and consolidation.
As part of the delivery, the data lake now holds more than nine billion records available to users of the QlikView dashboards delivered by the project, and it consumes data from multiple sources, including more than 150 SAP tables from that source system alone. On average, the QlikView dashboards process files with more than four billion records and fetch 45 million records daily. Using QlikView, British Gas can quickly pull data together into a single application.
Judges said: “This project is a fine example of a truly big data analytics project, with QlikView pulling in data from big and traditional data sources. The QlikView dashboards process files with more than four billion records and fetch 45 million records daily. It has given the management team a ‘single version of the truth’. Smart metering is important to reducing energy usage, and the project has been seen as a global pioneer.”
Dwr Cymru Welsh Water
Dwr Cymru Welsh Water is the fourth largest company in Wales. The company is responsible for providing more than three million people with a continuous, high-quality supply of drinking water and for taking away, treating and properly disposing of wastewater.
The billing system is vital to the smooth running of the business, as it issues bills to the value of more than £800m to customers each year. It holds the key data that enables the company to respond successfully to nearly 800,000 customer calls and to process more than 1.1 million meter reads per year.
In 2012, the company took the decision to replace its legacy mainframe billing environment with a modern suite of applications. The Newid programme was created to manage this substantial implementation. (Note: Newid stands for New Income Delivery - the key aim of the programme is to accurately bill customers for their water and sewerage charges. Newid also means “change” in Welsh.)
Dwr Cymru Welsh Water took the decision to act as the system integrator. After an intense procurement process, the product RapidXtra (from Echo Managed Services) was selected as the core billing engine.
Judges said: “Dwr Cymru Welsh Water is responsible for providing more than three million people with clean water. Its Billing Replacement Programme (Newid), beginning in 2012 and going live in January 2015, was a large, complex change management programme. Welsh Water was its own systems integrator.

“The programme came in on time and on budget (£33m). Data quality was improved by 44% and technical operating costs were reduced by £400k per year, with a greater range of tariffs now available to help customers.”

Security Think Tank: Making the most of logs with SIEMS

Most of our hardware and software generates logs which can be used for review in an information security context. But as we know all too well, the extent of technology sprawl in a business presents a whole raft of challenges, which extends to log management as well.
There is not just one log to deal with, there are hundreds – maybe thousands – that we may be interested in reviewing; everything from Windows to firewalls to servers generates them, which makes collecting and studying the vast amounts of data in logs incredibly difficult, especially when insights are needed quickly.
Currently, many logs are generated and then ignored, as resources (or skills perhaps) to review and analyse them in a timely and useful manner are lacking.
This sounds like a big data problem, and it is. How can we make sense of all this data and put it to good use? How can we know what data to collect, and what aspects to focus on? This is where security information and event management (SIEM) tools come in. These tools offer an automated way to tie together all the log data generated by the network and its security tools, then condense it into something manageable.
SIEM tools are a practical way to enable security teams to detect, respond to and prevent incidents in a fast-moving, data-heavy environment. They provide a way to detect anomalies and attacks on a network by comparing current traffic with the average in real time. Notifications can then be sent to security personnel to respond and rectify the issue.
This functionality can be extended to automate actions – if the SIEM detects an abnormally high amount of traffic leaving a PC (a symptom of exfiltration attacks), it can learn this pattern of traffic and automatically stop it if the issue is detected again in the future. This process can be completed much more quickly than a human could manage and is an improvement to the overall security programme.
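As a rough illustration of the baseline-comparison idea described above – not any particular SIEM product's algorithm – the sketch below keeps an exponentially weighted moving average of outbound bytes per host and flags readings far above it; the host names, traffic figures and threshold are all invented for the example.
```python
# Minimal sketch of baseline anomaly detection, as a SIEM might apply it
# to outbound traffic volumes. All data and thresholds are illustrative.
ALPHA = 0.3        # smoothing factor for the moving average
THRESHOLD = 3.0    # flag readings more than 3x the learned baseline

baselines: dict[str, float] = {}

def check(host: str, outbound_bytes: float) -> bool:
    """Update the host's baseline and return True if this reading is anomalous."""
    baseline = baselines.get(host)
    if baseline is not None and outbound_bytes > THRESHOLD * baseline:
        return True  # anomalous: do not fold the spike into the baseline
    # Normal reading: blend it into the exponentially weighted average.
    baselines[host] = (outbound_bytes if baseline is None
                       else ALPHA * outbound_bytes + (1 - ALPHA) * baseline)
    return False

readings = [("pc-17", 1_200), ("pc-17", 1_100), ("pc-17", 950),
            ("pc-17", 48_000)]  # sudden spike, e.g. possible exfiltration
for host, volume in readings:
    if check(host, volume):
        print(f"ALERT: {host} sent {volume} bytes, well above its baseline")
```
A real deployment would key baselines by time of day as well as host, since "normal" traffic at 3am differs from normal at noon, but the blocking decision follows the same compare-against-learned-average shape.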
Log management and SIEM tools have huge potential to make the lives of security staff easier, but they inevitably affect user privacy. Every device that generates logs has an IP address or MAC address that can be traced to a user, depending on the identity and access management (IAM) system. Security departments have the ability to go extremely deep into the data, so the practicalities must be balanced with privacy.
Ultimately, if you are monitoring your networks for security purposes, the best thing you can do is tell all your users, in any agreements they sign, that you are collecting data relating to their activity for security purposes. You may wish to remind users via pop-ups when they connect to the internet, access business apps and use collaboration tools that you are monitoring and collecting data. We have to be able to analyse and use log data and associated user data to have sophisticated security tools that can sit on the front line of a business's defences; otherwise there is no point.
Logs can play a useful role in information security, and the advent of big data and automated analysis tools has increased their utility. The key is to set out what log management will deliver for the function, then plan for that delivery to happen, whilst ensuring that privacy considerations are addressed. 

Adrian Davis is managing director for Europe at (ISC)2.

Friday, July 3, 2015

Let’s get real about decryption, says GCHQ tech director

Perceptions that GCHQ is capable of mass decryption of data are not based on physical realities, according to the UK intelligence agency’s technical director Ian Levy.
“The best estimate for the number of bits of work to decrypt just one 1,024-bit SSL [secure sockets layer] session is 2^80 (1,208,925,819,614,629,174,706,176) and with the best processor we have today requiring 10 nanojoules per instruction, that works out at 3.4 terawatt hours per 50 minutes, which is the entire power generating capacity of the UK,” he told a Digital Security BT Tower Talk in London.
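As a quick sanity check on the arithmetic in that quote (taking 2^80 operations at 10 nanojoules each, and 1 TWh = 3.6 × 10^15 J):
\[
2^{80} \times 10\,\text{nJ} \approx 1.21 \times 10^{24} \times 10^{-8}\,\text{J} \approx 1.21 \times 10^{16}\,\text{J} \approx \frac{1.21 \times 10^{16}}{3.6 \times 10^{15}}\,\text{TWh} \approx 3.4\,\text{TWh}
\]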
Levy said while it would be “awesome” to be able to do that, there needs to be more “honesty” in the discussion around GCHQ’s decryption capabilities.
He said that unfortunately all the speculation around the decryption capabilities of intelligence agencies has created the public perception that good encryption does not work properly in protecting information.
“This perception drives weird behaviour and undermines trust in technology," said Levy. "But even if an intelligence agency wants to break encryption, it is a great deal more work than anybody realises because the laws of physics still apply.”
Levy said that for a number of years the UK has had legislation – part three of the Regulation of Investigatory Powers Act (Ripa) – under which anyone suspected of a crime who holds an encrypted device can be served with a notice requiring the contents to be made intelligible, on penalty of two to five years' imprisonment, depending on the type of crime involved.
“But what we are struggling with at the moment is the encryption of over-the-top services because providers are behaving like nation states in protecting their users, thereby blocking security and law enforcement officers from accessing the channels being used by criminals and terrorists,” he said.  
On the topic of the planned Investigatory Powers Bill aimed at giving police, security and law enforcement agencies greater access to communications data, Levy said people should consider how this would work and what it would enable.
He said that, for example, if a young girl is sexually abused after being groomed online, police would be able to ask mobile phone providers which mobile phones were located in the area where the child was picked up at the time she was picked up, and which of those phones made connections to Facebook at the same time as the child, which would narrow the field of inquiry down from everybody in the country to a relatively small number.
“That is what the bill is about – it has nothing to do with content; it is all to do with metadata, and if we bring honesty to the discussion, we can start to address the issue of trust, skills and other problems in tackling this properly,” said Levy.
It is up to parliament, he said, to set limits and codes of conduct, adding that the Anderson report on the proposed Investigatory Powers Bill is “a great starting point for what a modern interception security framework should look like”.
According to Levy, the security industry has failed to give users, particularly young people, the information they need to make informed judgements when using technology. “People need to have a better understanding of the potential long-term impact of sharing their data, such as how it could affect insurance premiums in the future,” he said.
Levy also pointed out that the data the Investigatory Powers Bill seeks to access is a “lot less rich” than the data social media companies collect about their users. “It requires an act of parliament and a code of practice to determine what can and cannot be done with communications data, but social media companies are able to change their terms and conditions whenever they like,” he said.
Levy added that most people who use internet services do not read the terms and conditions, and are therefore working under an incorrect assumption of privacy on the internet.
“Google made $57bn last year, and most of the services they offer are free. So somehow they are getting money out of their users, and that is by aggregating huge amounts of metadata about their users, which can be used to make money,” he said.
Levy said a paper by US academic Paul Ohm demonstrates, using anonymised search results published by Yahoo, that population-scale anonymisation does not work. Ohm was able to identify many of the people behind the search results by combining the Yahoo data with public Netflix review data.
“At its heart, the internet economy is fundamentally incompatible with privacy,” he said.
On the topic of cyber security, he said while attackers are extremely capable and are targeting people in clever ways, they too are bound by the laws of physics, which means if they are going to compromise a machine they need to be able to connect to it, and if they are going to compromise a person they need to be able to manipulate that person to do something.
“In general, a lot of what is going on is mass, untargeted stuff – and we can defend against that, and in some cases it is a trivial thing to do,” said Levy.
Typically, when companies have to admit they have been compromised, they tend to describe the attack as “unprecedented” and “sophisticated”, but he said many of these attacks can be traced back to a spearphishing email that tricked someone inside the organisation into doing something to let the attackers in – an attack technique that has been used for the past 20 years.
“If the attack involves a vulnerability that has been patched by the vendor but the patch has not been applied by the victim, then that attack was entirely defendable, and if an organisation has a system that has been designed in such a way that someone clicking on a malicious email link can give an attacker access to the organisation’s entire credit card database, to my mind they deserve everything they get. That’s business risk management gone wrong and cyber Darwinism wins,” said Levy.  

BT Security chief predicts big challenges ahead, despite progress

Cyber security is not all doom and gloom because there is some “good stuff” going on and progress is being made, according to BT Security president Mark Hughes.
“Cyber security is a big opportunity, not just a threat and although it’s an arms race with constantly increasing capabilities on both sides, by systematically applying security controls we are raising the bar,” he told attendees of a Digital Security BT Tower Talk in London.
Hughes added that it is “nonsense” to say that cyber attacks are an “insurmountable problem” because there is a lot that businesses can and should do to protect themselves.
However, he said businesses should be aware of the potential security threats that are on the horizon, such as those posed by coming 5G mobile networks.
“Early deployments of 5G networks are delivering gigabit throughputs with just milliseconds of latency, bringing them very close to the performance of fixed-line networks,” said Hughes.
“This opens the door to the internet of things (IoT) and will make things like driverless cars a reality, but it also means we will have to rethink cyber security to ensure the integrity of connections and transactions on networks that are likely to piggy-back off domestic broadband connections,” he said.
Huawei, a major player in the Chinese mobile market, believes 5G will provide speeds 100 times faster than 4G and will increase network scalability to support hundreds of thousands of connections.
Hughes predicts that 5G will bring big disruptive changes, and with those changes will come big challenges for information security.
Cloud computing also presents a security conundrum, he said, because a lot of the applications being used in the cloud are already 10 to 15 years old and were architected for systems that ran in a different way to how they run now.
“We are also facing a long period of transition in which organisations are still going to be running mainframe systems as well as cloud systems, where some systems are on-premise while others are off-premise and maybe even running in different jurisdictions, which is going to be challenging from an information security point of view,” said Hughes.
Another challenge, he said, is finding people who know the right questions to ask when it comes to deriving benefit from big data: “We are only just beginning to understand what we can get out of big datasets, which makes data science probably one of the biggest challenges.”
BT’s experience as the telecommunications provider for the London Olympic and Paralympic Games 2012, said Hughes, demonstrated the importance of harvesting information in real time and correlating that into a form that data analysts can use to identify malicious activity.
“This is something that needs to be done in addition to all the traditional means we have used to protect networks and we are seeing organisations currently working to attain this capability,” he said.
Despite the progress that has been made in cyber security, Hughes said many organisations are still not clear about what their most important data assets are, or where they are stored and how they are protected, and consequently are still trying to protect everything rather than focusing on what is crucial.
“This is a common mistake that many businesses make, but instead they should be looking to understand and value their assets, and to understand the risk appetite of the organisation to ensure the right assets are protected properly,” he said.
Another common mistake, said Hughes, is the belief that there is some “big hand of government” that can fix this, while in reality most of what needs to be done, and the improvements that need to be made, are vastly distributed across industry and different jurisdictions.
“Organisations need to have a way of sharing information about threats, and there is a common purpose in ensuring that we can collectively understand how those things are coming down the line to attack us so we can defend against them, which is fundamentally different to the way we have done things in the past,” he said. 
At the same time, Hughes said, many organisations need to get busy doing basic stuff like ensuring that all their software systems are patched up to date.
“If we do that and if we raise the bar systematically, then we will collectively get to a point where we will be able to benefit from all the great things the digital world can deliver,” he said.

Hacker tries to hold Plex video streaming service to ransom

Holding data for ransom payable in bitcoin is a growing trend, according to security researchers, and video streaming service Plex is the latest target, prompting a password reset.
The most common attacks use ransomware such as CryptoWall that encrypts company data and demands payment for the decryption key or demands payment to hold off business-crippling distributed denial-of-service (DDoS) attacks, a tactic used by a gang known as DD4BC.
In the Plex case, the attacker gained access to the server hosting its forums and blog, and threatened to publish IP addresses, forum private messages and email addresses if a ransom was not paid.
The hacker demanded payment of 9.5 bitcoin (£1,500) by 3 July 2015, saying the ransom would thereafter go up to 14.5 bitcoin (£2,500).
Plex has refused to pay the ransom and alerted its users to the breach. The firm is also requiring affected users to reset their passwords as a precaution.
According to Plex, the passwords were salted and hashed. This means random data (a salt) was added to each password before it was scrambled with a one-way cryptographic function (a hash), so only the scrambled version was stored.
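As a minimal sketch of the general salt-and-hash technique – Plex has not published its actual implementation – the example below uses Python's standard-library PBKDF2 function; the passwords and iteration count are illustrative only.
```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Salt and hash a password; only (salt, digest) is ever stored."""
    salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest from the candidate password and compare safely."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```
The per-password salt means identical passwords produce different stored values, and the one-way function is what an attacker would have to reverse engineer – hence Plex's precautionary reset.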
However, there is always a risk that given time the hacker could reverse engineer the passwords, which is why Plex has opted to reset affected passwords.
Plex said that its forums will remain offline until the investigation into the intrusion is completed, but that all other systems are operational. The firm said no payment information is stored on its servers.
Plex advised users to ensure that they were not using the same password elsewhere and to use a password manager.
Independent security advisor Graham Cluley said this is good advice. “If you re-use passwords it only takes one website to be hacked for you to suffer a world of pain,” he wrote in a blog post.
Password managers, he said, enable users to store their passwords securely and generate unique, complex, hard-to-crack passwords.
However, Cluley was critical of Plex's decision to embed a clickable link to reset passwords in its email advisory to users. “That's precisely the kind of trick used in phishing attacks,” he wrote.
Cluley said giving in to blackmail is never a good idea because there is no guarantee that the extortionist will not ask for more money.
“Instead, invest the money in better security – and perhaps either patching your software, or getting a solution which is more capable of defending itself against future attacks,” he wrote.

ICO reports progress in data protection, but funding remains a concern

The past year has been one of progress in data protection and freedom of information – but funding continues to cause concern, according to the latest annual report by the Information Commissioner’s Office (ICO).
“It’s thirty years since this office was established in Wilmslow. We’ve seen real developments in the laws we regulate during that time, particularly over the past year,” said information commissioner Christopher Graham.
He cited as an example the Court of Justice for Europe ruling on Google search results, saying the case could never have been envisaged when the data protection law was established.
“Our role throughout has been to be the responsible regulator of these laws. More than that, we work to demystify some of this legislation, making clear that data protection isn’t to be seen as a hassle or a duck-out, but a fundamental right.
“A good example of that is our role in the data protection package being developed in Brussels. We’ve been asked for our advice, based on our experience regulating the existing law, while we’ve also provided a sensible commentary on proceedings for interested observers.
“That role will continue this year, in what promises to be a crucial twelve months. The reform is overdue, but it is vital that we get the detail right on a piece of legislation that needs to work in practice and to last.”
The information commissioner reflected on the tenth anniversary of the Freedom of Information Act, which was implemented in January 2005.
“It is striking to see how decisions that were so hard fought in the early years have resulted in routine publication of information. Publication of safety standards of different models of cars, for example; or hygiene standards in pubs and restaurants; and surgical performance records of hospital consultants. Publication is now expected and unexceptionable.
“It’s been the ICO’s job to help public authorities to comply with requests.
“The ICO’s role has led to information being released that time and time again has delivered real benefits for the UK. Our annual report is our claim to be listened to in the debates around information rights. It shows the ICO knows what it is talking about."
Graham highlighted the strengthening of the ICO’s regulatory powers to show how the legislation continues to develop. In the past year, the ICO was given powers to compulsorily audit NHS bodies for their data handling. Companies' practice of forcing a prospective employee to make a subject access request for their spent criminal record, for example, was also made an offence.
“The long wished-for commencement of the offence of enforced subject access (section 56 of the Data Protection Act (DPA)) enables the ICO to tackle the abuse of this important right. No longer can employers get round the legal safeguards by forcing would-be employees to prejudice their own privacy in return for a job,” Graham said in the report.
He said a change in the law made it easier to issue fines to companies behind nuisance calls and texts. The report showed that, of the £1,078,500 in monetary penalties issued by the ICO in the past year, £386,000 – nearly 36% – was levied on companies making nuisance calls or texts, while there was an 11.4% rise in the number of related reports, to 180,188.
The report shows that, while the number of data protection reports the ICO received fell just 3% compared with the previous year to 14,268, the value of monetary penalties fell by more than 45% – reflecting the ICO’s emphasis on helping UK organisations improve data protection, rather than punishing them for shortcomings. The ICO reported an increase in the proportion of complaints that were resolved informally to 22%, up from 19% the previous year.
According to the report, the ICO answered 195,431 helpline calls, conducted 41 audits of data controllers and 58 advisory visits to SMEs, responded to 1,177 information requests and recorded 4.9 million visits to the ICO’s website.
Despite the progress and the challenges that lie ahead with the EU data protection reforms, Graham said the ICO still awaits a solution to the problem of how best to fund its operations in future.
The report highlights as an “area of uncertainty” possible reductions in income for freedom of information work, given the government's focus on deficit reduction.
Funding has been a central theme in the previous two annual reports with the information commissioner consistently expressing concerns about funding for the ICO in the long term.
For the past six years, the ICO has faced a reduction in its funding for freedom of information work and notification fees for data protection.
The proposed EU data protection reforms will remove the notification fee that funds the ICO’s work under the Data Protection Act.
In response to these changes, the ICO has called for a new method of funding, and last year’s report called on parliament to establish a single, graduated information rights levy to fund the ICO.
In last year’s report, the information commissioner called on parliament to “strengthen the commissioner's powers, enable the adequate resourcing of the ICO, and guarantee the commissioner's independence”.
However, while still expressing concern about the uncertainty of funding for the ICO, Graham said this year’s accounts reflect the welcome agreement from the Ministry of Justice (MOJ) allowing the ICO greater flexibility in accounting for non-frontline costs between its data protection income from registration fees and the grant-in-aid which pays for the freedom of information work.
Commenting on ICO finances, Chris McIntosh, chief executive of ViaSat UK, said that, while the ICO's net expenditure fell 32%, this year's report suggests it is operating against the limits of its financing.
“If we are to ask the ICO to take greater action against those breaking the Data Protection Act; to be able to monitor and audit organisations as it feels necessary; and to have greater power to enforce data protection best practice, it is clear this funding needs to increase,” McIntosh said.
According to McIntosh, with greater resources the ICO might have been able to perform audits amounting to more than 1/40th of the number of data incidents investigated. “In an ideal world, we would see the ICO performing more audits and having to investigate fewer incidents – but it seems that is still some way off,” he said.
However, McIntosh noted that, while the value of financial penalties levied by the ICO almost halved compared with the previous year, the final amount paid to the ICO and its consolidated fund after reductions and appeals was not nearly so greatly affected, dropping by just 13%, from £872,000 to £757,000.
“After last year, where more than half of the consolidated fund’s supposed income was eliminated, this can be seen as a serious improvement. This is mostly down to no appeals to punishments being brought, which could suggest that the ICO is being smarter about how it picks its battles and not pursuing cases that could result in a costly and ultimately counter-productive appeal,” he said.
“For an organisation that needs to consider its budget, this is the wisest course of action: We can only hope that, in the future, greater resources will allow the ICO to pursue tougher cases as well."

Security Think Tank: Guidelines to enable security to get the most out of log management

Organisations collect a mountain of logs each day. This includes logs from servers, firewalls and intrusion detection systems, events from network infrastructure devices, such as routers and access gateways, and from various software and hosted services. Information is often scattered across systems, as departments set up their own log management tools, creating many different hiding places where they store log data.
The need for compliance means IT administrators are required to manage their log management systems in a co-ordinated manner so that any unauthorised activity can be spotted. For example, the PCI Data Security Standard (PCI DSS) specifically calls out the need for log review and the importance of tying identity to activity.
Ensure whatever log management tool you use is installed and managed correctly, so it monitors the events and data that matter and the reports actually have value to your organisation. Use log data to work out what has happened during an “outage”.
All of the information necessary to work out what is happening, or has just happened, can be found in the log files. Systems that allow staff to write and run reports in real time, based on outage information, deliver the facts needed by response teams to understand what is happening on the network.
If we can use software to collect this information and display it in a meaningful way, analysts can make informed decisions as to the seriousness of a log event in a matter of seconds, and their ability to detect and respond to harmful events improves dramatically. 
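A toy sketch of that idea, condensing raw log lines into an at-a-glance report – the log format, field names and alert threshold here are invented for illustration, since real formats vary by system:
```python
import re
from collections import Counter

# Hypothetical syslog-style lines; real log formats vary by system.
LOG_LINES = [
    "Jul  7 10:01:12 srv1 sshd[311]: Failed password for admin from 203.0.113.9",
    "Jul  7 10:01:14 srv1 sshd[311]: Failed password for admin from 203.0.113.9",
    "Jul  7 10:01:15 srv1 sshd[311]: Failed password for root from 203.0.113.9",
    "Jul  7 10:02:02 srv1 sshd[312]: Accepted password for alice from 198.51.100.4",
]

FAILED = re.compile(r"Failed password for (\S+) from (\S+)")
ALERT_THRESHOLD = 3  # tune to the environment

failures_by_source = Counter()
for line in LOG_LINES:
    match = FAILED.search(line)
    if match:
        user, source = match.groups()
        failures_by_source[source] += 1

# Condense thousands of raw lines into a report an analyst can act on.
for source, count in failures_by_source.items():
    if count >= ALERT_THRESHOLD:
        print(f"ALERT: {count} failed logins from {source}")
```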
Knowing the identity of individuals who access data without authorisation is really important, but only if the information you receive is correctly organised and correlated, to avoid falsely accusing an individual of illegally accessing sensitive information. Privacy and log management is a difficult subject, but a balance needs to be struck between collecting data that will benefit the business and infringing users' rights.
Log management also needs to be part of the overall network security infrastructure to protect against “blended threats” to your organisation. The way to successfully manage log data will lie in the ability to look for user behaviour or attitude changes, plus the ability to monitor activity and report on segregation of duties, dual controls and access violations. 
Log management needs to be at the core of your company’s incident response plan – ensure the system is monitored 24/7 and you are notified about serious problems in seconds, rather than the morning after. Hackers, after all, only need a few minutes on your network to find the valuable data they want, so the speed of your response is absolutely key. The good news is we are getting the tools that are beginning to make this practical.
Tim Holman is an international board director at the Information Systems Security Association and CEO at 2-sec.

Wi-Fi enhances guest experience and profitability for Belgian theme park

With the expectation of internet access almost universal, the provision of publicly accessible Wi-Fi networks in shops, restaurants, hotels and other public spaces is nothing new.
Bobbejaanland, one of the biggest theme parks in Belgium, has tapped into this trend and deployed a wireless LAN (WLAN) infrastructure supplied by Fortinet.
Even though they exist to give visitors a fun day out, like any other business theme parks must be profitable, and profit was a key motivation in Bobbejaanland’s decision to deploy the wireless network infrastructure.
The park’s first goal was to provide reliable network connectivity to the different corporate devices installed around the park, including computers and point of sale (PoS) devices. The second goal was to offer free internet connectivity to its visitors.
However, in the digital world it is important to keep in mind there will always be a trade-off and nothing is truly free. In Bobbejaanland’s case, the trade-off is around personal data.
The concept of big data can be very helpful for a company such as Bobbejaanland – it collects a lot of data, stores it and generates more data based on what it has collected. What if you could learn how your visitors are entering the park, when they go to lunch or how long they remain in a specific area? What if you could attract them into particular shops? This data all comes from smartphones.
Niels Meeus, IT manager at Bobbejaanland, explains how the park implemented and uses its set-up based on Fortinet appliances.
Across its campus, which covers an area of 56 acres and has 40 rides, the park has deployed 30 wireless hotspots at strategic locations to cover the maximum possible area, with 90% currently covered.
These are managed and protected by a FortiGate firewall, which also serves as the wireless network management tool.
Alongside the classic firewall system, another product has been deployed: FortiPresence. This tool helps gather intelligence based on connected devices. By applying triangulation, a device's location can be estimated and plotted on a dynamic map.
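Fortinet has not published the details of FortiPresence's positioning algorithm, but the general technique is well known: convert each access point's received signal strength into an estimated distance with a path-loss model, then trilaterate. The sketch below shows one common formulation; the AP coordinates, signal readings, transmit power and path-loss exponent are all assumed values for illustration.
```python
import numpy as np

# Known access point positions (metres) and an RSSI reading (dBm) of the
# same device at each AP. All figures are illustrative assumptions, not
# FortiPresence's actual parameters.
APS = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0]])
RSSI = np.array([-55.0, -65.0, -70.0])

TX_POWER = -40.0   # assumed RSSI at 1 m from an AP
PATH_LOSS_N = 2.5  # assumed path-loss exponent for open terrain

# Log-distance path-loss model: estimated distance (m) from each AP.
dists = 10 ** ((TX_POWER - RSSI) / (10 * PATH_LOSS_N))

# Trilateration: subtract the first circle equation from the others to get
# a linear system, then solve it in a least-squares sense.
A = 2 * (APS[1:] - APS[0])
b = (dists[0] ** 2 - dists[1:] ** 2
     + np.sum(APS[1:] ** 2, axis=1) - np.sum(APS[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"Estimated device position (m): {position}")
```
With more than three APs in range, the same least-squares step simply gains rows, which is how denser deployments improve the location estimate.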
The benefits of such a system are very valuable in terms of learning how visitors flow through the park, conducting queue event analysis, driving concession sales and profiling newcomers and regular visitors.
From a management perspective, a critical aspect of the deployment is to learn about visitors’ habits to make their day out more pleasant, but also to get them to spend more money.
To capture visitor information, guests are invited to register on a captive portal to access the internet using an email address or a Facebook account, explains Meeus. During the registration process, guests must accept a disclaimer clearly stating their devices will be tracked via their MAC address. According to Meeus, only one person has so far asked for their data to be erased.
The dynamic captive portal then displays promotions depending on the guest’s location in the park, with access points (APs) installed at critical locations. For example, should a guest connect to the AP closest to a pizza stand, they may receive push notifications on available meal deals.
Visitor location tracking also helps Bobbejaanland’s management to anticipate guest needs before they become a problem. A surge of people into one area may highlight the need to re-fill drinks or vending machines – which is correlated with notifications the machines send over an internet of things (IoT) sensor network when almost empty – or herald a need for cleaning teams or even security guards to be deployed.
Under Belgian law, Bobbejaanland is considered to act as an internet service provider and must keep track of visitors' activity by logging enough data to investigate potential incidents – MAC addresses, device-associated email or Facebook accounts, and the traffic generated – although this does not include the payload, which may contain sensitive information. This data is stored in the cloud for a period of three weeks.
Besides visitors to Bobbejaanland, Fortinet created two other user profiles with access to the wireless network. These are for back-office users and applications, and mobile employees moving around the park. All three categories of users have their own virtual LAN (VLAN), service set identifier (SSID), specific encryption levels and firewalls. 
On top of this, quality of service (QoS) has been implemented to prioritise the different types of traffic. In terms of incidents or potential breaches, Meeus says the park has faced none since implementing Fortinet, although the system was able to detect and disable two rogue APs.
The success of the implementation so far is a good example of how technology can grow businesses if used in the right way. For the future, the park has plans to go further by exploring other ways to interact with guests, such as Bluetooth low-energy beacons and radio-frequency identification (RFID) technology. 
Bobbejaanland also has a downloadable mobile application to help guests find their way around, look up prices and chat with friends, and plans to enhance it with interactive games based on Bluetooth beacons, says Meeus.

Windows Server 2003 end of support: Five options to choose from

Microsoft’s withdrawal of support for Windows Server 2003 on 14 July is a deadline many IT departments have not been looking forward to.
Industry estimates indicate that upwards of a fifth of servers are still running this version of Windows Server, which has now reached the end of its life as far as Microsoft is concerned.
Organisations will have the option to pay a premium for custom support contracts, but some businesses may find that the option to migrate to a newer operating system (OS) is out of their control.
In November 2014, US-Cert issued a warning about the end of support deadline, stating: “Computers running the Windows Server 2003 operating system will continue to work after support ends. However, using unsupported software may increase the risks of viruses and other security threats. Negative consequences could include loss of confidentiality, integrity and/or availability of data, system resources and business assets.”
In a report titled Windows Server 2003 end of life: An opportunity to evaluate IT strategy, analyst company IDC warned that organisations could face problems with regulatory compliance if they remain on Windows Server 2003. 
“Failure to have a current, supported operating system raises significant concerns about an organisation’s ability to meet regulatory compliance requirements, as well as the needs of business units, partners, and customers,” the IT research firm noted in its February 2015 report.
But Windows Server 2003 is still dominant. According to CloudPhysics, which provides big data analytics for datacentres, one in five Windows Server virtual machines (VMs) runs the 2003 version, and thus will be affected by the removal of support.
And while the Windows 2003 VM share is declining, CloudPhysics estimated that, at the current rate of decline, the proportion of servers still running the unsupported OS would not reach a statistically insignificant level until the first half of 2018, three years after support ends. “This is a relatively faster decline than Windows 2000, which reached end of life in 2005 but retains a 1% share 10 years later,” the firm said.
According to CloudPhysics, because virtualisation separates server hardware from the OS, legacy operating systems can live on for much longer, as they are able to run on newer servers.
In a blog post, Krishna Raj Raja, a founding member of CloudPhysics, noted that prior to virtualisation a server refresh generally required an OS refresh. “Newer hardware typically has limited or no support for legacy operating systems, so upgrading the OS became a necessity. With virtualisation, however, the hardware and the OS are decoupled, and therefore OS upgrades are not a necessity,” said Raj Raja.
Given that VMware announced support for 64-bit operating systems in 2004, and vSphere supports both 32-bit and 64-bit operating systems simultaneously, there is no need to choose one over the other, according to Raj Raja, with a legacy 32-bit OS (and even 16-bit OS) able to continue to co-exist with newer 64-bit operating systems.
“VMware’s support for legacy operating systems is excellent. It is possible to run a legacy OS such as Windows NT on modern processors that Windows NT natively wouldn’t even recognise. Also, the virtual devices in VMs provide encapsulation and prevent device driver compatibility issues,” said Raj Raja.
Dell Software president John Swainson said some organisations are upgrading to Windows Server 2008 as it is less disruptive than going to Microsoft’s newest version, Windows Server 2012 R2. 
In a recent interview with Computer Weekly, he said he had seen a number of organisations simply migrate to Windows Server 2008, as it is still a supported operating system and does not require the major application reworking associated with shifting the whole Windows Server infrastructure onto Windows Server 2012.
“Moving to Windows 2012 requires changing applications, and is a far more expensive upgrade from Windows Server 2003,” he said.
In the Gartner paper Managing the risks of running Windows Server 2003 after July 2015, one of the suggestions analyst Carl Claunch made for those systems that cannot be moved is to run a demilitarised zone (DMZ). 
“The concept of a demilitarised zone has been frequently used to isolate systems that are accessible by outsiders, to minimise what they could do to the rest of the datacentre if they become compromised. Further, much tighter control can be placed on which other systems they are permitted to contact and the types of access allowed,” he wrote.
“This may reduce the usability of a system, but it may be better than the alternative of losing all use if a new vulnerability becomes known. The nature of the vulnerability and the usefulness of the system in that case will help decide whether a DMZ may be sufficient to address risks.”
Could Linux be a viable option? Red Hat argues that since organisations moving to Windows Server 2012 would incur considerable costs, assessing the viability of running workloads on Linux should not be discarded. 
“If your organisation is running Windows Server 2003, now is the time to carefully consider Linux. If you upgrade to a new Windows infrastructure, 2008 or 2012, you’ll incur significant expenses associated with additional licences, client access licences, software licences, migration and future maintenance,” claimed Red Hat in its Migrating from Windows to Red Hat Enterprise Linux executive brief.
The cloud is another option. Why run a file server on-premise if a cloud service such as Box can be used instead? Application servers may be run more cost effectively on the public cloud.
Certainly, moving to the next supported release of Windows Server is not the only approach an IT department can take. Overall, the end of support for Windows Server 2003 represents an opportunity for CIOs to reassess their legacy Windows server applications and a chance to drop them or re-engineer them to run on a different platform.

UK universities revise computer course guidelines to boost ranks of cyber warriors

IT security certification body (ISC)2 has published accreditation criteria for teaching cyber security to more than 20,000 UK undergraduates a year from September 2015.
The UK’s first higher education learning guidelines for undergraduate computing degrees will now form part of the accreditation criteria referenced by BCS, the Chartered Institute for IT.
Bill Mitchell, director of education at BCS, said the guidelines will provide additional direction on cyber security elements to complement the existing information security criteria for computing-related degrees accredited by the BCS.
(ISC)2, the Council of Professors and Heads of Computing (CPHC), the UK government and 30 UK universities developed the course guidelines over the past two years.
The guidelines will transform UK computing degree courses by ensuring that cyber security is taught in almost every computing degree at 100 universities across the UK.
The new-look degree courses are a key initiative in the government’s National Cyber Security Strategy to address the growing cyber security skills shortage.
Cabinet Office minister Matthew Hancock said the UK can maintain its world-class cyber security sector only if there are enough skilled professionals.
“Initiatives such as this are excellent examples of encouraging the best young people to consider careers in cyber security,” Hancock said.
The guidelines will help universities embed and enhance relevant cyber security principles, concepts and learning outcomes in their curricula at all levels.
This aims to ensure students are taught a broad spectrum of cyber security concepts, from threats and attacks to good governance and designing secure systems and products based on up-to-date industry expertise.
The guidelines also include core concepts such as information and risk, security architecture and operations, and cyber security management.
The UK university guidelines seek to bring computing degrees into closer alignment with industry requirements. It is hoped universities will provide graduates with the understanding and knowledge of cyber security necessary to build the IT infrastructure that supports the UK economy.
“This marks a significant shift in teaching security in higher education; cyber security is now recognised as integral to every relevant computing discipline, from computer game development to network engineering,” said Carsten Maple, professor of cyber systems engineering at the University of Warwick and vice-chair of the CPHC.
“Previously, cyber security was treated as a separate discipline to computing, with students taught how to create applications or develop systems and technology – but not how to secure them – leading to a proliferation of systems with built-in vulnerabilities,” he said.
Maple said the guidelines provide a practical and accessible way of incorporating cyber security into university curricula and moving the discipline forward.
The UK has long been affected by both a cyber security talent shortage and a mismatch between the capabilities of computing graduates and the requirements of industry, said Adrian Davis, European managing director of (ISC)2.
“These compounding issues have ultimately been compromising our ability to both build and defend the digital economy and UK plc,” he said.
The UK is now among the first nations in the world to ensure cyber security will be embedded throughout every relevant computing degree, said Davis.
“Crucially, the most up-to-date skills will be taught as the framework is built and maintained with the input of front-line information and cyber security professionals. UK graduates entering the workforce will be able to immediately put their skills to use,” he said.
(ISC)2’s latest Global Information Security Workforce Survey found 62% of UK organisations have too few cyber security workers, and 20% of UK respondents admitted it would take them more than eight days to rectify a security breach.
The survey forecasts a global shortfall of 1.5 million information security professionals by 2020, which means organisations will increasingly struggle to manage threats and avoid errors, and will take longer to recover from cyber attacks.

Most VPNs leak user details, study shows

Most virtual private network (VPN) services used by hundreds of thousands of people to protect their identity online are vulnerable to leaks, a study has revealed.
VPNs are used by around 20% of European internet users to encrypt communications to circumvent censorship, avoid mass surveillance and access geographically limited services, such as BBC iPlayer.
But a study of 14 popular VPN providers found that 11 of them leaked information about the user because of a vulnerability known as IPv6 leakage, according to researchers at Queen Mary University of London (QMUL).
The leaked information ranged from the websites a user is accessing to the actual content of user communications, such as comments posted on forums. However, interactions with websites running HTTPS encryption, which includes financial transactions, were not subject to leaks.
The leakage occurs because network operators are increasingly deploying IPv6, a new version of the protocol used to run the internet that is set to replace IPv4 – but many VPNs currently protect only users' IPv4 traffic.
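A rough way to see whether a machine has an IPv6 path that could bypass the tunnel – this is a heuristic sketch, not the QMUL researchers' methodology – is to ask the operating system which local address it would use to reach a public IPv4 and a public IPv6 host. The resolver addresses below are well-known public DNS servers used purely as routing targets.
```python
import socket

# connect() on a UDP socket selects a route and local address but sends
# no packets, so this probe generates no traffic.
PROBES = {
    "IPv4": (socket.AF_INET, ("8.8.8.8", 53)),
    "IPv6": (socket.AF_INET6, ("2001:4860:4860::8888", 53)),
}

for label, (family, target) in PROBES.items():
    try:
        sock = socket.socket(family, socket.SOCK_DGRAM)
        sock.connect(target)
        local_addr = sock.getsockname()[0]
        sock.close()
        print(f"{label}: outbound traffic would leave from {local_addr}")
    except OSError:
        print(f"{label}: no route available")
```
If the IPv4 probe reports the VPN's tunnel address while the IPv6 probe reports a global address on the physical interface, IPv6 traffic is likely travelling outside the VPN.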
For the study, the researchers connected various devices to a Wi-Fi access point, which was designed to mimic the attacks hackers might use.
Researchers attempted two of the kinds of attacks that might be used to gather user data. One was passive monitoring, which simply collects unencrypted information that has passed through the access point. The other was DNS hijacking, which redirects browsers to a controlled web server by pretending to be commonly visited websites, such as Google and Facebook.
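The DNS hijacking check can be approximated by comparing what the system's default resolver returns with what an independent public resolver returns; a disagreement is only a hint, not proof, since content delivery networks legitimately vary their answers. This sketch assumes the third-party dnspython package (version 2.0 or later); the domain and resolver address are arbitrary choices.
```python
import socket

import dns.resolver  # third-party dnspython package (>= 2.0 assumed)

DOMAIN = "www.example.com"

# Addresses from the system's default resolver (possibly the hostile
# access point's DNS server in the attack scenario described above).
default_ips = {info[4][0]
               for info in socket.getaddrinfo(DOMAIN, 443, socket.AF_INET)}

# Addresses from an independent, well-known public resolver.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["1.1.1.1"]
trusted_ips = {rr.address for rr in resolver.resolve(DOMAIN, "A")}

if default_ips & trusted_ips:
    print("Default resolver agrees with the trusted resolver.")
else:
    # CDNs can cause benign mismatches, so treat this as a warning only.
    print(f"Mismatch: {default_ips} vs {trusted_ips} - possible DNS hijack")
```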
The study also examined the security of various mobile platforms when using VPNs and found that devices were much more secure when using Apple's iOS, but were still vulnerable to leakage when using Google's Android.
“There are a variety of reasons why someone might want to hide their identity online. It is worrying they might be vulnerable despite using a service specifically designed to protect them,” said Gareth Tyson, a lecturer from QMUL and co-author of the study.
“We are most concerned for those people trying to protect their browsing from oppressive regimes. They could be emboldened by their supposed anonymity, while actually they're revealing all their data and online activity and exposing themselves to possible repercussions,” he said.
The paper A Glance through the VPN Looking Glass: IPv6 Leakage and DNS Hijacking in Commercial VPN clients will be presented at the Privacy Enhancing Technologies Symposium in Philadelphia on 30 June 2015.
In August 2011, Computer Weekly quoted James Lyne, currently the research chief at security firm Sophos, as saying criminals were already capitalising on the fact that few people are filtering IPv6 traffic or even know how to.
In the transition period, Lyne advised businesses to turn off IPv6 until they are thoroughly prepared for the security implications of the new protocol and have updated all the security filters and controls in their networks. Only switch IPv6 on, he said, once the controls are in place.
There is no instant switch to the new protocol, said Lyne, so partial adoption means using tunnelling technologies to transport IPv6 over IPv4, and this kind of workaround is another potential source of confusion, misconfiguration and security gaps.
It is important businesses understand if their web security solution can rate and analyse IPv6 content, he said, because without that ability, users will be vulnerable to attacks.