Friday, July 31, 2009

Globus Toolkit, Version 4.0 (GT4) Released to Open Source Grid Development Community

Leading Grid Vendors, Standards Bodies and Development Community Call GT4 the Most 'Enterprise Ready' Version To Date


CHICAGO, May 2 /PRNewswire/ -- The Globus Consortium today announced the release of Globus Toolkit, version 4.0 (GT4), developed by the Globus Alliance. The Globus Toolkit is an open standards building block for enterprise-level Grid implementations.

GT4 is the most stable, "enterprise ready" version of the Globus Toolkit ever -- incorporating the latest web services standards, new security and authorization features, and the collaborative efforts of a global community of open source Grid developers. GT4 can be downloaded at: http://www.globustoolkit.org/.

"Interoperability, flexibility and the freedom to choose the best vendor products and equipment is what enterprise Grid is all about," said Ian Foster, Board Member with the Globus Consortium. "The leading enterprise Grid vendors and standards bodies are standing behind GT4 as the preferred open source software for enterprise Grids. By building Grids with the Globus Toolkit, and by working with vendors who support the Globus Toolkit -- organizations can best position themselves to exploit the full potential of enterprise Grid."

For nearly a decade, a global community of Grid developers has contributed to the Globus Toolkit code, and this latest GT4 release includes all of the necessary tools for building an enterprise Grid. Key additions to GT4 include:

-- GT4 complies with the latest Web Services Interoperability Organization (WS-I) web services standards, providing maximum interoperability between different environments.

-- GT4 includes initial support for important authorization standards, including the Security Assertion Markup Language (SAML) and the eXtensible Access Control Markup Language (XACML). These give businesses a foundation for building a secure, web-services-enabled Grid infrastructure (a conceptual sketch of this attribute-based authorization flow follows the list).

-- GT4 implements the Web Services Resource Framework (WS-RF) and Web Services Notification (WS-N) specifications, which are emerging standards in OASIS backed by major vendors for web services enablement of Grid and resource management systems.

-- GT4 features sophisticated authorization and security capabilities. Globus has always been diligent about Grid security, and GT4 is "enterprise ready" from a security perspective as well.
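
To make the authorization item above more concrete, here is a small conceptual sketch in Python of the attribute-based flow that SAML and XACML standardise: an assertion carries a subject's attributes, and a policy decision point evaluates them against rules. It deliberately does not use the GT4 APIs, and every class, attribute and resource name is hypothetical.

```python
# Conceptual sketch only -- NOT the GT4 authorization API. It models, in plain
# Python, the attribute-based pattern behind SAML and XACML: a SAML-style
# assertion carries subject attributes, and an XACML-style policy decision
# point (PDP) evaluates them against policy rules.

from dataclasses import dataclass, field

@dataclass
class Assertion:                      # stand-in for a SAML attribute assertion
    subject: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Rule:                           # stand-in for an XACML rule
    resource: str
    action: str
    required_attrs: dict              # attributes the subject must present
    effect: str = "Permit"

class PolicyDecisionPoint:
    def __init__(self, rules):
        self.rules = rules

    def decide(self, assertion, resource, action):
        """Return 'Permit' if any rule matches the request, else 'Deny'."""
        for rule in self.rules:
            if (rule.resource == resource and rule.action == action and
                    all(assertion.attributes.get(k) == v
                        for k, v in rule.required_attrs.items())):
                return rule.effect
        return "Deny"

if __name__ == "__main__":
    pdp = PolicyDecisionPoint([
        Rule(resource="compute-cluster", action="submit-job",
             required_attrs={"role": "researcher", "vo": "physics"}),
    ])
    user = Assertion("CN=Alice", {"role": "researcher", "vo": "physics"})
    print(pdp.decide(user, "compute-cluster", "submit-job"))  # -> Permit
```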

"For nearly a decade, major vendors and standards bodies -- including the Global Grid Forum (GGF) -- have contributed to the open source Globus Toolkit," said Mark Linesch, Chair of the GGF. "The Globus Toolkit has seen terrific success in research, academic and commercial high-performance computing environments. By continuing to align with the latest grid and web services standards, GT4 is poised for broader adoption -- particularly in enterprise markets where efficient resource sharing and more effective data integration are becoming increasingly critical."

North Carolina-based MCNC, which tests and deploys advanced networking solutions on its North Carolina Research and Education Network (NCREN) in partnership with North Carolina universities and state government, recently performed successful "testbed" work on GT4 -- with systems set up across the OC48 NCREN backbone. Vendor participants included: Cisco, Gridwise Technologies, IBM, Network Appliance, Red Hat and Sun.

"GT4 brings in all the standards and interfaces," said Wolfgang Gentzsch, the Managing Director at MCNC who oversaw the GT4 testbed efforts. "GT4 is much more flexible -- it brings in the ability to easily enhance Grid towards additional services, like accounting and billing, metering and measuring. Now that a larger part of GT4 is based on web services -- it is much easier to interface and communicate with other tools which are based on the same web services standards. As a result, more and more commercial tools will be compliant in the near future with Grid services."

Also standing behind the Globus Toolkit are the Globus Consortium Members -- HP, IBM, Intel, Nortel, Sun and Univa -- all of which have enterprise Grid products and services built on or around the Globus Toolkit. For example, IBM has offered its IBM Grid Toolbox based on the Globus Toolkit, and later this fall will announce a version based on GT4. And Grid Engine, the open source project sponsored by Sun, has been integrated into GT4 for a project at Imperial College in London. MCNC and Gridwise Technologies have also contributed to this effort.

"Globus Toolkit 4.0 allows enterprise users to bring standardized web services into a Grid environment, further simplifying the automated allocation of resources available on a Grid," said Greg Astfalk, Chief Scientist, Office of Corporate Strategy and Technology at HP. "HP sees Grid as a powerful way to virtualize and manage resources, enabling companies to respond to changing business needs and realize HP's vision of becoming Adaptive Enterprises."

"Standards are key to accelerating adoption of Grid Computing in the commercial marketplace," said Ken King, IBM's Vice President of Grid Computing. "A key driver of grid standards is the successful implementation and acceptance of Globus with enterprise customers. The new Globus Toolkit 4.0 adds more robust web services capabilities, enhanced security and powerful authorization features that we believe will be very compelling to our customers as they look at Grid as a way to simplify their infrastructure."

"Sun is actively engaged in bringing the productivity gains of grid computing into the enterprise," said Sohrab Modi, vice president, N1 Grid Systems, Sun Microsystems. "As a major supporter of open standards and open source software, Sun is pleased to see the arrival of Globus Toolkit 4 as another key tool toward achieving widespread use of grid technologies in the enterprise."

"Enterprises are increasingly challenged by constricting IT budgets, yet frustrated with proprietary, inflexible systems that remain under-utilized," said Steve Tuecke, CEO of Univa Corporation, a provider of commercial software, technical support and professional services for the Globus Toolkit. "Since the previous release of the Globus Toolkit, we have witnessed the emergence of many Web Services standards that are relevant for Grid infrastructure. Expanded support for these standards throughout GT4 enables enterprises to more easily integrate existing IT systems with Globus in order to optimize the use of existing computing, storage and networking resources."

About the Globus Consortium

The Globus Consortium is the world's leading organization championing open source Grid technologies in the enterprise. With the support of industry leaders IBM, Intel, HP, and Sun Microsystems, the Globus Consortium draws together the vast resources of IT industry vendors, enterprise IT groups, and a vital open source developer community to advance use of the Globus Toolkit in the enterprise. The Globus Toolkit is the de facto standard for Grid infrastructure, enabling IT managers to view all of their distributed computing resources around the world as a unified virtual datacenter. By giving enterprises access to computing resources as they need them, IT costs can rise and fall with business demand. An open Grid infrastructure is the prerequisite to fulfilling the promise of utility computing.

Contact: Travis Van, Page One Public Relations, 650.565.9800 x103, travis@pageonepr.com

Globus Consortium


from: http://au.sys-con.com/node/81918

Wednesday, July 29, 2009

Grid computing to be applied to banks' investment banking systems

Woori Bank aims for cost savings and smoother risk management and analysis work

Woori Bank is drawing attention for its decision to adopt a grid computing system to strengthen its investment banking operations.

According to Woori Bank on the 28th, the bank will conduct a BMT (benchmark test) for a grid computing solution intended to speed up the large volume of calculations and simulations, such as option pricing and risk indices, involved in running equity and general derivative products.

The adoption of a grid computing environment is proceeding in step with the recent strengthening of investment banking (IB) operations across the financial sector, and is expected to influence the industry as a whole.

Earlier in the financial sector, Shinhan Bank introduced grid computing when it launched its next-generation system.

At the time, however, the concept of grid computing was aimed simply at implementing a non-stop computing system; it was not adopted for full-scale analytical work.

A Woori Bank official explained, "The trading department is currently rebuilding its entire system. The IB business and related areas are being redeveloped as a whole, and I understand grid computing is being reviewed as part of that effort."

An investment banking environment involves a wide range of simulations and analyses: risk measurement for a variety of derivative products, market modeling, profit analysis under different economic conditions, and more.

Simulations have so far been handled on dedicated hardware such as separate servers and workstations, but as the financial environment grows more complex and the number of indicators to analyze increases, the financial industry's view is that existing systems have reached their limits.
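
To give a sense of the workload described above, the Python sketch below prices a European call option by Monte Carlo simulation, split into independent batches that can run in parallel. Here the batches run on local processes via the standard multiprocessing module; grid middleware would schedule the same kind of batches across many machines. All market parameters and batch sizes are purely illustrative and have no connection to Woori Bank's actual systems.

```python
# Illustrative only: Monte Carlo pricing of a European call option under
# geometric Brownian motion, split into independent batches. A grid would
# distribute these batches across machines; here they run on local processes.

import math
import random
from multiprocessing import Pool

def price_batch(args):
    """Average discounted payoff over one batch of simulated price paths."""
    n_paths, spot, strike, rate, vol, maturity, seed = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        terminal = spot * math.exp((rate - 0.5 * vol ** 2) * maturity
                                   + vol * math.sqrt(maturity) * z)
        total += max(terminal - strike, 0.0)
    return math.exp(-rate * maturity) * total / n_paths

if __name__ == "__main__":
    # 8 independent work units; spot 100, strike 105, 3% rate, 25% vol, 1 year
    batches = [(100_000, 100.0, 105.0, 0.03, 0.25, 1.0, seed)
               for seed in range(8)]
    with Pool() as pool:
        estimates = pool.map(price_batch, batches)
    print("Estimated option price:", sum(estimates) / len(estimates))
```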

In particular, given that cost reduction has become a watchword in the financial sector, industry observers agree that banks are reluctant to simply keep adding hardware.

Grid computing is being put forward as the way to resolve this.

The financial sector already sees a number of advantages in adopting grid computing.

First, it can deliver both cost savings and efficiency. Developing financial products that draw on financial engineering, such as derivatives, depends heavily on complex financial applications.

Until now, however, these financial applications have been run on separate servers and equipment, making it hard to achieve efficiency in areas such as response time.

By adopting grid computing, the performance and efficiency of these financial applications can be improved.

Overseas, grid computing is already being adopted rapidly in the financial sector.

According to industry sources, the International Monetary Fund (IMF) has also recently decided to actively adopt high-performance grid computing to improve the reliability of its international financial risk analysis.

An official at a grid computing vendor said, "Risk measurement and analysis are becoming more important in the financial sector, and this kind of analysis has only intensified since the financial crisis. Grid computing technology, which makes analysis more efficient by using existing resources, is once again in the spotlight."

Industry sources also say that interest in grid computing solutions is growing in the non-bank financial sector, which has to handle a wide range of work including derivatives.

An official at a grid computing company explained, "Grid computing in the financial sector briefly became a hot topic back in 2006 but then naturally faded. Combined with the recent focus on cost reduction, it now has another chance to grow."

Reporter Lee Sang-il, 2401@ddaily.co.kr

Tuesday, July 28, 2009

Cloud Computing Research Association provides computing resources to KAIST

On the 28th, the Korea Cloud Computing Research Association (chairman Han Jae-sun), together with NexR and KAIST, will open the country's first cloud computing research and development (R&D) testbed providing computing resources to universities.

The testbed was built at KAIST in Daejeon as part of the CCI:U (Cloud Computing Initiative for Universities) project, which provides the computing resources needed for university classes and research free of charge in the form of a cloud. The association will hold a CCI:U project presentation and testbed opening event at KAIST in Daejeon on the 28th.

The testbed consists of hardware on the scale of 600 CPU cores, 1 terabyte of memory, and 300 terabytes of storage disk, together with software including NexR's cloud platform 'iCube Cloud' and the open-source platform technology 'Hadoop'. It also provides 'CourseLab', an online class and lab management service that helps teaching assistants and students use cloud computing easily.
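
Since the testbed's software stack includes Hadoop, here is a minimal sketch of the kind of batch job students could run on it: a word-count mapper and reducer written in Python for Hadoop Streaming. The scripts and any file names are illustrative and are not part of the CCI:U curriculum; the pair would be submitted through the Hadoop Streaming jar that ships with Hadoop, with the exact command line depending on the installation.

```python
# mapper.py -- Hadoop Streaming mapper: emits "word<TAB>1" for every token.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print(f"{word}\t1")
```

```python
# reducer.py -- sums the counts per word; Hadoop Streaming delivers the
# mapper output to the reducer sorted by key, so equal words are adjacent.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t")
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print(f"{current_word}\t{current_count}")
        current_word, current_count = word, int(count)
if current_word is not None:
    print(f"{current_word}\t{current_count}")
```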

Beyond providing computing resources, the association also plans to support the development of curricula and lab courses for classes as part of the CCI:U project. It is discussing this with KAIST, POSTECH, and Korea University, and expects the resources could be used in classes as early as the fall semester this September.

Association chairman Han Jae-sun emphasized, "Cloud computing in Korea is being driven by industry and treated only from an economic perspective. For Korea to gain the upper hand in the cloud computing industry over the long term, strengthening universities' capabilities is essential."

from: http://www.etnews.co.kr/news/detail.html?id=200907270121

Monday, July 27, 2009

Cloud computing more secure than traditional IT, says Google

Cloud computing can provide higher levels of security than most in-house IT, says Google.

Most businesses do not have the security intelligence gathering capabilities and resources to match Google's, said the firm's enterprise security director Eran Feigenbaum.

"Cloud computing can be as secure, if not more secure, than what most organisations do today in the traditional environment," said US-based Feigenbaum on a visit to London.

Security is often cited as a concern by businesses considering cloud computing, but the model eliminates the opportunity for most common causes of data breaches, he said.

"Data is typically lost when laptops and USB memory sticks are lost or stolen, but local storage is no longer necessary if a company uses cloud-based apps," said Feigenbaum.

One of the selling points for the cloud computing model is that users are able to access documents from anywhere at any time over the internet.

"Statistics show that 66% of USB sticks are lost and around 60% of those lost contain commercial data," said Feigenbaum.

Security patching is a common problem that can be eliminated by cloud computing.

"Research shows most organisations take between 25 and 60 days to deploy security patches, but CIOs admit it can take up to six months," said Feigenbaum.

Cloud computing service providers like Google claim that the model frees company IT administrators of this task, improving security for many organisations.

Google is able to patch systems rapidly and efficiently because it has a homogeneous IT environment across the organisation, unlike most other businesses, said Feigenbaum.

The rapidly increasing number and sophistication of cyber threats is another area of security that most organisations are ill-equipped to deal with.

"Google is able to gather security intelligence from billions of transactions a day and apply that intelligence in real time throughout the organisation," he said.

"A lesson learned on Google.com is a lesson learned on Google Apps".

According to Feigenbaum, enterprises will move to cloud computing just as people started using banks because they had better security resources.

"This change in mindset will move businesses from datacentres to cloud computing services that have the expertise and systems to protect their data," he said.

Alluding to a problem with Google Docs that caused users to share documents inadvertently earlier this year, Feigenbaum said the model once again proved itself.

"Only 0.05% of users were affected and we were able to fix the problem very quickly. In a traditional environment, more users would have been affected and it would have taken longer to resolve," he said.

Feigenbaum emphasised the importance of security and privacy to Google, saying these issues are "paramount" to the company's success.

For this reason, Google has 24-hour security monitoring of systems, distributes data throughout the organisation so it is not humanly readable, and conducts regular third-party security audits.

The company also has several processes in place aimed at minimising insider threats to security.

Google enforces role-based and least-privilege employee access to systems, provides security training for all employees and requires them to sign up to a strict code of conduct.
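
As a small illustration of the role-based, least-privilege pattern mentioned above, the Python sketch below grants only the permissions explicitly assigned to a role and denies everything else by default. The roles and permissions are invented for the example and do not describe Google's internal systems.

```python
# Illustrative role-based, least-privilege check: deny by default, grant only
# what a role has been explicitly given. Role and permission names are made up.

ROLE_PERMISSIONS = {
    "support-engineer": {"tickets:read"},
    "sre":              {"tickets:read", "systems:restart"},
    "auditor":          {"logs:read"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant only permissions explicitly assigned to the role."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("sre", "systems:restart")
assert not is_allowed("support-engineer", "systems:restart")   # least privilege
assert not is_allowed("unknown-role", "logs:read")              # deny by default
```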

Top five cloud computing security issues

In the last few years, cloud computing has grown from being a promising business concept to one of the fastest growing segments of the IT industry. Now, recession-hit companies are increasingly realising that simply by tapping into the cloud they can gain fast access to best-of-breed business applications or drastically boost their infrastructure resources, all at negligible cost.

But as more and more information on individuals and companies is placed in the cloud, concerns are beginning to grow about just how safe an environment it is.

1. Every breached security system was once thought infallible
2. Understand the risks of cloud computing
3. How cloud hosting companies have approached security
4. Local law and jurisdiction where data is held
5. Best practice for companies in the cloud

Every breached security system was once thought infallible

SaaS (software as a service) and PaaS (platform as a service) providers all trumpet the robustness of their systems, often claiming that security in the cloud is tighter than in most enterprises. But the simple fact is that every security system that has ever been breached was once thought infallible.

Google was forced to make an embarrassing apology in February when its Gmail service collapsed in Europe, while Salesforce.com is still smarting from a phishing attack in 2007 which duped a staff member into revealing passwords.

While cloud service providers face similar security issues as other sorts of organisations, analysts warn that the cloud is becoming particularly attractive to cyber crooks.

"The richer the pot of data, the more cloud service providers need to do to protect it," says IDC research analyst David Bradshaw.

Understand the risks of cloud computing

Cloud service users need to be vigilant in understanding the risks of data breaches in this new environment.

"At the heart of cloud infrastructure is this idea of multi-tenancy and decoupling between specific hardware resources and applications," explains Datamonitor senior analyst Vuk Trifković. "In the jungle of multi-tenant data, you need to trust the cloud provider that your information will not be exposed."

For their part, companies need to be vigilant, for instance about how passwords are assigned, protected and changed. Cloud service providers typically work with a number of third parties, and customers are advised to find out about those companies which could potentially access their data.

IDC's Bradshaw says an important measure of security often overlooked by companies is how much downtime a cloud service provider experiences. He recommends that companies ask to see service providers' reliability reports to determine whether these meet the requirements of the business. Exception monitoring systems are another important area which companies should ask their service providers about, he adds.

London-based financial transaction specialist SmartStream Technologies made its foray into the cloud services space last month with a new SaaS product aimed at providing smaller banks and other financial institutions with a cheap means of reconciling transactions. Product manager Darryl Twiggs says that the service has attracted a good deal of interest amongst small to mid-tier banks, but that some top-tier players are also being attracted by the potential cost savings.

An important consideration for cloud service customers, especially those responsible for highly sensitive data, Twiggs says, is to find out about the hosting company used by the provider and if possible seek an independent audit of their security status.

"Customers we engage with haven't been as stringent as we thought they would have been with this".

How cloud hosting companies have approached security

As with most SaaS offerings, the applications forming SmartClear's offering are constantly being tweaked and revised, a fact which raises more security issues for customers. Companies need to know, for instance, whether a software change might actually alter its security settings.

"For every update we review the security requirements for every user in the system," Twiggs says.

One of the world's largest technology companies, Google, has invested a lot of money into the cloud space, where it recognises that having a reputation for security is a key determinant of success. "Security is built into the DNA of our products," says a company spokesperson. "Google practices a defense-in-depth security strategy, by architecting security into our people, process and technologies".

However, according to Datamonitor's Trifković, the cloud is still very much a new frontier with very little in the way of specific standards for security or data privacy. In many ways he says that cloud computing is in a similar position to where the recording industry found itself when it was trying to combat peer-to-peer file sharing with copyright laws created in the age of analogue.

"In terms of legislation, at the moment there's nothing that grabs my attention that is specifically built for cloud computing," he says. "As is frequently the case with disruptive technologies, the law lags behind the technology development for cloud computing."

What's more, many are concerned that cloud computing remains at such an embryonic stage that the imposition of strict standards could do more harm than good.

IBM, Cisco, SAP, EMC and several other leading technology companies announced in late March that they had created an 'Open Cloud Manifesto' calling for more consistent security and monitoring of cloud services.

But the fact that neither Amazon.com, Google nor Salesforce.com agreed to take part suggests that broad industry consensus may be some way off. Microsoft also abstained, charging that IBM was forcing its agenda.

"Standards by definition are restrictive. Consequently, people are questioning whether cloud computing can benefit from standardisation at this stage of market development." says Trifković. "There is a slight reluctance on the part of cloud providers to create standards before the market landscape is fully formed."

Until it is, there are nevertheless a handful of existing standards which companies in the cloud should know about. Chief among these is ISO 27001, which is designed to provide the foundations for third-party audit and implements OECD principles governing the security of information and network systems. The SAS 70 auditing standard is also used by cloud service providers.

Local law and jurisdiction where data is held

Possibly even more pressing an issue than standards in this new frontier is the emerging question of jurisdiction. Data that might be secure in one country may not be secure in another. In many cases though, users of cloud services don't know where their information is held. Currently in the process of trying to harmonise the data laws of its member states, the EU favours very strict protection of privacy, while in America laws such as the US Patriot Act invest government and other agencies with virtually limitless powers to access information including that belonging to companies.

UK-based electronics distributor ACAL is using NetSuite OneWorld for its CRM. Simon Rush, IT manager at ACAL, needed to ensure that ACAL would have immediate access to all of its data should its contract with NetSuite be terminated for any reason, so that the information could be quickly relocated. Part of this included knowing in which jurisdiction the data is held. "We had to make sure that, as a company, our data was correctly and legally held."

European concerns about US privacy laws led to the creation of the US Safe Harbor Privacy Principles, which are intended to provide European companies with a degree of insulation from US laws. James Blake from e-mail management SaaS provider Mimecast suspects that these powers are being abused. "Counter-terrorism legislation is increasingly being used to gain access to data for other reasons," he warns.

Mimecast provides a comprehensive e-mail management service in the cloud for over 25,000 customers, including 40% of the top legal firms in the UK.

Customers benefit from advanced encryption that only they are able to decode, ensuring that Mimecast acts only as the custodian, rather than the controller of the data, offering companies concerned about privacy another layer of protection. Mimecast also gives customers the option of having their data stored in different jurisdictions.
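
The "custodian rather than controller" arrangement can be illustrated with a short Python sketch using the third-party cryptography package: the customer encrypts data with a key it never shares, so the provider only ever stores ciphertext it cannot read. This is a generic illustration of client-held-key encryption, not Mimecast's actual scheme.

```python
# Generic client-held-key encryption sketch (not Mimecast's implementation).
# Requires the third-party package:  pip install cryptography

from cryptography.fernet import Fernet

customer_key = Fernet.generate_key()    # kept by the customer, never uploaded
fernet = Fernet(customer_key)

ciphertext = fernet.encrypt(b"confidential e-mail archive entry")
# ...only `ciphertext` is handed to the cloud provider for storage...

restored = fernet.decrypt(ciphertext)   # only the key holder can do this
assert restored == b"confidential e-mail archive entry"
```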

For John Tyreman, IT manager for outsourced business services provider Liberata, flexibility over jurisdiction was a key factor in choosing Mimecast to help the company meet its obligations to store and manage e-mails from 2,500 or so staff spread across 20 countries. The company is one of the UK's leading outsourcing providers for the public sector, life, pensions and investments, and corporate pensions. "Storing our data in the US would have been a major concern," Tyreman says.

Best practice for companies in the cloud

Inquire about exception monitoring systems.
Be vigilant about updates, making sure that staff don't suddenly gain access privileges they're not supposed to have.
Ask where the data is kept, and inquire as to the details of data protection laws in the relevant jurisdictions.
Seek an independent security audit of the host.
Find out which third parties the company deals with and whether they are able to access your data.
Develop good policies around passwords: how they are created, protected and changed (see the sketch after this list).
Look into availability guarantees and penalties.
Find out whether the cloud provider will accommodate your own security policies.
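
On the password item in the checklist above, the Python sketch below shows the "protected" part: store only a salted, iterated hash produced with the standard library's hashlib.pbkdf2_hmac and compare it in constant time. The iteration count and salt size are illustrative, not a recommendation.

```python
# Illustrative password storage: salted, iterated hashing with PBKDF2-HMAC.
# Parameters (iterations, salt length) are examples, not a recommendation.

import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Return (salt, iterations, digest); store all three, never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

salt, iters, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, iters, stored)
assert not verify_password("wrong guess", salt, iters, stored)
```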


from: http://www.computerweekly.com/Articles/2009/04/24/235782/top-five-cloud-computing-security-issues.htm