Friday 30 December 2011

INTERVIEW QUESTIONS

[Don't forget to have a look at free bonus at the end of this document.]

Review these typical interview questions and think about how you would answer them. Along with each question you will also find some suggestions on strategy.

1. Tell me about yourself:
The most often asked question in interviews. You need to have a short statement prepared
in your mind. Be careful that it does not sound rehearsed. Limit it to work-related items
unless instructed otherwise. Talk about things you have done and jobs you have held that
relate to the position you are interviewing for. Start with the item farthest back and work up
to the present.

2. Why did you leave your last job?
Stay positive regardless of the circumstances. Never refer to a major problem with
management and never speak ill of supervisors, co-workers or the organization. If you do,
you will be the one looking bad. Keep smiling and talk about leaving for a positive reason
such as an opportunity, a chance to do something special or other forward-looking reasons.

3. What experience do you have in this field?
Speak about specifics that relate to the position you are applying for. If you do not have
specific experience, get as close as you can.

4. Do you consider yourself successful?
You should always answer yes and briefly explain why. A good explanation is that you have
set goals, and you have met some and are on track to achieve the others.

5. What do co-workers say about you?
Be prepared with a quote or two from co-workers. Either a specific statement or a paraphrase will work: "Jill Clark, a co-worker at Smith Company, always said I was the hardest worker she had ever known." It is as powerful as Jill having said it at the interview herself.

6. What do you know about this organization?
This question is one reason to do some research on the organization before the interview.
Find out where they have been and where they are going. What are the current issues and
who are the major players?

7. What have you done to improve your knowledge in the last year?
Try to include improvement activities that relate to the job. A wide variety of activities can
be mentioned as positive self-improvement. Have some good ones handy to mention.

8. Are you applying for other jobs?
Be honest but do not spend a lot of time in this area. Keep the focus on this job and what
you can do for this organization. Anything else is a distraction.

9. Why do you want to work for this organization?
This may take some thought and certainly, should be based on the research you have done
on the organization. Sincerity is extremely important here and will easily be sensed. Relate
it to your long-term career goals.

10. Do you know anyone who works for us?
Be aware of the policy on relatives working for the organization. This can affect your answer
even though they asked about friends not relatives. Be careful to mention a friend only if
they are well thought of.

11. What kind of salary do you need?
A loaded question. A nasty little game that you will probably lose if you answer first. So, do not answer it. Instead, say something like, "That's a tough question. Can you tell me the range for this position?" In most cases, the interviewer, taken off guard, will tell you. If not, say that it can depend on the details of the job. Then give a wide range.

12. Are you a team player?
You are, of course, a team player. Be sure to have examples ready. Specifics that show you
often perform for the good of the team rather than for yourself are good evidence of your
team attitude. Do not brag, just say it in a matter-of-fact tone. This is a key point.

Friday 16 December 2011

Calculate Your GPA
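
The interactive calculator from the original post does not carry over into this text. As a stand-in, here is a minimal sketch of the credit-weighted average it presumably implements, assuming a 4.0 scale; the grades and credit hours below are purely illustrative.

    public class GpaCalculator {

        /** Credit-weighted GPA: sum(gradePoints[i] * credits[i]) / sum(credits). */
        static double gpa(double[] gradePoints, double[] credits) {
            double qualityPoints = 0.0;
            double totalCredits = 0.0;
            for (int i = 0; i < gradePoints.length; i++) {
                qualityPoints += gradePoints[i] * credits[i];
                totalCredits += credits[i];
            }
            return totalCredits == 0 ? 0.0 : qualityPoints / totalCredits;
        }

        public static void main(String[] args) {
            // Illustrative grades on a 4.0 scale: A = 4.0, B+ = 3.3, B = 3.0
            double[] gradePoints = {4.0, 3.3, 3.0};
            double[] credits     = {3.0, 4.0, 3.0};
            System.out.printf("GPA: %.2f%n", gpa(gradePoints, credits)); // prints 3.42
        }
    }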


Saturday 3 December 2011

Norman debuts endpoint protection system for securing enterprise and SMB networks against malware attacks

Norman Endpoint Protection delivers cost effective, easy-to-administer threat protection for small and medium businesses (SMBs) and scales, using a new multilevel architecture, to serve even the largest, most complex enterprise networks. Security policies may be set and managed centrally across multiple sites ensuring rapid security updates and effective local control.

Norman offers proactive content security solutions and forensics malware tools. Its anti-malware solutions, including malware analysis tools, network security and endpoint protection, are powered by patented Norman SandBox technology and used by security solutions providers around the world. Norman's unified core anti-malware protection for clients, servers and network security are delivered as products and services designed to protect business communications and resources, including corporate and government networks and applications, remote employees, branch offices and extranets. Norman's solutions are available through Norman subsidiaries and a network of global partners.

"This latest version of Norman Endpoint Protection delivers the most sophisticated IT protection available," said Audun Lodemel, vice president, marketing and business development. "The advanced architecture makes it extremely easy to manage any size network, while ensuring a lower cost of ownership experience."

Norman Endpoint Protection installs in minutes and its intuitive user interface displays all clients and servers in a network. All computers are updated with the latest protection and stay secure while reporting status to the Endpoint Manager console application. The Endpoint Manager is easy to use through logical, drag-and-drop actions for group handling and security policy management.

Norman Endpoint Protection comes with Norman SandBox and DNA Matching security technologies that protect against new and unknown malware and other persistent threats, keeping endpoints safe at the lowest possible cost. Norman Endpoint Protection stops all kinds of malware threats including viruses, worms, trojans, spyware, adware, bots, zero-day threats and rootkits.

SecureForce launches enterprise security intelligence managed service

In order to battle the volatility of networks and constant barrage and evolution of attacks, SecureForce LLC announced Wednesday its Enterprise Security Intelligence (ESI) solution operating in the cloud. SecureForce's ESI service examines configurations of firewalls, routers, VPNs, and load balancers across an organization's network to discover, classify, and prioritize openings in the network infrastructure. In addition, the service enables security policy validation, identifies and prioritizes vulnerabilities for remediation, and reveals regulatory compliance to auditors.

"A commitment to security means ensuring networks are constantly being checked to validate their integrity. The challenge is doing this frequently enough to identify and resolve any vulnerability while having the confidence in the security of your network at any point in time," said Stefen Smith, CSO for SecureForce. "Clients using our Enterprise Security Intelligence service are able to consistently monitor for problems, model changes, assess risks, and quickly remediate the variety of issues that occur when managing the diverse set of layer-three devices spread across their organizations."

SecureForce's Enterprise Security Intelligence service analyzes the configuration of each layer-three device in the network, as well as how those devices interact with one another to provide security for the role in which they are being used.

SecureForce's ESI service focuses on two essential capabilities. The first, Network Accessibility Audit, delivers a best practices report to identify how the customer's network meets existing industry standards for managing the integrity of networks. The service analyzes every possible way devices communicate with each other. The resulting audit identifies untrusted devices or zones - pointing out accessibility holes in the network that require immediate attention.

The second capability of ESI is the Network Vulnerability Analysis that builds upon the intelligence generated in the Network Accessibility Audit by adding vulnerability scan data to the mix. The result is the delivery of a list of vulnerable or incorrectly configured devices, prioritized by degree of exposure and downstream risk, and a remediation plan directing clients where to take immediate corrective action.
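
SecureForce does not publish the scoring model behind this prioritization, so the following is only a rough sketch of the general idea described above: combine an exposure measure from the accessibility audit with scanner severity to rank findings for remediation. All class names, field names and numbers are hypothetical.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    public class RemediationPlanner {

        /** Hypothetical record of one finding on one device. */
        static class Finding {
            final String device;
            final double severity;        // e.g. a CVSS-like score from the vulnerability scan (0-10)
            final int reachableFromZones; // exposure measure taken from the accessibility audit
            Finding(String device, double severity, int reachableFromZones) {
                this.device = device;
                this.severity = severity;
                this.reachableFromZones = reachableFromZones;
            }
            /** Simple combined risk: severity weighted by how exposed the device is. */
            double risk() { return severity * (1 + reachableFromZones); }
        }

        public static void main(String[] args) {
            List<Finding> findings = new ArrayList<>();
            findings.add(new Finding("edge-firewall", 6.5, 4));
            findings.add(new Finding("internal-router", 9.0, 1));
            findings.add(new Finding("vpn-concentrator", 7.2, 3));

            // Sort so the most exposed, most severe issues are remediated first.
            findings.sort(Comparator.comparingDouble(Finding::risk).reversed());
            for (Finding f : findings) {
                System.out.printf("%-18s risk=%.1f%n", f.device, f.risk());
            }
        }
    }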

"Over the last decade, we've found that most organizations out there are fortunate to perform these types of systematic audits and analyses once a year. With the increasing number of devices showing up across networks, the time and labor intensity required to perform such tasks is cost prohibitive to perform on a monthly basis - and in many cases nearly impossible," said Marc Perrelli, President and CEO of SecureForce. "With our Enterprise Security Intelligence service, we now provide our clients with a degree of confidence in the security posture of their networks that they've never had. More importantly, that confidence level is not something that will fade away over time as with other audits performed annually, if ever. Our service provides the audit and analysis each month - helping organizations conserve precious budgetary dollars, while providing the ability to actively monitor and manage the vulnerability state of their network."

SecureForce's Enterprise Security Intelligence solution allows clients to make informed decisions when mitigating vulnerabilities - ultimately addressing those prioritized issues that provide the biggest impact and benefit to the organization. ESI brings a structured approach to verifying the Defense In Depth principles and using real-world data to make recommendations, eliminating the manual and lengthy process of interpreting vulnerability scanner reports.

VASCO Data Security and OneLogin jointly protect users accessing the cloud

VASCO Data Security International Inc., a software security company offering authentication products, and OneLogin, a cloud-based Identity and Access Management (IAM) provider, have announced the integration of DIGIPASS as a Service with OneLogin's single sign-on (SSO) portal. Customers can now securely access their OneLogin account with VASCO's strong authentication in an easy to use, convenient manner that is managed entirely in the cloud.

OneLogin provides IAM in the cloud with an on-demand solution consisting of single sign-on, password management, multi-factor authentication, directory integration, user provisioning and an application catalogue with over 1,800 pre-integrated applications, including AtTask, Box.net, Cornerstone OnDemand, DocuSign, Google Apps, LotusLive, Netsuite, SugarCRM, WebEx, Workday, Yammer and Zendesk. OneLogin, Inc. is backed by Charles River Ventures.

VASCO is a supplier of authentication and e-signature solutions and services specializing in Internet Security applications and transactions. VASCO has positioned itself as a global software company for Internet Security, serving a customer base of approximately 10,000 companies in more than 100 countries, including over 1,700 international financial institutions. VASCO's prime markets are the financial sector, enterprise security, e-commerce and e-government.

OneLogin offers identity and access management for enterprise Web applications. With one click, customers can access more than 1,800 cloud applications. VASCO's cloud-based authentication service, DIGIPASS as a Service, eliminates static passwords and introduces multi-factor authentication, making access to OneLogin more secure. This reduces the risk of unauthorized access to corporate data accessible through the Internet and eliminates the hassles associated with password management.

DIGIPASS as a Service is offered as a complete outsourced authentication model, in which VASCO manages the entire authentication process. The service is completely integrated with OneLogin so enterprises don't need to invest in an on-premise infrastructure to protect their cloud applications. The DIGIPASS as a Service offering includes a fully redundant hosted authentication back-end, the provisioning of DIGIPASS software, hardware or mobile authenticators to end-users, first line support and professional and fulfillment services. End-users can use the same DIGIPASS for multiple applications.
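
The article does not describe VASCO's proprietary algorithms, so as a generic illustration of how a hosted back-end can verify a one-time password generated by a hardware or mobile authenticator, here is a minimal sketch of the standard HOTP construction (RFC 4226) rather than anything DIGIPASS-specific. The shared secret is the RFC's published test key, and counter synchronization is omitted.

    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;
    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    public class HotpSketch {

        /** RFC 4226 HOTP: HMAC-SHA1 over a moving counter, truncated to n digits. */
        static int hotp(byte[] sharedSecret, long counter, int digits) throws Exception {
            Mac mac = Mac.getInstance("HmacSHA1");
            mac.init(new SecretKeySpec(sharedSecret, "HmacSHA1"));
            byte[] hash = mac.doFinal(ByteBuffer.allocate(8).putLong(counter).array());

            int offset = hash[hash.length - 1] & 0x0f;  // dynamic truncation
            int binary = ((hash[offset] & 0x7f) << 24)
                       | ((hash[offset + 1] & 0xff) << 16)
                       | ((hash[offset + 2] & 0xff) << 8)
                       | (hash[offset + 3] & 0xff);
            return binary % (int) Math.pow(10, digits);
        }

        public static void main(String[] args) throws Exception {
            byte[] secret = "12345678901234567890".getBytes(StandardCharsets.US_ASCII);
            // The authenticator and the hosted back-end both derive the same code from the
            // shared secret and counter; the server accepts the login only if they match.
            System.out.printf("OTP for counter 0: %06d%n", hotp(secret, 0, 6));
        }
    }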

DIGIPASS as a Service is sold in a subscription model, on a monthly or yearly basis. The service supports deployments starting at as few as five users and scales to an unlimited number of users.

For VASCO customers that are not ready to move fully to the cloud and have already deployed IDENTIKEY, an off-the-shelf centralized authentication server that supports the deployment, use and administration of strong user authentication with DIGIPASS, OneLogin has also integrated support via the RADIUS protocol. Strong authentication available through both VASCO's cloud-based and on-premise offerings provides a migration path to the cloud for many organizations.

"Security is the number one driver for identity and access management today," said Thomas Pedersen, chief executive officer of OneLogin. "VASCO's strong authentication products combined with OneLogin bring bank-level security to thousands of cloud-based applications in a turnkey solution that is instantly accessible to organizations of any size."

Sourcefire's Immunet anti-malware offering surpasses 2 million endpoints

Sourcefire Inc., provider of cybersecurity solutions, announced Monday that Immunet, its advanced anti-malware solution, has surpassed 2 million installed endpoints. Immunet achieved this milestone due to its "big data" approach to endpoint security, which allows it to provide an additional layer of protection not afforded by traditional anti-malware technologies.

Is Cloud Security Mere Vapor? Part Two

This article is the second in a two-part series exploring the best ways CIOs can ensure security in the cloud. This part covers the final five of ten cloud-related challenges CIOs face; the first article in the series covers the first five.

6. User Experience. Regardless of cost, the user experience has to be as good or better. For example, if you can’t surf a YouTube video from your virtual desktop, then you’re probably not going to think it’s terrific. Find strong security mechanisms that work well at the desktop but are nearly undetectable to the users. Security that is obvious is easily mapped for workarounds by hackers and is a source of endless complaints from those we try to protect.

7. Mobility. If you can’t get to your virtual desktop from your Windows or Mac OS, iPad, iPhone, Android, Linux, or whatever else comes next, then you’re not going to use it. Without that reach, BYOIT really doesn’t mean anything, since anything less delivers only minimal functionality. Plus, if you don’t go the virtual desktop route via your mobile devices, then you are stuck navigating a mobile device strategy to secure data and other things you really don’t want on those devices to begin with. The simple yet unpopular CIO decision so far? Ban them all! But you don’t have to. Combine a virtual strategy with some mobile armor techniques to control these endpoints and you’ll strike a happy medium with your corporate users.

8. Voice/Video. In an everything-virtual world that is quickly going cloud, CIOs realize that their voice system needs to follow them into the cloud or half of their business is going to be down. Plus, being tied to costly physical PRI and analog lines can be very limiting. Session Initiation Protocol (SIP) trunks and instant Direct Inbound Dial (DID) range provisioning and mobility are quickly becoming par for the course now. But when it all goes IP, who is listening in? And even when encrypted, who holds the keys?

9. User & Application Metrics. Particularly around cloud desktops and applications, the need to improve the user experience or application response time is top of mind. A few innovative cloud providers offer amazing dashboards and functionality around this and it’s a huge lure because normally such systems with that amount of eye candy are costly, thus putting them far beyond the reach of anything less than the most sophisticated enterprise corporations. Like anything cloud, it’s what you can do with something that is most appealing. So naturally, being able to see in near real-time – via a remote web dashboard -- all of the processes running on your 50,000 desktop users deployed in 12 countries worldwide is too good to pass up. Granular metrics also lend themselves to spotting anomalies quickly and easily. Such anomalies are usually security risks waiting to be mitigated.

10. Professional Services. Even the large corporations with tons of staff and huge budgets are finding they can’t execute fast enough. IT was built in silos, where, for ages, a server team, desktop team, application team, storage team, and others all operated independently. But today’s technologies must leverage all of these silos simultaneously or they just won’t work. Cloud architects are a rare breed of multi-certified individuals who usually have one or more certifications in each of these technology silos so that they can deliver a comprehensive solution that scales beyond enterprise to cloud-sized infrastructures wherein an infinite number of even the largest enterprises are hosted. Don’t be afraid to get a second pair of eyes and a new range of tools to parse the vast amount of security logs and other data your infrastructure produces. This could save your business a lot of public humiliation later.

Cloud Standards or Wild, Wild West?
OpenStack, Microsoft Federation, Open Virtualization Format (OVF) and other emerging standards are increasingly being adopted by various cloud providers. Keep an eye on how agnostic, open and API-friendly your cloud provider is. Open standards are often the most secure since they are tested by a wide community of people with a vastly diverse set of skills.

The Sky’s the Limit: A Look at Dell’s Cloud

Dell has been a major player in the cloud space, but from a hardware infrastructure perspective, providing the driving force behind many private and public cloud offerings around the world. The major difference for Dell is that we are now entering the cloud arena with the ability to deliver hardware and services in a consumable fashion that meets our customers’ needs. Why now? Network speeds have increased dramatically, linking the world together in a way that just was not possible before; the types and quality of cloud service providers have risen consistently; businesses have grown more comfortable placing their crown jewels – their data – outside the traditional four walls, with many CIOs now considering hosted environments to be more secure and stable than their own due to SLAs and overall customer commitments; and a growing number of CIOs see technology-as-a-service as having the most profound influence, providing scalable methods of managing the IT side of the world. Yes – now is the time for Dell to strengthen its position in this market, facilitating our continuing transition from a hardware and services company into a solutions organization that can bring best-of-breed products – hardware, services, application modernization, security, integration and cloud-based offerings – to our customers, helping to transform businesses and enable new value.

KB: Let’s talk about bringing the cloud down to earth. What solutions and services does Dell offer to not only help organizations maximize their IT investment but also meet their broad-based business demands?

EC: As I mentioned, and probably will again, Dell already provides a very robust set of offerings for delivering cloud infrastructure to its customers. In fact, 18 out of the top 25 Internet companies on the planet use Dell hardware as their backbone for their day-to-day services, so we understand the cloud – from a hardware perspective, a services perspective and an overall infrastructure perspective.

In addition, our IT management Software-as-a-Service [SaaS] offerings – a robust set of tools focused on email management, distributed device management, backup, recovery and encryption, monitoring, and alerting – allow us to fulfill the needs of customers in areas such as maintaining cloud-based disaster recovery for email. In other words: simple-to-set-up email continuity, archive and security solutions. If you are upgrading Exchange, dealing with email system outages or needing to simplify your disaster recovery process, it pays to have a cloud ‘flip of the switch’ failover system at your fingertips.

We also have some great services, such as SecureWorks, an amazing security offering that allows us to provide additional security controls for our customers’ organizations, such as monitoring of network logs, security devices and other key assets, 24/7 support with alerts, and vulnerability and web application scanning. We also have InSiteOne, which gives us the ability to provide data storage, archive and disaster recovery for medical data and information, along with an integration service called Boomi, which allows us to integrate data across new and legacy applications in a simplified fashion by literally taking information from one application to another in a cloud-enabled, automated, drag-and-drop fashion. Boomi and SecureWorks are my favorite acquisitions because when I talk about the cloud, the two biggest concerns we hear from our customers revolve around security and legacy system interoperability: these services solve both better than any other offering in the market.

In the near future, we’ll be offering public cloud offerings that will give us the ability to provide infrastructure as a service to manage elastic demand needs, burst capacity for peak and season demand as well as scalable infrastructure for services providers.

The reality is that we think there’s a lot of hype in the cloud today. What Dell brings is a pure backbone of how to approach the cloud effectively so that our customers feel comfortable about their journeys. We’re finding that every customer’s journey is a little different. Slowing down to completely understand our customers’ ultimate goals and requirements will help us partner together to find the best way to manage their growing data management needs, ever changing compute workloads, applications needs, integration bursting requirements and the associated services to bring it all together.

KB: What exactly does the term ‘cloud’ mean to consumers these days?

EC: Cloud is being used as more of a marketing term than an actual offering, so when everybody talks about cloud, they tend to use it fairly self-servingly. That’s completely understandable, and I am certainly guilty of this too. The challenge with the word ‘cloud’ is that it can have so many meanings depending on the lens through which one is looking.

I truly believe that Dell sets itself apart because it has the right set of integrated solutions for its customers. Whether it’s the on or off premise cloud hardware infrastructure, the consulting, the wraparound services, or the application modernization that’s needed to run an organization’s backbone, Dell provides solutions in a way that other providers cannot. Of course, the cloud is not just about savings; it’s also about bringing together a complex environment and making it easier for our customers to manage it, simplifying business processes and enabling an IT delivery model to help transform any organization.

KB: Will Dell’s cloud solutions enable businesses to deploy and manage private, public and hybrid cloud infrastructures?

EC: The short answer is ‘yes.’ We have hardware that supports the cloud infrastructure from our Dell Data Center Services [DCS] organization, which allows us to provide the right backbone to our customers through hardware solutions, designed specifically for the cloud.

From a build and operate perspective, we certainly offer all sorts of tools across the infrastructure. For example, we have a self-service Virtual Integrated System [VIS] director program, which allows us to manage virtual environments in a clean, effective way.

These solutions and many more combined with Dell Cloud Solutions HW, IaaS [Infrastructure as a Service], SaaS [Software as a Service], Security, Integration and Consulting give us the ability to deliver on all combinations of hybrid cloud solutions.

KB: Regardless of a business’s size, what is the common denominator when it comes to designing, building and deploying highly efficient cloud solutions?

EC: It really depends where the customers are in their journey. For many customers, consolidating their environments into a centralized location and reducing operating expenses and capital expenditure is a huge step in achieving their goals; I mean, who would not want this? They also see that these services are designed to bring scalability to their staff, so their resources can be refocused on other mission-critical business activities, providing a whole bunch of savings and, more importantly, efficiency. The challenge here is that it does take introspection from customers to identify those parts of their business that will bring them the best results.

Then, there are other customers who are just looking to start leveraging cloud based services and have specific areas of pain where help is needed, so it’s really about a maturity scale and understanding where your customers are on their own journeys.

Dell’s customers are in many different areas. Some are well down the path, have a defined cloud strategy and understand where they can increase savings and how they want to utilize the cloud, while others are still in discovery mode and are looking for that trusted advisor who can come in and help them understand how their environment works and how it will work in the cloud. These customers can use a crawl, walk, run methodology, knowing where the low-hanging fruit is and where they can get quick, value-added help. Customers ultimately want to learn what to move to the cloud to create more of a leveraged environment that should bring with it optimum efficiency. We have a great Cloud Consulting service that utilizes a clear methodology to assess cloud readiness and can help through all aspects of defining an organization’s cloud strategy.

KB: How important is it to consolidate business applications onto one database grid to enable increased scalability?

EC: I don’t think it’s critical that it’s all on a single platform; it’s like saying that my CRM system, HR system and email server all need to be in the same data center to work. They do not in today’s world. With the ability to integrate applications across the cloud, customers can lean on the right expertise to maximize their business. Many companies today outsource payroll services, use SaaS-based sales tools such as salesforce.com, and use online travel booking services, for example. Why? Because these providers can typically do it better and more efficiently than businesses can do it in house.

It’s about picking the pockets of expertise that allow your company to be the most efficient. For instance, if I had a small liquor distributorship and I decided I wanted to become the biggest liquor distributorship on the planet, well, the first thought in my mind isn’t about IT and the consolidated systems and my backend billing and procurement systems; it’s about my relationships with my vendors and my customer base. Essentially, the IT part of it becomes an afterthought. So, for organizations looking to the cloud, our recommendation is that you go down kind of a diverse path, which allows us to determine what we can do to build and operate, whether that is through a partner or an internally supported organization. Customers need to ask themselves how they can develop and integrate into a program that will not only allow them to build out their cloud platforms but also truly consume and deliver them.

Our goal is to provide a holistic view to our customers, so that when they’re figuring out what their journeys look like, they can have the right competitiveness, the right control and the right choice based on the types of services for which they are looking.

Dell recommends to its customers that they first investigate. This means understanding what’s in their environments and determining what would make the most sense to move to the cloud. Experiment! Look to things where you think you’re going to gain some efficiency within the organization and then figure out how you adopt.

KB: Do you anticipate a single giant emerging and claiming the marketplace in the cloud space? Also, why, in a sky crowded with cloud providers, should businesses entrust their environments to Dell?

EC: As far as whether there will be one monolithic owner of the cloud, I don’t think so; I think that the ingenuity and resourcefulness of the industry will allow the market to continue to open and introduce additional wraparound services.

With regards to where we’re at in the cloud today, the industry is still very young. When you look at the different eras of computing – the mainframe era, the PC client server era, the Internet era and the virtual era – all of these changes have brought about new applications, new resources and new ways of adapting to the business.

Dell wants to be the cornerstone of that and we offer a full foundation of cloud services that allows our customers to have choices in their journeys, to pick and choose those services, those applications and those cloud-based tools that allow them to become more efficient in their space and do more with fewer resources.

So I see this as a journey that everyone is going to go through and certainly Dell is putting its footprint down very hard right now. We’re opening up 10 new data centers around the planet with an investment of more than a billion dollars. We want to take a good portion of the market share by offering the right tools and key strategic partner offerings to our customers.

The fact is that it’s not just about being there first; it’s about offering that right solution and that’s what we’re focused on. We want to make sure our solutions are wrapped around offerings that make it easier for our customers to achieve success however they need to define it. Of course, we must remember, it’s not just about savings; it’s also about bringing together a complex environment and making it easier for our customers to manage through it.

Legal Issues in the Cloud: Exploring Business Continuity, Liability and SLA-related Issues

Many companies are familiar with ‘e-discovery’ and have data retention, storage and destruction policies in place that apply in the event of litigation. If a cloud customer is sued, or there is the threat of litigation, the customer may have to initiate a ‘litigation hold’ to preserve documents, including electronic documents and any metadata in the documents. This could present a challenge in the cloud if the customer’s data is commingled with that of other clients or if the customer’s data is stored on parallel servers. Cloud customers should determine the vendor’s ability to prevent the destruction, alteration or mutilation of customer data in the vendor’s possession, as well as the vendor’s search capabilities for the data. Cloud customers should also make sure that their corporate policies and procedures account for any data in the cloud. Do the data retention and destruction policies of the cloud vendor align with those of the customer?

As for the vendors, they need to develop a process for dealing with e-discovery requests and should provide notice to their customers promptly (within hours, not days) of any subpoena or other legal process seeking access to the customer’s data. Vendors may also need to provide the customer with access to its logs and reports to verify the security, integrity and chain of custody of the customer’s data.

Likewise unveils software to give companies better control of unstructured data

Likewise announced Thursday new software to better access, secure, manage and audit unstructured data across multiple platforms – financial files, medical records, office documents, media and big data files, which account for nearly half of all stored information.

Companies have been limited in their ability to control unstructured data across multiple platforms. Likewise Data Analytics and Governance, the new application available now as a public software beta, gives organizations greater visibility into their unstructured data for improved security, auditing and compliance.

The Likewise Storage Services platform, used by such OEM storage vendors as HP and EMC Isilon, offers a consistent security model for file-based access and cross-platform, unified storage across physical, virtual and cloud environments. Likewise Storage Services provides integrated identity and access management, as well as secure access to data from Windows, Unix and Linux systems. Supported protocols include SMB/CIFS 1.0, 2.0, 2.1, NFS 3.0, and a RESTful API. Likewise Storage Services is available with a commercial license from Likewise Software.

“With 40 percent of unstructured data classified as sensitive and only 14 percent of organizations with a plan for managing that data, it is a critical and growing challenge to secure and govern data, and extract information,” said Barry Crist, CEO, Likewise. “Likewise intersects identity, security and storage to harness and secure data, track and manage who has access to what data, and free its use to make the most of it.”

Likewise Data Analytics and Governance enables organizations to implement a set of automated best practices to secure and manage unstructured data. The application uses analytics to contextualize data with user identity, sensitivity, and other information to mitigate risks, reduce costs and create value.

The software can help organizations understand performance and usage across storage pools, categorize unstructured data to create new applications or lines of business, and exploit data to maximize revenue. Companies can consolidate reporting across data silos, enforce consistent access policies, and manage entitlements from a single web console. The result is a global hierarchical view of an organization’s unstructured data that can identify and remediate root causes of security, performance and access issues.

"The problem with unstructured data has grown exponentially over time. It can seem insurmountable, but companies must get their arms around the sensitive data contained in these files,” said Ginny Roth, analyst, Enterprise Strategy Group. “Without the ability to have some glimpse into this data in the wild, companies will be increasingly vulnerable to high profile breaches."

The new Likewise application integrates with the Likewise Storage Services platform used by OEM network attached storage (NAS) vendors such as HP and EMC-Isilon, and has adapters that support NetApp, EMC-Celerra and other NAS filers.

The application monitors and audits usage, ownership and classification of data with dashboards and custom reports; manages and provisions entitlements from a single, central location; generates reports that demonstrate compliance, including reports for HIPAA, ITAR, PCI DSS, Sarbanes-Oxley, GLBA and FISMA; and receives real-time event alerts for policy-driven exception monitoring. It also uses analytics tools to obtain real-time and historical information on users and groups, file security settings, data sensitivity and access patterns; centrally views users, groups and netgroups in Microsoft Active Directory, LDAP or Sun Microsystems’ Network Information Service (NIS); identifies all CIFS and NFS users and workloads using NAS controllers; accepts events and log data from various sources through a predefined RESTful API; provides support for global environments and storage arrays with high workloads; and feeds information to existing SIEM or reporting tools. It works with NAS devices built on the Likewise Storage Services platform, as well as NetApp and EMC-Celerra NAS devices.

OCZ Technology debuts Talos 2 enterprise SAS 6G SSD line

OCZ Technology Group Inc., provider of high-performance solid-state drives (SSDs) for computing devices and systems, announced this week the Talos 2 Serial Attached SCSI (SAS) SSD Series, the follow-up to the high performance, high capacity Talos Series previously available only in a 3.5 inch form factor.

With capacities up to 1TB now available in a compact 2.5-inch form factor, Talos 2 offers increased I/O performance and scalability in enterprise storage environments. Talos 2 leverages OCZ Virtualized Controller Architecture 2.0 technology, which implements a complex command queuing structure with unique queue balancing algorithms to provide exceptional performance. Talos 2 SSDs deliver superior random transactional performance of up to 70,000 4K IOPS and feature improved mixed workload (75 percent read, 25 percent write) performance of up to 42,000 8K IOPS.

Unlike many SAS SSDs, the Talos Series is dual-ported to offer superior data integrity and increased performance, along with delivering an enterprise feature-set including DataWrite Assurance Protection in case of sudden power loss. Talos 2 also includes the option to enable T10-DIF (Data Integrity Field), allowing for the insertion of 8 bytes of additional data during transfers to ensure complete data integrity.

“The Talos 2 SAS solid state drives expand on the original series by offering enterprise customers superior performance, reliability, and density all in a compact footprint,” said Ryan Petersen, CEO of OCZ Technology Group. “The Talos 2 enterprise SSDs are optimized for the most demanding storage systems and provide clients with an easy to deploy solution that vastly improves application performance over traditional SAS based HDDs.”

To address the complete spectrum of applications, Talos 2 SSDs are available in 100GB to 1TB capacities, in MLC, eMLC, and SLC NAND configurations. OCZ is now sampling Talos 2 to strategic customers and the drives will be made available to SMB and enterprise clients through OCZ's global business-to-business channel.

Winchester debuts FlashServer HA-4000 line of high availability servers

Winchester Systems Inc., a data storage solutions provider, announced this week a new line of high availability servers. The new FlashServer HA-4000 Series offers two servers in one unit that automatically failover and provide 99.999 percent system uptime without the need for complex clustering software in Windows, Linux and VMware environments.

FlashServer HA Redundant Failover Servers address planned and unplanned downtime for critical applications. FlashServer HA delivers continuous uptime through its fully redundant modular hardware featuring Intel Xeon processors with up to six cores and up to 96 GB memory each. These servers provide continuous availability through hardware redundancy in all components: CPU, memory, motherboards, I/O, hard disk drives, power supplies and cooling fans.

Two servers in one unit run applications simultaneously so if one fails, the other takes over immediately with no lost data and no delays common to cluster solutions. Only one set of software licenses is needed. No downtime is needed for routine operating system and application updates as one server at a time can be taken off-line and updated.

Failover with the FlashServer HA is automatic and transparent to the OS, virtualization software, applications, and users. The environment can survive a drive, motherboard, CPU, RAM, bus, power supply and fan failure. The result is a realizable 99.999 percent uptime without clustering. For the utmost in reliability and to protect against both hardware and software failures, FlashServer HA servers work exceedingly well in cluster environments, delivering a double layer of protection from fault-tolerant servers and cluster failover, while clustering provides load balancing and other benefits.
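
For a sense of scale, 99.999 percent ("five nines") uptime leaves a downtime budget of only a few minutes per year; a quick back-of-the-envelope calculation:

    public class AvailabilityMath {
        public static void main(String[] args) {
            double availability = 0.99999;             // "five nines"
            double minutesPerYear = 365.25 * 24 * 60;  // roughly 525,960 minutes
            double allowedDowntime = (1 - availability) * minutesPerYear;
            // Prints about 5.26 minutes of allowed downtime per year.
            System.out.printf("Allowed downtime per year: %.2f minutes%n", allowedDowntime);
        }
    }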

To run in redundant mode, the HA Server requires only one copy of the operating system and application software license, and no professional services to set up or configure the server.

FlashServer HA installs like any other standard Intel server, saving on professional services and reducing complexity and need for specialized staff to support complex cluster environments. FlashServer HA implements redundancy at the hardware level, reducing failures to automatic notifications and indicator light status. This approach reduces the complexity associated with other forms of high availability such as clustering systems. It also simplifies system management and recovery operations, since administration and configurations are performed as a single system view.

It also eliminates the minutes of downtime a cluster failover requires, and avoiding data loss provides substantial recapture of otherwise lost revenue, often measured in thousands of dollars per minute.

"Everyone has at least one application where downtime is just unthinkable," according to Mr. Joel Leider, the company's chief executive officer. "FlashServer HA is the solution -- just replace he standard Intel server with FlashServer HA and get the needed protection," he added.

AppNeta improves performance management capabilities with expanded NetFlow and packet capture service

AppNeta offers network performance management solutions required to drive exceptional application performance across all data center, cloud, remote office and mobile environments. AppNeta’s award-winning PathView Cloud solutions leverage a zero administration, cloud-based service to meet the performance demands of distributed network infrastructure and mainstream network-dependent applications including unified communications, cloud services and virtual service delivery. With more than 1,000 customers worldwide, AppNeta provides end-to-end performance insight to network engineers and IT outsourcers, enabling predictable and efficient delivery of business-critical application services from wherever they originate to wherever they are consumed.

PathView Cloud offers IT organizations unmatched breadth of insight and time to value, with integrated capabilities that can be deployed across any IP infrastructure, at any location.

PathView Cloud now enables users to expand the value of Netflow and packet capture on any network without additional, high-end network equipment to provide visibility. The PathView m30 microAppliance includes integrated pass-through capability, remotely generating Netflow records and packet captures.

If there is an issue with the hardware or software, the appliance “fails to wire,” restoring the network connection automatically, with no manual intervention, in milliseconds. This capability is typically available only in larger, more expensive appliances.

Eclipse Shortcuts

1) Ctrl + T for finding class even from jar
2) Ctrl + R for finding any resource (file) including config xml files
3) Ctrl + 1 for quick fix
4) Ctrl + Shift + o for organize imports
5) Ctrl + / for commenting, uncommenting lines and blocks
6) Ctrl + Shift + / for commenting, uncommenting lines with block comment
7) Ctrl + o for quick outline going quickly to method
8) Selecting class and pressing F4 to see its Type hierarchy
9) Alt + right and Alt + left for going back and forth while editing.
10) Ctrl + F4 or Ctrl + w for closing current file
11) Ctrl + Shift + W for closing all files.
12) Alt + Shift + W for show in package explorer
13) Ctrl + Shift + Up and down for navigating from member to member (variables and methods)
14) Ctrl + l go to line
15) Ctrl + k and Ctrl + Shift +K for find next/previous
16) select text and press Ctrl + Shift + F for formatting.
17) Ctrl + F for find , find/replace
18) Ctrl + D to delete a line
19) Ctrl + Q for going to last edited place
20) Ctrl + T for toggling between super type and subtype
21) Go to other open editors: Ctrl + E.
22) Move from one problem (i.e. error, warning) to the next (or previous) in a file: Ctrl + . for next, and Ctrl + , for previous problem
23) Hop back and forth through the files you have visited: Alt + Left and Alt + Right, respectively.
24) Go to a type declaration: F3
25) CTRL+Shift+G, which searches the workspace for references to the selected method or variable
26) Ctrl + Shift + L to view the list of all keyboard shortcuts (key assist)
27) Alt + Shift + j to add javadoc at any place in java source file.
28) CTRL+SHIFT+P to find closing brace. place the cursor at opening brace and use this.
29) Alt+Shift+X, Q to run Ant build file using keyboard shortcuts.
30) Ctrl + Shift + F for autoformatting

Friday 2 December 2011

Ethical Hacking Distance Learning Programme with CISE Certification!

While the Singularity is not to be confused with the astronomical description of an infinitesimal object of infinite density, it can be seen as a technological event horizon at which present models of the future may break down in the not-too-distant future when the accelerating rate of scientific discovery and technological innovation approaches a real-time asymptote. Beyond lies a future (be it utopian or dystopian) in which a key question emerges: Evolving at dramatically slower biological time scales, must Homo sapiens become Homo syntheticus in order to retain our position as the self-acclaimed crown of creation – or will that title be usurped by sentient Artificial Intelligence? The Singularity and all of its implications were recently addressed at Singularity Summit 2011 in New York City.

The future cometh: Science, technology and humanity at Singularity Summit 2011

In its essence, technology can be seen as our perpetually evolving attempt to extend our sensorimotor cortex into physical reality: From the earliest spears and boomerangs augmenting our arms, horses and carts our legs, and fire our environment, we’re now investigating and manipulating the fabric of that reality – including the very components of life itself. Moreover, this progression has not been linear, but instead follows an iterative curve of inflection points demarcating disruptive changes in dominant societal paradigms. Suggested by mathematician Vernor Vinge in his acclaimed science fiction novel True Names (1981) and introduced explicitly in his essay The Coming Technological Singularity (1993), the term was popularized by inventor and futurist Ray Kurzweil in The Singularity is Near (2005). The two even had a Singularity Chat in 2002.

Computer animation

Computer animation is the art of creating moving images via the use of computers. It is a subfield of computer graphics and animation. Increasingly it is created by means of 3D computer graphics, though 2D computer graphics are still widely used for low-bandwidth and faster real-time rendering needs. Sometimes the target of the animation is the computer itself, but sometimes the target is another medium, such as film. It is also referred to as CGI (computer-generated imagery or computer-generated imaging), especially when used in films.

To create the illusion of movement, an image is displayed on the computer screen and then quickly replaced by a new image that is similar to the previous image but shifted slightly. This technique is identical to how the illusion of movement is achieved with television and motion pictures. Computer animation is essentially a digital successor to the art of stop-motion animation of 3D models and frame-by-frame animation of 2D illustrations.

For 3D animations, objects (models) are built on the computer monitor (modeled) and 3D figures are rigged with a virtual skeleton. For 2D figure animations, separate objects (illustrations) and separate transparent layers are used, with or without a virtual skeleton. Then the limbs, eyes, mouth, clothes, etc. of the figure are moved by the animator on key frames. The differences in appearance between key frames are automatically calculated by the computer in a process known as tweening or morphing. Finally, the animation is rendered.

For 3D animations, all frames must be rendered after modeling is complete. For 2D vector animations, the rendering process is the key frame illustration process, while tweened frames are rendered as needed. For pre-recorded presentations, the rendered frames are transferred to a different format or medium such as film or digital video. The frames may also be rendered in real time as they are presented to the end-user audience. Low-bandwidth animations transmitted via the internet (e.g. 2D Flash, X3D) often use software on the end user's computer to render in real time, as an alternative to streaming or pre-loaded high-bandwidth animations.
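
To make the tweening step described above concrete, here is a minimal sketch that linearly interpolates a 2D position between two key frames. Real animation packages use far more sophisticated interpolation (easing curves, splines), so treat this as the simplest possible case.

    public class TweenSketch {

        /** Linear interpolation between two key-frame values. */
        static double lerp(double start, double end, double t) {
            return start + (end - start) * t;
        }

        public static void main(String[] args) {
            // Key frame 0: position (0, 0).  Key frame 1: position (100, 50).
            double[] from = {0, 0};
            double[] to = {100, 50};
            int inBetweens = 4; // frames the computer generates between the two key frames

            for (int frame = 0; frame <= inBetweens + 1; frame++) {
                double t = frame / (double) (inBetweens + 1);
                System.out.printf("frame %d: (%.1f, %.1f)%n",
                        frame, lerp(from[0], to[0], t), lerp(from[1], to[1], t));
            }
        }
    }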

3D computer graphics

3D computer graphics are graphics that use a three-dimensional representation of geometric data stored in the computer for the purposes of performing calculations and rendering 2D images. Such images may be for later display or for real-time viewing. Despite these differences, 3D computer graphics rely on many of the same algorithms as 2D computer vector graphics in the wire-frame model and 2D computer raster graphics in the final rendered display. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and primarily 3D applications may use 2D rendering techniques.

3D computer graphics are often referred to as 3D models. Apart from the rendered graphic, the model is contained within the graphical data file. However, there are differences. A 3D model is the mathematical representation of any three-dimensional object (either inanimate or living). A model is not technically a graphic until it is visually displayed. Due to 3D printing, 3D models are not confined to virtual space. A model can be displayed visually as a two-dimensional image through a process called 3D rendering, or used in non-graphical computer simulations and calculations.
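
To make the 3D rendering step concrete, here is a minimal sketch of a simple pinhole perspective projection that maps 3D points onto a 2D image plane. Production renderers do far more (model and view transforms, clipping, shading), so this only shows the core idea.

    public class ProjectionSketch {

        /** Projects a 3D point onto an image plane at distance focalLength from the camera. */
        static double[] project(double x, double y, double z, double focalLength) {
            // Points farther from the camera (larger z) land closer to the image centre.
            return new double[] { focalLength * x / z, focalLength * y / z };
        }

        public static void main(String[] args) {
            double focalLength = 1.0;
            double[][] cube = { {1, 1, 3}, {1, -1, 3}, {-1, 1, 3}, {-1, -1, 3},
                                {1, 1, 5}, {1, -1, 5}, {-1, 1, 5}, {-1, -1, 5} };
            for (double[] v : cube) {
                double[] p = project(v[0], v[1], v[2], focalLength);
                System.out.printf("(%.0f, %.0f, %.0f) -> (%.2f, %.2f)%n",
                        v[0], v[1], v[2], p[0], p[1]);
            }
        }
    }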

Engineer Develops Method To Combat Congenital Heart Disease In Children

Marsden’s work focuses on designing and using simulation tools to provide a way of testing new surgery designs on the computer before trying them on patients, much like, for example, engineers use computer codes to test new designs for airplanes or automobiles. Certain severe forms of congenital heart defects leave a patient with only one functional heart pumping chamber. These “single ventricle” defects are uniformly fatal if left untreated, and require a patient to undergo multiple heart surgeries, ending with a Fontan procedure.
In the Fontan surgery, the veins returning blood to the heart from the body are directly connected to the arteries that send deoxygenated blood to the lungs, forming a modified T-shaped junction. This bypasses the heart on the one side so that the resulting circulation puts the single pumping chamber to optimal use. Using models derived from MRI image data, Marsden has come up with a way to optimize a Y-Graft model for the Fontan procedure, which can help pediatric surgeons determine whether this procedure will benefit a patient, as well as determine how a patient’s heart will perform during moderate exercise.
An advantage of Marsden’s proposed Y-Graft design is that it can be optimized or modified for an individual patient by custom manufacturing the graft portion prior to surgery.
“Our goal is to provide a set of personalized tools that can be used in collaboration with surgeons to identify the best procedure for patients,” Marsden said.
Pediatric surgeons at Stanford University plan to use Marsden’s Y-Graft computer models for a Fontan procedure for the first time later this year. One of the pediatric cardiologists working with Marsden is Dr. Jeff Feinstein, an associate professor of Pediatrics (Cardiology) at Stanford University with a specialization in interventional cardiology, and director of the Vera Moulton Wall Center for Pulmonary Vascular Disease at Stanford.

'4D MRI' Technology Helps Predict Outcome Of Pediatric Heart Surgery

The technology, known as image-based surgical planning and developed with the help of pediatric cardiologists and pediatric surgeons at The Children’s Hospital of Philadelphia (CHOP) and Emory University, creates a three-dimensional model of the child’s heart with data from the child’s MRI scans at different times in the cardiac cycle, also called a 4D MRI. The models allow surgeons to visualize the direction of blood flow and determine any energy loss in the heart. So if a surgeon were planning a certain correction to an area of a child’s heart, a model created by the system would show the surgeon how well blood would flow through the newly configured heart.
The goal of the Georgia Tech/Emory project is to create a complete system that allows surgeons to get a detailed look at the child’s heart functions with the new MRI system, design surgical procedures for optimum post-operative performance and evaluate the heart’s performance with a sophisticated blood flow computer simulation.

Patients with this defect often undergo multiple surgeries to reconfigure the pulmonary and systemic systems in operations called Fontan repairs, a reconfiguration that diverts the blood flow coming to the right side of the heart directly to the lungs so that the heart no longer has to pump blood to the lungs. Staged over several years, these surgeries are a common, but not always successful, option used for treating a single-ventricle defect.
After a less-than-optimal operation, children sometimes experience a reduced capacity to perform physical activities and may experience blood clotting and ventricle arrhythmias. The Georgia Tech/Emory surgery planning system could eliminate the need for additional surgeries by optimizing early surgeries.
“The research is meant to get at the root of the ‘failing’ Fontan, investigating why these pumping chambers fail in the hopes of devising new strategies to give these children a second chance in life. Using advanced imaging and bioengineering tools, the project hopes to describe how blood flows in this type of circulation and how this blood flow might be altered to extend the life of the patients,” said Mark Fogel, M.D., director of cardiac MR in the Cardiac Center at Children’s Hospital and a key collaborator on the project

Predicting Successful Surgeries

Bioengineers Combine Mathematical Equations, Data to Simulate Surgeries

STANFORD, Calif.--People are as different on the inside as they are on the outside, making it difficult to predict which heart surgery will help which patient. Now, a new, high-tech approach may predict which patients will and will not have successful surgeries.
Heart attack survivor David Lesesky says, "When I started having problems, I just didn't want to take the chance." He didn't take a chance: Lesesky made it through the heart attack, survived surgery and is doing just fine. The outcome, however, is not always the same: each patient and each surgery brings its own risks.

"The question we have for this patient is that would she benefit from a procedure -- bypass procedure -- to improve blood flow down to the legs?" Taylor says after examining a patient's 3D model on the computer. The yellow on the model shows the potential bypass path. When blood flow is simulated, it's revealed that two of the vessels going into the legs were clotted off -- the surgery would not have been successful.
"What it will mean for the patient is fewer operations -- conceivably more successful operations," Taylor says. And it will help keep hearts beating -- longer.
The computer model is being tested right now, retrospectively, on patients who are already planning to have surgery. So far results show it will be successful in predicting the outcome of cardiovascular surgeries

Cell Phone Viruses

Software Engineers Allow PCs to Scan Mobile Devices for Viruses

PITTSBURGH--It takes constant vigilance to combat the viruses that persistently lurk in cyber space. While we all know our PCs are vulnerable to data loss, you might be surprised to find out so is your cell phone! A new technology could be the key to ferreting out electronic viruses forever.
That fancy cell phone you use to surf the Web and check e-mail could be infected with a computer virus.
"Our cell phones are becoming more and more sophisticated to look more and more like regular computers, and so they can also acquire viruses," says Adrian Perrig, an assistant professor of engineering at Carnegie Mellon University in Pittsburgh.
While most of us take steps to safeguard our PCs, cell phone viruses are so new you might not even know about them. Engineers at Carnegie Mellon University found a key to detecting even the most evasive electronic bugs.
Perrig says, "Our technique is called SoftWare-based ATTestation, which allows an external host -- like the laptop computer or even another cell phone -- allows them to look into the memory of a device in a way that even malicious code executing on the device cannot hide."
Traditional anti-virus programs scan for a list of known threats, but if a threat is not on the list, it is not detected. With software-based attestation -- SWATT for short -- there is no virus roster. Instead, it scans the memory of a handheld device: because all viruses must reside in memory, any deviation from the expected memory contents signals a potential virus.
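The published SWATT protocol relies on a carefully timed, pseudo-random walk over the device's memory so that malicious code cannot answer quickly enough to fake the result. The Python sketch below shows only the basic challenge-response idea under simplifying assumptions -- the keyed digest, the toy firmware image and the function names are illustrative, not the actual SWATT algorithm.

    import hmac, hashlib, os

    def attest(memory_image: bytes, nonce: bytes) -> bytes:
        # Keyed digest over the device's memory, seeded with a fresh nonce so
        # the device cannot replay an earlier answer.
        return hmac.new(nonce, memory_image, hashlib.sha256).digest()

    # Verifier (a laptop or another phone) holds a known-good copy of the memory.
    known_good = b"\x90" * 1024          # toy stand-in for a firmware image
    nonce = os.urandom(16)               # fresh challenge for this attestation
    expected = attest(known_good, nonce)

    # The device answers with a digest of the memory it is actually running.
    infected = known_good[:512] + b"\xeb\xfe" + known_good[514:]   # toy modification
    reported = attest(infected, nonce)

    print("verified" if hmac.compare_digest(expected, reported) else "deviation: possible virus")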
Right now, SWATT only detects the bugs; once the researchers figure out how to exterminate them as well, it will go on the market. In addition to computers, PDAs and cell phones, the software can detect viruses in any communication device, even the navigation systems of luxury cars.
There are several different ways a computer can become infected. A virus is a small piece of software that attaches itself to an existing program. Every time that program is executed, the virus starts up too, and can reproduce by attaching itself to even more programs. When contained in an email, the virus usually replicates by automatically mailing itself to dozens of people listed in the victim's email address book.

Security at Your Fingertips

Electrical Engineers Develop Pocket-Size Fingerprint Recognition

A new pocket device reads fingerprints and validates them by wireless access to a computer. With this biometrics system, users can avoid using passwords, and get simpler and more secure access to bank balances, credit cards, and even buildings.

Dominic DeSantis dares anyone to try and hack into his personal PC files. "I have different files on my desktop that you can't open without passwords," he says.
Tough password tactics may slow down a cyber thief, but they are not foolproof. Now, electrical engineers have developed a new security device that uses a one-of-a-kind access code -- your fingerprint.
"It becomes a personal identification device that you carry with you, and the device only works for you," says Barry Johnson, an electrical engineer at Privaris, Inc., in Fairfax, Va. "The fingerprint, being something that you are, is something you that you will not forget."
With the touch of a finger, online access is a cinch for credit card purchases, viewing bank balances or checking e-mail -- all without remembering or typing a single password or PIN. Once you scan your finger, the device compares the scan to your fingerprint data, or biometrics, already stored on the device.
"The ability to not only store the fingerprint on the device, and only on the device, but to do that securely is a unique feature of the device," Johnson says. He says the new device can work with existing security systems and also works for access into buildings.
It's a unique way to help consumers like DeSantis stay secure with something he'll never lose.
BACKGROUND: "Spoofing" is the process by which individuals test a biometric security system by introducing a fake sample. This can help companies improve those systems in order to better protect their information and employees. The goal is to make the authentication process as accurate and reliable as possible.
HOW IT WORKS: Digits from cadavers and fake fingers molded from plastic, or even play dough or gelatin, can potentially be misread as authentic by biometric security systems. Electrical and computer engineers are addressing this issue by trying to "spoof" such systems in hopes of designing more effective safeguards and countermeasures. One such study found a 90 percent false verification rate; the scanning machines could not distinguish between a live sample and a fake one. The system was modified to detect the pattern of perspiration from a live finger, which reduced the false verification rate to less than 10 percent.

Computer security

Computer security is a field of computer science concerned with the control of risks related to computer use. The traditional approach to this objective is to create a trusted and secure computing platform, designed so that agents (users or programs) can perform only the actions that have been allowed.

This involves specifying and implementing a security policy.
The actions in question can be reduced to operations of access, modification and deletion.
Computer security can be seen as a subfield of security engineering, which looks at broader security issues in addition to computer security.
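A minimal sketch of that idea -- an explicit policy stating which agent may perform which of the access, modification and deletion operations on which resource, with everything else denied by default -- is shown below. The agents, resources and rules are invented purely for illustration.

    # Every request is checked against an explicit policy; anything not
    # explicitly allowed is denied.  The entries below are made up.
    POLICY = {
        ("alice", "payroll.db"): {"access", "modification"},
        ("bob",   "payroll.db"): {"access"},
        ("alice", "audit.log"):  {"access"},
    }

    def is_allowed(agent: str, resource: str, operation: str) -> bool:
        return operation in POLICY.get((agent, resource), set())

    print(is_allowed("bob", "payroll.db", "access"))       # True
    print(is_allowed("bob", "payroll.db", "deletion"))     # False: never granted
    print(is_allowed("mallory", "payroll.db", "access"))   # False: unknown agent, default deny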

Cyber security standards

Cyber security standards are security standards which enable organizations to practice safe security techniques in order to minimize the number of successful cyber security attacks.


Security engineering is the field of engineering dealing with the security and integrity of real-world systems. It is similar to systems engineering.

Two New Publications Provide a Cloud Computing Standards Roadmap and Reference Architecture

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of computing resources, including servers, data storage, applications and services. NIST is responsible for accelerating the federal government's secure adoption of cloud computing by leading efforts to develop standards and guidelines in close consultation and collaboration with standards bodies, the private sector and other stakeholders, including federal agencies.
To produce the NIST Cloud Computing Standards Roadmap (NIST Special Publication 500-291), the NIST Cloud Computing Standards Working Group compiled an "Inventory of Standards Relevant to Cloud Computing" that will continue to be updated. The working group includes volunteer participants from industry, government and academia.
The working group found a number of gaps in available standards ranging from fundamental issues such as security and privacy protection to user interfaces and important business-oriented features. The group also identified standardization priorities for the federal government, particularly in areas such as security auditing and compliance, and identity and access management.
NIST Standards Working Group co-convener Michael Hogan said, "NIST SP 500-291 encourages federal agencies to become involved with developing specific cloud computing standards projects that support their priorities in cloud computing services, to move cloud computing standards forward."

Cloud Computing for Administration

Cloud computing is a tempting development for IT managers: companies and organizations no longer have to acquire servers and software themselves; instead, they rent the capacity they need for data, computing power and applications from professional providers and pay only for what they use. In Germany, it is primarily companies that are turning to cloud computing, transferring their data, applications and networks to server farms at Amazon, Google, IBM, Microsoft or other IT service providers. In the space of just a few years, cloud computing has grown into a market worth billions, one of considerable importance to Germany as a business location.
"There are considerable reservations about cloud computing in the public-administration area. First, because of the fundamental need to protect citizens' personal data entrusted to public administrators; but also the potential of outsourcing processes are frightening in the eyes of the authorities. Due to fear of the loss of expertise, for one, and for another because the law requires that core tasks remain in the hands of administrators." This is how study co-author Linda Strick of FOKUS summarizes the status quo.
The study points out that cloud-specific security risks do in fact exist, but that these can be fully understood and analyzed. "There is even reason to assume that cloud-based systems can actually fulfill higher security standards than classic solutions," Strick explains. To assist administrators with the introduction of the new technology, FOKUS researchers in the eGovernment laboratory are developing application scenarios for seamless, and hence interoperable, use of cloud-computing technologies without breaks between different media or systems.
A cockpit for security
To permit companies and public authorities to acquire practical experience with the new technology and test security concepts, experts from the Fraunhofer Institute for Secure Information Technology SIT in Munich have created a Cloud Computing Test Laboratory. Along with security concepts and technologies for cloud-computing providers, researchers there are also developing and studying strategies for secure integration of cloud services in existing IT infrastructures.
"In our test lab, function, reliability and interoperability tests, along with individual security analyses and penetration tests, can be carried out and all of the developmental phases considered, from the design of individual services to prototypes to the testing of fully functional comprehensive systems," notes Angelika Ruppel of SIT in Munich.
Working with the German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik, BSI), her division has drafted minimum requirements for providers and has developed a Cloud Cockpit. With this solution, companies can securely transfer their data between different cloud systems while monitoring information relevant to security and data protection. Even the use of hybrid cloud infrastructures, with which companies draw on both internal and external computing power, can be securely controlled using the Cloud Cockpit.
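The internals of the Cloud Cockpit are not described here, so the snippet below is only a hedged sketch of one principle behind this kind of tooling: data is encrypted on-premises before it is handed to any cloud provider, so the providers only ever hold ciphertext while it moves between cloud systems. It uses the Fernet recipe from the widely used Python cryptography package; the workflow and data are invented for illustration.

    # Sketch: client-side encryption before any transfer to or between clouds.
    # Key management is deliberately simplified; in practice the key would be
    # kept in an on-premises key store or HSM, never with the cloud provider.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # stays on-premises
    cipher = Fernet(key)

    record = b"personal data entrusted to the administration"
    ciphertext = cipher.encrypt(record)  # this is what gets uploaded or migrated

    # ...later, possibly after the ciphertext has been moved to another cloud...
    assert cipher.decrypt(ciphertext) == record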

Blue Skies Thinking for Cloud Security?

As cloud computing moves data and services from local systems to remote centres, the question of security for organisations must be addressed. A research paper published in the International Journal of Services and Standards suggests that a cloud-free security model is the best way forward and will circumvent the fact that cloud service providers are not yet meeting regulations and legal standards.


In the early days of computing, users accessed resources using desktop terminals connected to a mainframe. The personal computer changed all that, uniting the input, processing and output devices in a single machine.
Recently, however, the merits of separating the computers on our desks and in our pockets from the processing workhorses have re-emerged, especially as broadband and mobile networks have become faster. Today, countless businesses and individuals access their email and documents on remote, web-based systems. Social networking tools such as MySpace, Facebook and Twitter maintain data on remote servers accessed through a web browser. Data storage and backup, too, are often handled on remote servers.

Yunis has developed a model to assess all the various risks associated with relocating an organisation's data and services to remote computer servers in the clouds. "The model can be applied in assessing the security issues emanating from cloud computing, identifying the security countermeasures needed to address these issues and coordinating the efforts of the various participants to enhance information security in organisations adopting cloud computing," explains Yunis.
She points out that there are six important issues that must be addressed to ensure an organisation or individual's use of cloud computing is not compromised.
The first is "resource sharing. On shared services, there is the possibility that another user on the same system may gain access inadvertently or deliberately to one's data, with potential for identity theft, fraud, or industrial sabotage. Second, because data is held offsite data ownership might be compromised. Third, the intrinsic latency of transferring data back and forth for processing in the cloud means that some users might lower encryption levels to cut send and receive delays, giving rise to additional security concerns. Fourth, the issue of Service Line Agreements (SLAs) may lead to an organisation being refused access to data and services if there are technical, security or commercial disagreements between them and the cloud service provider. Fifth, data might be lost or otherwise compromised because of a technical or other failure on the part of the provider. Finally, negative aspects of interoperability and portability in which failure or attack of a virtual component in the processing and storage may compromise security.
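Yunis's model itself is set out in the paper; purely as a hedged illustration of how such an assessment can be made quantitative, the sketch below assigns each of the six issue areas an invented likelihood and impact rating and ranks them by the product, a common simple risk-scoring scheme.

    # Toy risk-scoring pass over the six issue areas listed above.  The 1-5
    # likelihood and impact values are invented; a real assessment would
    # elicit them from the organisation and its cloud provider.
    issues = {
        "resource sharing":                (4, 5),
        "data ownership offsite":          (3, 4),
        "latency / weakened encryption":   (3, 4),
        "SLA disputes blocking access":    (2, 5),
        "provider-side data loss":         (2, 5),
        "interoperability / portability":  (3, 3),
    }

    ranked = sorted(issues.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    for name, (likelihood, impact) in ranked:
        print(f"{name:32s} risk score = {likelihood * impact}")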

Cloud Computing Brings Cost Of Protein Research Down To Earth

The research, which appears online in the Journal of Proteome Research, is funded by the NIH's National Heart, Lung, and Blood Institute Proteomics Innovation Center at the Medical College. Proteomics is a biomedical research term used to describe the large-scale study of all the proteins expressed by an organism. It usually involves the identification of proteins and the determination of their modifications in both normal and disease states.
One of the major challenges for many laboratories setting up proteomics programs has been obtaining and maintaining the very costly computational infrastructure required to analyze the vast flow of proteomics data generated by mass spectrometry instruments, which are used to determine the elemental composition and chemical structure of molecules, according to senior investigator Simon Twigger, Ph.D., assistant professor of physiology.
"We're applying this technology in our Proteomics Center to study cardiovascular disease, the effects of radiation damage, and in our collaboration with the University of Wisconsin- Madison stem cell research group," he says.
With cloud computing making the analysis less expensive and more accessible, many more users can set up and customize their own systems. Investigators can analyze their data in greater depth than previously possible, making it possible for them to learn more about the systems they are studying.
"The tools we have produced allow anyone with a credit card, anywhere in the world, to analyze proteomics data in the cloud and reap the benefits of having significant computing resources to speed up their data analysis," says lead author Brian Halligan, Ph.D., research scientist in the Biotechnology and Bioengineering Center

Galaxy DNA-Analysis Software Is Now Available 'in the Cloud'


Galaxy -- an open-source, web-based platform for data-intensive biomedical and genetic research -- is now available as a "cloud computing" resource.

A team of researchers including Anton Nekrutenko, an associate professor of biochemistry and molecular biology at Penn State University; Kateryna Makova, an associate professor of biology at Penn State; and James Taylor from Emory University developed the new technology, which will help scientists and biomedical researchers to harness such tools as DNA-sequencing and analysis software, as well as storage capacity for large quantities of scientific data. Details of the development will be published as a letter in the journal Nature Biotechnology. Earlier papers by Nekrutenko and co-authors describing the technology and its uses are published in the journals Genome Research and Genome Biology.
Now, Nekrutenko's team has taken Galaxy to the next level by developing an "in the cloud" option using, for example, the popular Amazon Web Services cloud. "A cloud is basically a network of powerful computers that can be accessed remotely without the need to worry about heating, cooling, and system administration. Such a system allows users, no matter where they are in the world, to shift the workload of software storage, data storage, and hardware infrastructure to this remote location of networked computers," Nekrutenko explained. "Rather than run Galaxy on one's own computer or use Penn State's servers to access Galaxy, now a researcher can harness the power of the cloud, which allows almost unlimited computing power."

As a case study, the authors report on recent research published in Genome Biology in which scientists, with the help of Ian Paul, a professor of pediatrics at Penn State's Hershey Medical Center, analyzed DNA from nine individuals across three families using Galaxy Cloud. Thanks to the enormous computing power of the platform, the researchers were able to identify four heteroplasmic sites -- variations in mitochondrial DNA, the part of the genome passed exclusively from mother to child.
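The study's analysis was carried out with tools inside Galaxy Cloud, but the core idea behind spotting a heteroplasmic site is simple: a mitochondrial position is flagged when a minor allele appears in more than a small fraction of the reads covering it. The sketch below is a hedged illustration only -- the 1% threshold and the base counts are invented.

    # Flag a mitochondrial position as heteroplasmic when the reads that do not
    # match the major allele exceed a minor-allele-fraction threshold.
    MIN_MINOR_FRACTION = 0.01   # illustrative cutoff, not the study's parameter

    def is_heteroplasmic(base_counts: dict) -> bool:
        depth = sum(base_counts.values())
        if depth == 0:
            return False
        minor = depth - max(base_counts.values())
        return minor / depth >= MIN_MINOR_FRACTION

    print(is_heteroplasmic({"A": 4920, "G": 68, "C": 7, "T": 5}))   # True  (~1.6% minor)
    print(is_heteroplasmic({"A": 4995, "G": 3, "C": 1, "T": 1}))    # False (~0.1% minor)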
"Galaxy Cloud offers many advantages other than the obvious ones, such as computing power for large amounts of data and the ability for a scientist without much computer training to use DNA-analysis tools that might not otherwise be accessible," Nekrutenko said. "For example, researchers need not invest in expensive computer infrastructure to be able to perform data-intensive, sophisticated scientific analyses."

Mask-Bot: A Robot With a Human Face

What at first looks deceptively like a real talking person is actually the prototype of a new robot face that a team at the Institute for Cognitive Systems (ICS) at TU München has developed in collaboration with a group in Japan. "Mask-bot will influence the way in which we humans communicate with robots in the future," predicts Prof. Gordon Cheng, head of the ICS team. The researchers developed several innovations in the course of creating Mask-bot.
The projection of any number of realistic 3D faces is one of these. Although other groups have also developed three-dimensional heads, these display a more cartoon-like style. Mask-bot, however, can display realistic three-dimensional heads on a transparent plastic mask, and can change the face on-demand. A projector positioned behind the mask accurately beams a human face onto the back of the mask, creating very realistic features that can be seen from various angles, including the side.
Many comparable systems project faces onto the front of a mask -- following the same concept as cinema projection. "Walt Disney was a pioneer in this field back in the 1960s," explains Kuratate. "He made the installations in his Haunted Mansion by projecting the faces of grimacing actors onto busts." Whereas Walt Disney projected images from the front, the makers of Mask-bot use on-board rear projection to ensure a seamless face-to-face interaction.
This means that there is only a twelve-centimeter gap between the high-compression, 0.25x fish-eye lens with a macro adapter and the face mask. The CoTeSys team therefore had to ensure that an entire face could actually be beamed onto the mask at this short distance. Mask-bot is also bright enough to function in daylight, thanks to a particularly strong and small projector and a coating of luminous paint sprayed on the inside of the plastic mask. "You don't have to keep Mask-bot behind closed curtains," laughs Kuratate.

This part of the new system could soon be used in video conferences. "Usually, participants are shown on screen. With Mask-bot, however, you can create a realistic replica of a person that actually sits and speaks with you at the conference table. You can use a generic mask for male and female, or you can provide a custom-made mask for each person," explains Takaaki Kuratate.

French Digital Kitchen Is a Recipe for Success

From pear clafouti to croque monsieur, real-time cooking instructions for preparing each recipe are delivered in a similar way to an in-car sat nav.
Motion-sensor technology on the kitchen equipment and ingredients then helps track whether each step has been completed successfully.
Developed by language experts and computer scientists at Newcastle University, the kitchen breaks new ground by taking language learning out of the classroom and combining it with an enjoyable and rewarding real-life activity.
It is supported by the Engineering and Physical Sciences Research Council's Digital Economy Programme.
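Newcastle's implementation details are not given here, so the following is only a hedged sketch of the sat-nav-style step tracking just described: each spoken instruction is held until a sensor event confirms the current step, and only then does the next prompt play. The step texts and sensor event names are invented.

    # Toy step tracker: the next French prompt is issued only after a sensor
    # event confirms the current step.  Steps and event names are invented.
    RECIPE = [
        ("Cassez trois oeufs dans le bol.", "bowl_weight_increased"),
        ("Ajoutez la farine.",              "flour_container_lifted"),
        ("Melangez pendant deux minutes.",  "whisk_motion_detected"),
    ]

    def run_recipe(sensor_events):
        step = 0
        print("PROMPT:", RECIPE[step][0])
        for event in sensor_events:
            if step < len(RECIPE) and event == RECIPE[step][1]:
                step += 1
                if step < len(RECIPE):
                    print("PROMPT:", RECIPE[step][0])
                else:
                    print("Recipe complete!")

    run_recipe(["bowl_weight_increased", "flour_container_lifted", "whisk_motion_detected"])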

Human, Artificial Intelligence Join Forces to Pinpoint Fossil Locations

Using artificial neural networks (ANNs) -- computer networks that imitate the workings of the human brain -- Conroy and colleagues Robert Anemone, PhD, and Charles Emerson, PhD, developed a computer model that can pinpoint productive fossil sites in the Great Divide Basin, a 4,000-square-mile stretch of rocky desert in Wyoming.
The basin has proved to be a productive area for fossil hunters, yielding 50 million- to 70 million-year-old early mammal fossils.
The software builds on the satellite imagery and maps that fossil-hunters have used for years to locate the best fossil sites. It just takes the process a step further, Conroy says.
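The team's actual network architecture and training data are not reproduced here; the sketch below only illustrates the general approach -- train a small neural network on per-pixel satellite features labeled by whether known fossil localities occur there, then score new pixels -- using scikit-learn and randomly generated stand-in data.

    # Illustrative only: a small neural network scoring satellite pixels for
    # fossil-site potential.  The features, labels and the "productive" rule
    # below are randomly generated stand-ins, not the researchers' data.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(400, 6))                 # six pretend spectral bands per pixel
    y = (X[:, 0] + 0.5 * X[:, 3] > 1.0).astype(int)      # pretend "productive locality" labels

    model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    model.fit(X, y)

    new_pixels = rng.uniform(0, 1, size=(3, 6))
    print(model.predict_proba(new_pixels)[:, 1])         # probability each pixel is worth surveying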

 