
Wednesday, June 27, 2012

Top 5 Things The Cloud Is Not

At http://www.wired.com/cloudline/2012/06/top-5-things-the-cloud-is-not/

It's clear that the technology industry is moving from the PC era to the cloud era in several significant ways. While the cloud represents a new way for IT to deliver (and for end users to consume) IT applications and services, this transition also represents a significant change in how applications, services and systems are defined. The move to cloud computing is the most important technology disruption since the transition from mainframe to client-server, or even since Al Gore invented the internet. While industry veterans like Oracle's commander in chief have declared it a fad, this is a decade-long trend that is here to stay, and one that will define the next generation of IT.

The movement itself has been in play for the last decade; however, there continues to be a lot of (mis)information in the marketplace about the cloud. So much so that it is difficult for organizations to figure out what is real and what is not, whether they are developing a cloud strategy or simply learning about technologies purpose-built for this dramatic shift. While it's important to know what the cloud is, it's just as important to separate the wheat from the chaff, and for IT to understand what the cloud is not.

To this end, I encourage you not to add yet another definition of the cloud to your glossary, but to truly understand the top 5 things the cloud is not.

1. Cloud is not a place. People often talk about moving to the cloud as if they were moving to another city. But the cloud is not a place. In fact, the cloud can be anywhere, in your data center or someone else's. Organizations that believe they are moving to a strategy that leaves legacy apps and systems behind are in for a rude awakening. The single most important way for enterprise organizations to prepare themselves is to understand that the cloud is a radically new way of delivering, consuming and adopting IT services, one that is far more agile, efficient and cost-effective, and one that will span a mix of public, private, managed and hybrid clouds. By looking at the cloud holistically, organizations can optimize its benefits for their budgets, privacy needs, geographies and overall business needs.

2. Cloud is not server virtualization. Despite what many believe, and what many will tell you, the cloud is not the same as next-gen server virtualization. It doesn't surprise me that many believe that by virtualizing their data center they will create a private cloud. Some vendors are intentionally trying to blur that line, aiming to convince customers that their vCenter clusters somehow deliver a private cloud. On the contrary, that is a gross exaggeration of the term cloud.

If you take a look at the way Amazon has built its cloud architecture, it becomes very clear that there are some fairly stark differences between a server virtualization environment and a true cloud architecture. While Amazon starts with Xen virtualization technology, the brains of its architecture lie in a new layer of software that Amazon built to create a new control plane: a cloud orchestration layer that can manage all the infrastructure resources (compute, storage, networking) across all of its data centers. This is at the heart of the cloud's technology disruption. Some analysts refer to this as the "hypervisor of hypervisors," or a "new software category of cloud system software."

The fact of the matter is that some of the major players are doing cloud without server virtualization. Take Google for example. They have deployed a cloud architecture that is not using server virtualization, but rather a bare metal infrastructure. So while virtualization can be an important ingredient of cloud, it is not always a requirement.

3. Cloud is not an island. Depending on what you're reading, you'll hear a lot about public clouds versus private clouds, and it may feel as if enterprises must make a wholesale decision on which way to go. But the cloud is not an island, it is not a place where you put all of your IT services, and then lose all interconnectivity and access. The recent Amazon outages have proven this to be an important point for any organization leveraging the cloud. The right cloud strategy will be one that enables you to have a hybrid approach with the ability to easily connect private and public clouds. Even the recent move by NASA to include Amazon Web Services as part of its cloud rollout after a significant investment in the build-out of its own technology proves that the market is moving to open, interoperable multi-cloud environments.

4. Cloud is not top-down. The cloud has up-ended the traditional IT approach to delivering services. The lines of business have been leading the charge in making the decision to move to cloud computing. With specific needs to get to market quickly, functional business leaders are consuming cloud services to avoid traditional IT processes. But we don't need surveys to clarify this movement. The reality is that with the simple swipe of a credit card and the creation of an account, end users can... ( more at http://www.wired.com/cloudline/2012/06/top-5-things-the-cloud-is-not/ )

Monday, June 25, 2012

How Much Should a Small Business Pay for a Website?

At http://arielmarketinggroup.com/blog/?p=3568#

I just met with a potential new client who is looking to move past her company's first (free) website and really build the internet into a referral stream. The client is smart, and her eight-year-old business is successful. Thus far the referrals have come via word of mouth, networking, and limited, sporadic print advertising. The primary reason she hasn't invested in the internet is a lack of understanding of how to do it effectively. This small business owner was scared off by many horror stories, including one in which a friend, also a small business owner, paid $10,000 for a website that brings in NO leads.

Unlike many business owners, this one came to our meeting prepared with a budget she could afford, and was also willing to be 'educated' on what marketing on the internet means. Here's what I told her:

1. Yes, you can get a decent website for under $2,000 depending upon your needs and how much fancy programming you require.

2. You need to consider not only your initial investment, but also that you will need to tend your website regularly; it needs to grow and evolve over time. $2,000 may be OK for now, but you need to budget for regular updates, alterations and, most importantly, additions to your site.

3. Building the website is only the beginning; to get leads from your site, you first need to show up when someone searches for your service. That requires a long-term commitment to organic SEO (search engine optimization), and very probably a commitment to pay-per-click advertising while you work on building your ranking organically on the... ( more at http://arielmarketinggroup.com/blog/?p=3568# )

Friday, June 22, 2012

New Bath Gel Doesn't Need a Tub

At http://www.livescience.com/21060-rub-dub-bath-gel.html

Rub-a-Dub-Dub: New Bath Gel Doesn't Need a Tub

Date: 20 June 2012 Time: 10:28 AM ET

For people who don't have time to bathe or access to fresh water, a South African college student has a solution: a shower gel users simply rub onto their skin. One small packet replaces one bath, and users never need any water. Ludwick Marishane's inspiration was a lazy friend, but his invention will be a boon to people who live in areas where clean water is in short supply. 

The gel, called Drybath, kills germs, moisturizes the skin and exudes a pleasant, light smell, unlike hand sanitizers, according to Marishane's website, Headboy Industries. The gel is packaged in small, easy-to-open sachets that were a South African invention.

Marishane got the idea to sell individual packets when he learned from mentors that the world's poorest people buy things in very small quantities, such as one cigarette at a time as opposed to a pack or a carton, he said in a presentation.

Marishane sells Drybath in poor communities for 50 cents a packet, he said in an interview with the Global Student Entrepreneur Awards. (Marishane won the competition's top prize in December 2011.) For corporate customers, such as airlines or hotels, each Drybath packet costs $1.50. Marishane donates one packet to charity for every corporate packet sold.

Marishane, who is 22 and attends the University of Cape Town, got his inspiration as a teenager growing up in a rural part of South Africa. He recalled that once, when he nagged a friend to take a shower, his friend ( more at http://www.livescience.com/21060-rub-dub-bath-gel.html )

Wednesday, June 20, 2012

Why Smart People Are Stupid

At http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/daniel-kahneman-bias-studies.html

June 12, 2012

Editors' Note: The introductory paragraphs of this post appeared in similar form in an October, 2011, column by Jonah Lehrer for the Wall Street Journal. We regret the duplication of material.

Here's a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?

The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
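The algebra behind the correct answer takes one line; as an illustration (mine, not part of the original article), a short Python sketch makes the check explicit:

```python
# Let ball = x dollars; the bat costs x + 1.00, and together they cost 1.10.
# Then 2x + 1.00 = 1.10, so x = 0.05.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

assert abs(ball + bat - 1.10) < 1e-9   # the pair really costs $1.10
assert abs(bat - ball - 1.00) < 1e-9   # the bat really costs $1.00 more
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05

# The intuitive answer fails the check: a $0.10 ball implies a $1.10 bat,
# and 0.10 + 1.10 = 1.20, not 1.10.
```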

For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we're not nearly as rational as we like to believe.

When people face an uncertain situation, they don't carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren't a faster way of doing the math; they're a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.

Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, "I am not interested in the psychology of stupidity."

The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that's why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.

West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here's an example:

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Your first response is probably to take a shortcut and divide the final answer in half. That leads you to twenty-four days. But that's wrong. The correct solution is forty-seven days: if the patch covers half the lake on day forty-seven, a single doubling covers all of it on day forty-eight.
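Because the patch doubles each day, you can run the growth backwards, halving once per day from full coverage. A minimal Python sketch (my illustration, not from the article):

```python
# The patch doubles daily and covers the whole lake on day 48.
# Walk backwards: each previous day it covered half as much.
coverage, day = 1.0, 48
while coverage > 0.5:
    coverage /= 2
    day -= 1

print(day)  # 47: half coverage is exactly one doubling before full coverage
```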

West also gave a puzzle that measured subjects' vulnerability to something called "anchoring bias," which Kahneman and Tversky had demonstrated in the nineteen-seventies. Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small "anchor"—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold.

But West and colleagues weren't simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures "the tendency for an individual to engage in and enjoy thinking."

The results were quite disturbing. For one thing,... ( more at http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/daniel-kahneman-bias-studies.html )

Drawing by James Stevenson.

Note: This article has been modified to include mention of Shane Frederick.

Intuitively, we’re all Aristotelians

At http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/brain-experiments-why-we-dont-believe-science.html

June 7, 2012

Why We Don't Believe In Science

Editors' Note: Portions of this post appeared in similar form in a December, 2009, piece by Jonah Lehrer for Wired magazine. We regret the duplication of material.

Last week, Gallup announced the results of their latest survey on Americans and evolution. The numbers were a stark blow to high-school science teachers everywhere: forty-six per cent of adults said they believed that "God created humans in their present form within the last 10,000 years." Only fifteen per cent agreed with the statement that humans had evolved without the guidance of a divine power.

What's most remarkable about these numbers is their stability: the percentages have remained virtually unchanged since Gallup began asking the question thirty years ago. In 1982, forty-four per cent of Americans held strictly creationist views, a statistically insignificant difference from 2012. Furthermore, the percentage of Americans who believe in biological evolution has increased by only four percentage points over the last twenty years.

Such poll data raises questions: Why are some scientific ideas hard to believe in? What makes the human mind so resistant to certain kinds of facts, even when these facts are buttressed by vast amounts of evidence?

A new study in Cognition, led by Andrew Shtulman at Occidental College, helps explain the stubbornness of our ignorance. As Shtulman notes, people are not blank slates, eager to assimilate the latest experiments into their world view. Rather, we come equipped with all sorts of naïve intuitions about the world, many of which are untrue. For instance, people naturally believe that heat is a kind of substance, and that the sun revolves around the earth. And then there's the irony of evolution: our views about our own development don't seem to be evolving.

This means that science education is not simply a matter of learning new theories. Rather, it also requires that students unlearn their instincts, shedding false beliefs the way a snake sheds its... ( more at http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/brain-experiments-why-we-dont-believe-science.html )

While this new paper provides a compelling explanation for why Americans are so resistant to particular scientific concepts (the theory of evolution, for instance, contradicts both our naïve intuitions and our religious beliefs), it also builds upon previous research documenting the learning process inside the head. Until we understand why some people believe in science, we will never understand why most people don't.

In a 2003 study, Kevin Dunbar, a psychologist at the University of Maryland, showed undergraduates a few short videos of two different-sized balls falling. The first clip showed the two balls falling at the same rate. The second clip showed the larger ball falling at a faster rate. The footage was a reconstruction of the famous (and probably apocryphal) experiment performed by Galileo, in which he dropped cannonballs of different sizes from the Tower of Pisa. Galileo's metal balls all landed at the exact same time—a refutation of Aristotle, who claimed that heavier objects fell faster.

While the students were watching the footage, Dunbar asked them to select the more accurate representation of gravity. Not surprisingly, undergraduates without a physics background disagreed with Galileo. They found the two balls falling at the same rate to be deeply unrealistic. (Intuitively, we're all Aristotelians.) Furthermore, when Dunbar monitored the subjects in an fMRI machine, he found that showing non-physics majors the correct video triggered a particular pattern of brain activation: there was a squirt of blood to the anterior cingulate cortex, a collar of tissue located in the center of the brain. The A.C.C. is typically associated with the perception of errors and contradictions—neuroscientists often refer to it as part of the "Oh shit!" circuit—so it makes sense that it would be turned on when we watch a video of something that seems wrong, even if it's right.

This data isn't shocking; we already know that most undergrads lack a basic understanding of science. But Dunbar also conducted the experiment with physics majors. As expected, their education enabled them to identify the error; they knew Galileo's version was correct.

But it turned out that something interesting was happening inside their brains that allowed them to hold this belief. When they saw the scientifically correct video, blood flow increased to a part of the brain called the dorsolateral prefrontal cortex,... ( more at http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/brain-experiments-why-we-dont-believe-science.html )

Of course, that extra mental labor isn't always pleasant. (There's a reason they call it "cognitive dissonance.") It took a few hundred years for the Copernican revolution to go mainstream. At the present rate, the Darwinian revolution, at least in America, will take just as long.

Illustration courtesy of Hulton Archive/Getty Images.

The illusion of choice

At http://www.businessinsider.com/these-6-corporations-control-90-of-the-media-in-america-2012-6

These 6 Corporations Control 90% Of The Media In America


Ashley Lutz | Jun. 14, 2012, 9:49 AM

This infographic created by Jason at Frugal Dad shows that almost all media comes from the same six sources.

That's consolidated from 50 companies back in 1983. 

NOTE: This infographic is from last year and is missing some key transactions. GE does not own NBC (or Comcast or any media) anymore. So that 6th company is now Comcast. And Time Warner doesn't own AOL, so Huffington Post isn't affiliated with them.

But the fact that a few companies own everything demonstrates "the illusion of choice," Frugal Dad says. While some big sites, like Digg and Reddit, aren't owned by any of the corporations, Time Warner owns news sites read by millions of Americans every year.

Here's the graphic: ( more at http://www.businessinsider.com/these-6-corporations-control-90-of-the-media-in-america-2012-6 )

Monday, June 18, 2012

On average, people over the age of 60 were 14 per cent more likely to die on their birthdays

At http://www.telegraph.co.uk/health/healthnews/9323562/We-are-more-likely-to-die-on-our-birthday-than-any-other-day.html

We are more likely to die on our birthday than any other day

Be careful blowing out the candles. Scientists have found we are more likely to die on our birthday than any other day.


7:57AM BST 11 Jun 2012

Researchers who studied more than two million people over 40 years found a rise in deaths from heart attacks, strokes, falls and suicides.

William Shakespeare died on his birthday on April 23 1616. The actress Ingrid Bergman also died on her birthday, in August 1982.

On average, people over the age of 60 were 14 per cent more likely to die on their birthdays.

Heart attacks rose 18.6 per cent on birthdays, for both men and women, while strokes were up 21.5 per cent, mostly in women.

Dr Vladeta Ajdacic-Gross of the University of Zurich, said: 'Birthdays end lethally more frequently than might be expected.' He added that risk of birthday death rose as people got older.

Canadian data also showed that strokes were more likely on birthdays, especially among patients with high blood pressure.

There was a 34.9 per cent rise in suicides, 28.5 per cent rise in accidental deaths not related to cars, and a 44 per cent rise in deaths from falls on birthdays.

Psychologist prof Richard Wiseman, from the University of Hertfordshire, said... ( more at http://www.telegraph.co.uk/health/healthnews/9323562/We-are-more-likely-to-die-on-our-birthday-than-any-other-day.html )