Author: Discover Main Feed

  • The Most Sensitive Radio Telescope in the World: Arecibo Observatory

    Arecibo Observatory

    Puerto Rico is home to the largest, most sensitive radio telescope in the world. For more than 40 years, the Arecibo Observatory has measured the motions of galaxies, mapped the surface of Venus, studied the physics of pulsars, and listened for signals from extraterrestrial life.

    The 1,000-foot-wide dish, which rests in a natural sinkhole, consists of 40,000 aluminum panels that form a radio-reflective surface. The panels gather radio waves from the sky and focus them onto a feed antenna that amplifies the signals and sends them to a control room where data are analyzed…

  • The Best Way to Predict Box Office Hits: Twitter Chatter | Discoblog

    Wondering which Hollywood movie will be this weekend’s smash hit? Head straight to Twitter, as a new study (pdf) suggests the microblogging service offers the most accurate predictions of a movie’s success.

    In a new paper about Twitter’s success at gauging a film’s fortunes, Sitaram Asur and Bernardo Huberman from HP devised a simple model that tracks people’s tweets about a certain movie (for their study, they collected almost 3 million tweets). The researchers found that compared to the industry’s gold standard for movie success prediction, the Hollywood Stock Exchange, tweets were far more accurate in predicting how much money a movie would make.

    The researchers’ system tracks the rate and frequency of movie mentions, and also categorizes the tweet reviews as either positive or negative. The Twitter findings reflect marketing realities, the researchers note: While movie studios can push people to the theaters with hype and pre-release marketing, it’s usually positive reviews and word of mouth that sustain people’s interest after a movie has been released.
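
    The paper’s actual regression isn’t reproduced here, but the two signals described above are easy to sketch. This is a minimal, hypothetical illustration (the `tweets` data and the two-hour window are invented for the example; the real study classified millions of tweets with a trained sentiment model):

```python
from collections import Counter

# Hypothetical labeled tweets: (movie, sentiment), where the sentiment
# label would come from a classifier in the real study.
tweets = [
    ("The Blind Side", "positive"), ("The Blind Side", "positive"),
    ("The Blind Side", "negative"), ("The Blind Side", "neutral"),
]

def tweet_rate(tweets, movie, hours):
    """Mentions per hour over the observation window (the 'rate' signal)."""
    mentions = sum(1 for m, _ in tweets if m == movie)
    return mentions / hours

def pn_ratio(tweets, movie):
    """Ratio of positive to negative tweets (the sentiment signal)."""
    counts = Counter(s for m, s in tweets if m == movie)
    return counts["positive"] / max(counts["negative"], 1)

print(tweet_rate(tweets, "The Blind Side", hours=2))  # 2.0 mentions/hour
print(pn_ratio(tweets, "The Blind Side"))             # 2.0
```

    In the study it was the mention rate before release that predicted opening-weekend revenue, with the positive-to-negative ratio improving predictions in later weeks.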

    Mashable writes:

    One movie analyzed in this study, The Blind Side, had an “enormous increase in positive sentiment after release,” reads the paper. The film’s score jumped from 5.02 to 9.65 on HP’s scale. After a “lukewarm” first weekend, with sales around $34 million, the movie “boomed in the next week ($40.1 million), owing largely to positive sentiment.”

    The study suggests that the collective chatter of the crowds on Twitter can be used not just for predicting future movie hits, but could also be a valuable tool for marketers preparing to introduce new products. Perhaps even political candidates could use Twitter to check for interest in their policy positions. Hey, is #FinancialRegulations trending yet?

    Follow us on Twitter.

    Related Content:
    Discoblog: How To Make Your Twitter Followers Uneasy: Use ShadyURLs
    Discoblog: Astronauts in Space Finally Enter the Intertubes
    Discoblog: New Device Aims to Read Your Dog’s Mind—and Broadcast It on Twitter

    Image: The Blind Side


  • Cox on Ross | Bad Astronomy

    My friend and fellow science promoter Brian Cox will be on Friday Night with Jonathan Ross tonight on BBC America (it already aired last week in the UK). Brian is funny and smart, as is Wossy (as Ross is called), so this should be a great time. And speaking of Matt Smith and Doctor Who, Mr. Smith will be on Wossy’s show as well.

    Speaking of speaking of Doctor Who, oh how I wish this were true.

    Tip o’ the sonic screwdriver to Fizzygoo for the Obama link.


  • Why Didn’t the Young Earth Freeze Into an Ice Ball? | 80beats

    The “young sun paradox” just won’t go away. For decades, scientists going back to Carl Sagan have tried to resolve this mystery of the early solar system—how the newborn Earth stayed warm enough to keep liquid water—but it continues to bob and weave around an answer. In the journal Nature, a team led by Minik Rosing proposes an alternative to the leading explanation, the greenhouse effect hypothesis. But don’t expect the debate to end here.

    The problem is this: The young Earth received much less heat from the sun. Four billion years ago, a lower solar luminosity should have left Earth’s oceans frozen over, but there is ample evidence in the Earth’s geological record that there was liquid water — and life — on the planet at the time [Space.com]. So what gives? The traditional explanation going back to the 1970s has been that a powerful greenhouse effect, far stronger than the one we experience today, kept the Earth bathed in enough warmth to keep water sloshing around the planet’s surface as a liquid and not packed in solid ice. In 1972, Sagan and colleague George Mullen wrote that such an effect would have required intense carbon dioxide concentrations in the atmosphere during that period, the Archean.

    But the evidence isn’t there, Rosing argues. To test the greenhouse hypothesis, he and his team studied banded iron formations in Greenland rocks dating back 3.8 billion years. They focused on two minerals, magnetite and siderite, that can provide a bellwether of the CO2 concentrations in the atmosphere. Too much CO2, and magnetite can’t form, whereas the opposite is true for siderite [ScienceNOW]. According to Rosing’s analysis, the CO2 concentration in the atmosphere could have been as high as three times what we see today, but that’s not nearly enough to account for the warmth that would’ve been needed to stave off a snowball Earth.

    So, Rosing puts forth his own solution. Back then, he says, the continents were smaller, so more of the Earth’s surface was covered by water. Since water tends to absorb more heat than land, an ultra-watery Earth could have helped conserve warmth. Secondly, he says, the early Earth wasn’t a cloudy place, because life had only just arisen. The droplets of water that make up clouds form by glomming on to tiny particles, called cloud condensation nuclei, many of which are chemical substances produced by algae and plants, which weren’t present on the Earth at that time [Space.com]. You see the same effect today, he says, in areas of the open ocean that have neither much marine life nor much cloud cover. And if the young Earth truly had few clouds, more sunlight reached the surface.

    Atmospheric scientist James Kasting says that if Rosing’s team is right, the idea could have consequences beyond our own planet’s history—it could widen the habitable zone for exoplanet hunters seeking extraterrestrial life. But regarding the young sun paradox, he says, this new answer is incomplete. “I think their mechanism fails because they just barely get up to freezing point,” Kasting said, adding that it also fails to take into account the reflectivity of the planet’s ice [Discovery News].

    Related Content:
    DISCOVER: The Fast Young Earth
    DISCOVER: Did Life Evolve in Ice?
    DISCOVER: Snowball Earth
    DISCOVER: Our Solar System’s Explosive Early Years

    Image: Wikimedia Commons / Neethis


  • The Rare Humans Who See Time & Have Amazing Memories | Discoblog

    The “normal” form of the condition called synesthesia is weird enough: For people with this condition, sensory information gets mixed in the brain, causing them to see sounds, taste colors, or perceive numbers as having particular hues.

    But psychologist David Brang is studying a bunch of people with an even odder form of synesthesia: These people can literally “see time.”

    Brang’s subjects have time-space synesthesia; because they have extra neural connections between certain regions of the brain, the patients experience time as a spatial construct.

    In his research, Brang describes one patient who was able to see the year as a circular ring surrounding her body that rotated clockwise throughout the year. The current month was reportedly inside her chest with the previous month in front of her chest, reports New Scientist.

    Describing the condition to New Scientist, Brang said:

    “In general, these individuals perceive months of the year in circular shapes, usually just as an image inside their mind’s eye.”

    So, while boring, normal folks whip out their physical or online calendars to jot down plans for the future, time-space synesthetes just look straight ahead and can tell you precisely what days work for them. It’s almost like their personal version of augmented reality.

    Brang found these uncommon individuals by recruiting 183 students for a trial, in which he asked them to visualize the months of the year and reconstruct that representation on a computer screen.

    New Scientist writes:

    Four months later the students were shown a blank screen and asked to select a position for each of the months. They were prompted with a cue month – a randomly selected month placed as a dot in the location where the student had originally placed it. Uncannily, four of the 183 students were found to be time-space synesthetes when they placed their months in a distinct spatial array – such as a circle – that was consistent over the trials.

    A second test was conducted to compare the memories of time-space synesthetes to those of regular folks. Brang asked his subjects to memorize an unfamiliar spatial calendar and reproduce it. The results showed that when it came to recalling events in time, time-space synesthetes far outperformed the others.

    Earlier research has shown that people with synesthesia, on average, remember about 123 different facts related to a certain period in their life, while a regular person could recall just 39 facts.

    Psychologist Julia Simner last year told the BBC:

    “So the average person might remember that they went on holiday to America when they were seven…. This person would recall the name of the guesthouse, the name of the guesthouse owner and the breed of the owner’s dog.”

    Related Content:
    Discoblog: “Seeing” Sounds and “Hearing” Food: The Science of Synesthesia
    80beats: Revealed: The Genetic Root of Seeing Sounds and Tasting Colors
    Cosmic Variance: Your Mental Image of Time
    Cosmic Variance: Martian Colors explains how to test for synesthesia
    DISCOVER: Are We All Synesthetes?

    Image: Julia Simner / BBC


  • Yet-Another-Genome Syndrome | The Loom

    There’s a certain kind of headline I have become sick of: “Scientists Have Sequenced the Genome of Species X!”

    Fifteen years ago, things were different. In 1995, scientists published the first complete genome of a free-living organism ever–that of a nasty germ called Haemophilus influenzae. Bear in mind, this was in the dark ages of the twentieth century, when a scientist might spend a decade trying to decipher the sequence of a single gene.

    And then, with a giant thwomp, a team of scientists dropped not just one gene, but 1740 genes, spelled out across 1.8 million base pairs of DNA. At the core of the paper was an image of the entire genome, a kaleidoscopic wheel marking all the genes that it takes to be H. influenzae. It had a hypnotizing effect, like a genomic mandala. Looking at it, you knew that biology would never be the same.

    Looking over that paper today, I’m struck by what a catalog it is. The authors listed every gene and sorted them by their likely function. They didn’t find a lot of big surprises about H. influenzae itself. It had genes for metabolism, they reported, and genes for attaching to host cells, and for sensing the environment. But scientists already knew that. What was remarkable was the simple fact that scientists could now sequence so much DNA in so little time.

    Then came more microbes. Then, ten years ago this coming June, came a rough draft of the human genome. Then finished drafts of other animals: chickens, mice, flies, worms. Flowers, truffles, and malaria-causing parasites. They came faster and faster, cheaper and cheaper. The acceleration now means that the simple accomplishment of sequencing a genome is no longer news.

    On Wednesday I caught a talk at Yale by Jonathan Rothberg, a scientist who invented “next-generation sequencing” in 2005. By sequencing vast numbers of DNA fragments at once, the new technology made it possible to get a genome’s sequence far faster than earlier methods. Rothberg’s company, 454, got bought up by Roche, leaving him without a lot to do. So he came up with a new machine: a genome sequencer about the size of a desktop printer that can knock out complete genomes with high accuracy in a matter of hours. Rothberg has been unveiling details of the machine over the past few weeks. To convince his audience that the machine actually works, he flashed another mandala wheel–the 4.7 million base pairs of E. coli.

    I recalled when the first E. coli genome was published in 1997. It was the result of years of work by 17 co-authors, an event celebrated in newspapers. Now Rothberg threw up a quick slide of the germ’s genome just to show what he could do in a matter of hours. And I have to say that the sight of yet another circular map of a genome, on its own, no longer gave me a thrill. It’s a bit like someone waving you over to a telescope and saying, “Look! I found a star!”

    What remains truly exciting is the kind of research that starts after the genomes are sequenced: discovering what genes do, mapping out the networks in which genes cooperate, and reconstructing the deep history of life. Thanks to the hundreds of genomes of microbes scientists can now compare, for example, they can see how the history of life is, in some ways, more like a web than a tree. Insights like these are newsworthy. The sequencing of those genomes, on its own, is not.

    And yet my email inbox still gets overwhelmed with press releases about the next new genome sequence. The press releases typically read like this:

    “Scientists have sequenced the genome of species X. Their research, published today in the Journal of Terribly Important Studies, will lead to new insights about this important species. Maybe it will even cure cancer or eliminate world hunger!”

    And then those press releases give rise to news articles. Here are dozens of pieces that came out over the past couple of days, describing the freshly-sequenced genome of the zebra finch. What did the scientists actually learn about zebra finches through this exercise? The articles typically referred to 800 genes involved in the birds’ learning how to sing. Of course, nobody would seriously expect them to use just one or two genes for something so complex, so this was no big surprise. The articles also mentioned that a lot of the genes were similar to human language genes. This is not really news, either. For the most part, the articles look to the future–to experiments that scientists will be able to do on zebra finches, in the hopes of learning about human speech. But that’s not news of an achievement–that’s a promise.

    Perhaps press release writers and journalists are still operating on the assumption that any new genome is news. But that assumption has been wrong for years now. Let’s wait to see what scientists actually discover in those marvelous mandalas.

    [Update: Thanks to Jonathan Eisen for coining the easy-to-remember acronym for my current disorder: YAGS. That goes into my personal lexicon right away.]


  • Ed Roberts, Personal Computer Pioneer & Mentor to Bill Gates, Dies at 68 | 80beats

    Henry Edward Roberts didn’t set out to kick-start the computer revolution. He was just trying to get out of debt.

    Roberts, who died yesterday at 68, was an Air Force man in his younger days and a medical doctor in his later ones, but it was the middle part of his life that changed the world. In the mid-1970s, Roberts started a company called Micro Instrumentation and Telemetry Systems (MITS), and in 1975 introduced the Altair 8800—one of the first computers available and affordable for home hobbyists.

    When Popular Electronics magazine featured him and the computer on its cover, it caught the attention of two young computer-philes, Bill Gates and Paul Allen. Gates and Allen quickly reached out to Roberts, looking to create software for the Altair. Landing a meeting, the pair headed to Albuquerque, N.M., where Roberts’ company was located. The two went on to set up Microsoft, which had its first offices in Albuquerque [CNET].

    When Roberts founded MITS, he was using his technological know-how to make calculators. But when companies like Texas Instruments began to dominate that market, Roberts got squeezed out. In the mid-1970s, with the firm struggling with debt, Dr Roberts began to develop a computer kit for hobbyists. The result was the Altair 8800, a machine operated by switches and with no display [BBC News]. Prior to the Altair, most computers were still giant machines in university labs, but Roberts said he believed there were enough tech nerds like him in the world that a personal computer—even one as rudimentary as the 8800—would be a success.

    Gates reportedly visited Roberts at the hospital days before he died, and Gates and Allen paid tribute to their mentor in a statement. “Ed was willing to take a chance on us–two young guys interested in computers long before they were commonplace–and we have always been grateful to him,” Gates and Allen said. “The day our first untested software worked on his Altair was the start of a lot of great things” [CNET].

    Roberts sold off his company in 1977 and retired to farming before becoming an internist. However, his son David Roberts says, he remained interested to the last in his accidental revolution. He never lost his interest in modern technology, even asking about Apple’s highly anticipated iPad from his sick bed. “He was interested to see one,” said Roberts, who called his father “a true renaissance man” [AFP].

    Related Content:
    80beats: Happy 40th Birthday, Internet! (Um, Again.)
    80beats: Happy 40th Birthday, Internet!
    80beats: 40 Years Ago Today, the World Saw Its First Personal Computer
    DISCOVER: The “Father of the Internet” Would Rather You Call Him “Vint”
    DISCOVER: The Emoticon Turns 25

    Image: DigiBarn Computer Museum


  • Pigeons outperform humans at the Monty Hall Dilemma | Not Exactly Rocket Science

    Imagine that you’re in a game show and your host shows you three doors. Behind one of them is a shiny car and behind the others are far less lustrous goats. You pick one of the doors and get whatever lies within. After making your choice, your host opens one of the other two doors, which inevitably reveals a goat. He then asks you if you want to stick with your original pick, or swap to the other remaining door. What do you do?

    Most people think that it doesn’t make a difference and they tend to stick with their first pick. With two doors left, you should have even odds of selecting the one with the car. If you agree with this reasoning, then you have just fallen foul of one of the most infamous of mathematical problems – the Monty Hall Dilemma. In reality, you should actually swap every time – doing so means double the odds of getting the car. I will explain why shortly but if you’re currently confused, you are not alone. Over the years, the problem has ensnared countless people, including professional mathematicians. But not, it seems, pigeons.

    Walter Herbranson and Julia Schroeder showed that, after some training, the humble pigeon can learn the best tactic for the Monty Hall Problem, switching from their initial choice almost every time. Amazingly, humans who get similar extensive practice never develop the optimal strategies that the pigeons pick up. This doesn’t mean that pigeons are “smarter than humans” as some news stories have claimed, but it does mean that the two species approach probability problems in different ways. We suffer because we overthink the problem.

    The Monty Hall Dilemma takes its name from Monty Hall, the presenter of a show called Let’s Make a Deal, which involved similar choices. The dilemma became truly legendary when it featured in a column called Ask Marilyn in Parade magazine. When columnist Marilyn vos Savant, the then holder of the Guinness World Record for highest IQ, wrote the correct solution, she was inundated with complaints.

    Around 10,000 readers disagreed with her, and wrote in to say as much (these were the days when trolls had to actually pay for postage; read the last letter in particular). Many of them had PhDs and many were mathematicians. Even Paul Erdos, the most prolific mathematician in history, refused to believe this explanation until computer simulations proved beyond all doubt that always switching was the best strategy.

    The problem is that most people assume that with two doors left, the odds of a car lying behind each one are 50/50. But that’s not the case – the actions of the host beforehand have shifted the odds, and engineered it so that the chosen door is half as likely to hide the car.

    At the very start, the contestant has a one in three chance of picking the right door. If that’s the case, they should stick. They also have a two in three chance of picking a goat door. In these situations, the host, not wanting to reveal the car, will always pick the other goat door. The final door hides the car, so the contestant should swap. This means that there are two trials when the contestant should swap for every one trial when they should stick. The best strategy is to always swap – that way they have a two in three chance of driving off, happy and goatless.
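
    The two-in-three argument is also easy to verify by brute force. Here’s a quick simulation sketch (not from the paper; just a standard Monty Hall Monte Carlo):

```python
import random

def monty_hall_trial(switch):
    """Play one round; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a goat door that isn't the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Swap to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch, trials=100_000):
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

print(win_rate(switch=True))   # ~0.667
print(win_rate(switch=False))  # ~0.333
```

    Running this shows switching winning roughly twice as often as sticking, which is exactly the result that convinced Erdős’s doubters.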

    All over the world, people are spectacularly bad at this. We almost always stick or, at best, show indifference. Herbranson and Schroeder wanted to see if other species would be similarly vexed. They worked with six Silver King pigeons and altered the game show format to suit their beaks.

    Each pigeon was faced with three lit keys, one of which could be pecked for food. At the first peck, all three keys switched off and after a second, two came back on including the bird’s first choice. The computer, playing the part of Monty Hall, had selected one of the unpecked keys to deactivate. If the pigeon pecked the right key of the remaining two, it earned some grain. On the first day of testing, the pigeons switched on just a third of the trials. But after a month, all six birds switched almost every time, earning virtually the maximum grainy reward.

    Every tasty reward would reinforce the pigeon’s behaviour, so if it got a meal twice as often when it switched, you’d expect it to soon learn to switch. Herbranson and Schroeder demonstrated this with a cunning variant of the Monty Hall Dilemma, where the best strategy would be to stick every time. With these altered probabilities, the pigeons eventually learned the topsy-turvy tactic.

    It may seem obvious that one should choose the strategy that would yield the most frequent rewards, and even the dimmest pigeon should pick up the right tactic after a month of training. But try telling that to students. Herbranson and Schroeder presented 13 students with a similar set-up to the pigeons. There were limited instructions and no framing storyline – just three lit keys and a goal to earn as many points as possible. They had to work out what was going on through trial and error, and they had 200 goes at guessing the right key over the course of a month.

    At first, they were equally likely to switch or stay. By the final trial, they were still only switching on two thirds of the trials. They had edged towards the right strategy but they were a long way from the ideal approach of the pigeons. And by the end of the study, they were showing no signs of further improvement.

    Why is the Monty Hall Dilemma so perplexing to humans, when mere pigeons seem to cope with it? Herbranson and Schroeder think this is a case of our own vaunted intelligence working against us. When faced with a problem like this, we try to think it through, working out the best solution before we do anything. This would be fine, except we’re really quite bad at problems involving conditional probability (such as “if this happens, what are the odds of that happening?”). Despite our best attempts at reasoning, most of us arrive at the wrong answer.

    Pigeons, on the other hand, rely on experience to work out probabilities. They have a go, and they choose the strategy that seems to be paying off best. They also seem immune to a quirk of ours called “probability matching”. If the odds of winning by switching are two in three, we’ll switch on two out of three occasions, even though that’s a worse strategy than always switching. This is, of course, exactly what the students in Herbranson and Schroeder’s experiments did. The pigeons, by contrast, always switched – no probability matching for them.
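
    The cost of probability matching is easy to quantify: switching on only two thirds of trials mixes a good strategy with a bad one. A back-of-the-envelope calculation:

```python
# Expected win rates in the standard three-door game, where
# switching wins with probability 2/3 and sticking with 1/3.
P_SWITCH_WINS = 2 / 3
P_STICK_WINS = 1 / 3

always_switch = P_SWITCH_WINS                             # 2/3 ~ 0.667
always_stick = P_STICK_WINS                               # 1/3 ~ 0.333
# Probability matching: switch on 2/3 of trials, stick on the rest.
matching = (2/3) * P_SWITCH_WINS + (1/3) * P_STICK_WINS   # 5/9 ~ 0.556

print(round(always_switch, 3), round(matching, 3), round(always_stick, 3))
```

    So a probability matcher wins about 56% of the time, well short of the 67% the always-switching pigeons collect.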

    In short, pigeons succeed because they don’t over-think the problem. It’s telling that among humans, it’s the youngest students who do best at this puzzle. Eighth graders are actually more likely to work out the benefits of switching than older and supposedly wiser university students. Education, it seems, actually worsens our performance at the Monty Hall Dilemma.

    Reference: Journal of Comparative Psychology http://dx.doi.org/10.1037/a0017703


  • Tourist gets dramatic volcano plume snapshot | Bad Astronomy

    A little while back I posted a dramatic satellite image of the Soufrière Hills volcano on Montserrat erupting (thumbnail on the right; click to get the embiggened shot).

    Well, as from above, so it is from the side as well. Canadian tourist Mary Jo Penkala was on a plane near Montserrat at the time, and after the pilot made an announcement for passengers to look out the port window, she snapped this:

    [Image: Mary Jo Penkala’s photo of the Montserrat volcano plume]

    Wow! I’ve seen a lot of amazing things out my airplane window, but never anything close to this. Ms. Penkala is enjoying a bit of well-deserved notoriety for the picture. It’s amazing in and of itself, but I love how we now can get so many views of volcanic plumes: from below, the side, and even from space. These natural disasters cause a huge amount of damage and grief, of course, but the good news is that with enough study we can learn more about them, how to predict them, and when is the best time to get people out of any potentially dangerous regions.


  • Something Beautiful for a Friday | Cosmic Variance

    The Seed Cathedral — tens of thousands of undulating fiber optic rods

    Seed Cathedral

    …with different varieties of seeds embedded in the tips.

    Seed Cathedral close-up

    (h/t SLOG)


  • “Counterintuitive” social science finding of the day | Gene Expression

    Quotes because you might not find it counterintuitive. From Self-Esteem Development From Young Adulthood to Old Age: A Cohort-Sequential Longitudinal Study:

    The authors examined the development of self-esteem from young adulthood to old age. Data came from the Americans’ Changing Lives study, which includes 4 assessments across a 16-year period of a nationally representative sample of 3,617 individuals aged 25 years to 104 years. Latent growth curve analyses indicated that self-esteem follows a quadratic trajectory across the adult life span, increasing during young and middle adulthood, reaching a peak at about age 60 years, and then declining in old age. No cohort differences in the self-esteem trajectory were found. Women had lower self-esteem than did men in young adulthood, but their trajectories converged in old age. Whites and Blacks had similar trajectories in young and middle adulthood, but the self-esteem of Blacks declined more sharply in old age than did the self-esteem of Whites. More educated individuals had higher self-esteem than did less educated individuals, but their trajectories were similar. Moreover, the results suggested that changes in socioeconomic status and physical health account for the decline in self-esteem that occurs in old age.

    As a person well under 60 but slowly walking in that direction, I’m pretty heartened by this. On the other hand, I’m one of those people who also tend to think that “self-esteem” is a bit overrated, so I’m not that heartened.

    Via Randall Parker

  • Who You Callin’ “Bird Brain”?

    The amazing smarts of crows, jays, and other corvids are forcing scientists to rethink when and why intelligence evolved.

  • Vital Signs: There’s Hyperactivity…and There’s Hyperactivity

    Most 3-year-olds are hyperactive sometimes. But some cases are a sign of something more serious going on.

  • Photo Gallery: “The People’s Camera” Snaps Pictures of Mars on Request | 80beats


    Lobate Debris

    The Mars Reconnaissance Orbiter has been circling the red planet doing NASA’s work since 2006. Now, it’s finally following your direction. Using the HiWish page, Mars enthusiasts have been requesting sites for the HiRISE (High Resolution Imaging Science Experiment) on board the orbiter to photograph. This week, NASA released the first batch of images from what it’s calling “the people’s camera.”

    This image of an area called Deuteronilus Mensae shows high mesas surrounded by buildups called lobate debris aprons. These are particularly interesting, as they seem to contain nearly pure ice.

    All images: NASA/JPL/University of Arizona




  • New Web Site Is Like LinkedIn, But With More Anonymous Slander | 80beats


    With so much personal information floating around the Internet, managing an online reputation can be a challenge, especially for people looking for a job or hoping for a promotion. Professional networking sites like LinkedIn have helped people manage their reputations by allowing them to post tightly controlled professional profiles–on LinkedIn, users can request recommendations from colleagues, which they can first approve before posting them on their profiles. But while those profiles are useful, some people see them as little more than organized puffery.

    Soon, however, more daring professionals can also use the services of Unvarnished–a controversial new Web site where users can leave anonymous reviews of a person’s work. Billed as Yelp for people, the site is built on user-generated reviews, and it aims to present an “unvarnished” picture of a worker’s strengths or weaknesses. So far, the reviews of the beta version of the site have been scathing. Apart from being named “2010’s worst startup” [Econsultancy], the site has also been described as a “clean, well-lighted place for defamation” [Vator News].

    The site, created by Peter Kazanjy, is currently available by invitation only and was released in its beta form a few days ago. You can either join a waiting list or wait for someone to send you an invite via Facebook, asking for a review. Once you connect, you have to submit a review for your account to be activated; that allows you to “claim” your profile–because if someone has already submitted a review of you, your profile already exists. Once your profile exists you can request reviews of your work. And of course, you can submit as many anonymous reviews as you like.

    If someone posts a nasty review of your work, however, the site does not allow you the option of removing the post or deleting your profile, leading some to worry that the anonymous reviews open the forum up to personal vendettas and amplify everything that is awful about the web right now: anonymous, drive-by, ad hominem attacks that can’t be erased or edited and that live in search forever [CNET].

    There is also no way of judging the value of the reviews left on a user’s profile, which some people argue diminishes the value of the whole site. After all, an endorsement from a top executive at a well-known company is going to be far more compelling than a negative review by a former entry-level co-worker who never worked with you directly. In the absence of any ability to truly assess a reviewer’s credibility (either through identity or review history), Unvarnished anonymous reviews have little to no inherent value [Econsultancy].

    Other critics worry that Unvarnished may do more harm than good to professional reputations, as people tend to magnify the negative; an employee with 50 extremely positive reviews and 5 very negative reviews would be at a disadvantage against someone with no Unvarnished profile at all [TechCrunch].

    Creator Peter Kazanjy stands by his decision to allow anonymous posts that cannot be deleted by the user, pointing out that other review sites, such as TheFunded.com, where people can rate venture capitalists, and TripAdvisor, where people rate hotels and vacation spots, would be useless if the business owners could delete negative reviews. “The idea is to create a place, not where people only give F-minuses, but a place where people can feel comfortable to give B-pluses or A-minuses,” Kazanjy said. “Reviews then actually mean something” [Los Angeles Times].

    There are some brave souls who are willing to give the new site a shot, describing it as a LinkedIn with teeth, minus the sappy reviews people post to each other’s profiles on that site. LinkedIn with teeth makes it seem more mundane, and that is the truth of the matter. Browse around a little and you’ll calm down pretty quickly. Come back later when you’re considering working with someone and you may find it useful [ReadWriteWeb].

    Related Content:
    80beats: Some M.D.s Try to Amputate Online Reviews
    80beats: Hey Perp: That Facebook Friend Request May Come From the FBI
    Discoblog: Class-Action Lawsuit Accuses Yelp of Extortion
    Discoblog: Worst Science Article of The Week: Facebook Causes Syphilis

    Image: Unvarnished


  • Testeidolia | Bad Astronomy

    [Note: This post honors the day that is April 1.]

    I have posted many a picture purporting paranormal parts that are actually just our minds playing tricks on us. But this one really puts us to the test. Or testis.

    Yes. It’s a haunted scrotum.

    It looks more like a monkey to me than a ghostly face, and there’s a vas deferens between them. Maybe you see something different. Leave a comment if you do, and please keep it clean… but have a ball.

    Tip o’ the urethra to Dr. Joe Albietz.


  • The Pixel Vision of Kirk Crippens | Visual Science



    I met Kirk Crippens at the Photo Alliance portfolio review in San Francisco a few weeks back, and was psyched to view his project Pixel Nation. I liked that the images in this series isolate everyday textures we take for granted, revealing the unexpected.

    Crippens reports that while photographing pixels he discovered an ornate world of color and structure that he photographed at varying magnifications. Crippens: “Many of the photographs were taken with long exposures while the images on the screen were changing. This process meant that I did not know what the photo would look like until after the image was captured. With analog television ending in 2009, I decided to include the patterns of older screens, the pre-pixel screens. Changes have occurred in pixel design over the years – from the simple dashes and dots of early color TVs and computer monitors to plasma’s gaseous pixels and HD television’s intricately designed pixels. The ubiquitous pixel has been transforming just under our gaze.” Behold, the lowly pixel!

    Sony Trinitron

    Images courtesy Kirk Crippens



  • NCBI ROFL: Study proves cheating good for marriage. | Discoblog

    “It is commonly assumed that infidelity, if not concealed from the other partner, is harmful to monogamous relationships (MRs). However, this assumption has never been directly tested. In order to determine if “cheating” does, in fact, harm MRs, we recruited 97 married heterosexual couples currently receiving marital counseling therapy, and randomly assigned each couple to an experimental or control group. The 31 couples in the control group continued their traditional marital counseling therapy. The remaining 66 couples were urged by their therapist to seek sexual attention outside the relationship…


  • Announcing My Next Point of Inquiry Guest: Eli Kintisch | The Intersection

    Over at the Point of Inquiry forums, I’ve just started a thread inviting listeners to pose questions for Eli Kintisch, author of Hack the Planet: Science’s Best Hope–Or Worst Nightmare–for Averting Climate Catastrophe. Kintisch is a reporter for Science magazine, and has also written for Slate, Discover, MIT Technology Review, and The New Republic. He has worked as Washington correspondent for the Forward and as a science reporter for the St. Louis Post-Dispatch. In 2005 he won the Space Journalism prize for a series on private spaceflight. His new book, Hack the Planet, will be available April 19, and was just excerpted by Wired online. So head on over to the forums to ask your questions, or post them in the comments below….