Author: Serkadis

  • YouTube Rolls Out IPv6 Support Sending Traffic Through the Roof

    IPv4 addresses are running out and, while it’s not exactly an emergency yet, most companies and providers have been slow in deploying support for the ‘next-generation’ version of the Internet protocol, IPv6. Things got a major boost recently, though, as YouTube seems to have quietly introduced IPv6 support at the production level…. (read more)

  • Ford Resumes Chinese Transit Production

Earlier today, Japanese carmaker Toyota announced it will resume production at the five plants it idled this week on account of the accelerator pedal recall. Now American manufacturer Ford and its Chinese partner, Jiangling Motors, have announced that production of Transit Classic vans in China will also resume.

Ford announced the halt in production for the Transit Classic last week as a precautionary measure. The Fords built in China use accelerator pedals manufactured by CTS Corp., the sa… (read more)

  • Abarth 2010 Punto Evo and 500C: what happened to Alfa?

    Abarth Punto Evo rendering

    The Italians are impatient to see what the next Abarth Punto Evo will look like, perhaps something like the rendering above or the Abarth Grande Punto after the jump. It is only a question of when the new Abarth Punto will be presented, as Sergio Marchionne has already confirmed the Abarth models for 2010: the Punto Evo, the 500C and perhaps even a 500 TC.

    The Abarth Punto Evo is sure to have the new Multiair engine with 170 hp, meaning in the future the EsseEsse model will have 200 hp. The 500 TC stands for “Turismo Competizione” and was originally the name given to race cars based on street-legal models.

    The big question is why Abarth should focus on just two models and get extreme handling out of the A and B segments represented by the Fiat 500 and Fiat Punto Evo. Frustrated with these tuned versions and lack of style, some Italians are asking where Alfa Romeo is in all this. Alfa represents a true, stylish sports brand with plenty of pedigree, and the Italians want to know why Abarth should be the only brand representing Fiat in the world of racing.

    Perhaps Sergio Marchionne knows something we don’t about Alfa Romeo and its future, or perhaps we should wait a little longer and see what comes out of the Alfa-Abarth-Maserati mix. Whatever the case, there’s more than one Fiat/Alfa Romeo fan peeved that the 500 and Punto get the tuned treatment when money could be invested seriously in Alfa Romeo sports style, too. As one comment on Autoblog.it reads: “You can’t ever be a passionate fan of cars until you’ve got yourself an Alfa Romeo.”

    Abarth Grande Punto Abarth Grande Punto Abarth Grande Punto Abarth Grande Punto

    Source | Autoblog.it and Caradisiac via Autoblog.com


  • 2012 Cadillac ATS Still Has a Chance

    Cadillac is still working on the ATS model and the official launch of the car might take place as soon as 2012. At least, this is what insideline.com reports citing an insider who also revealed that the company is looking into ways to introduce a whole new range of the ATS, including a coupe and a convertible.

    "We’re finalizing the four-door showcar of the ATS first. Then we’ll follow quickly after with a wagon, a coupe and a convertible," the source said. However, keep in mind that… (read more)

  • Mitsubishi Lancer Sedans Equipped With CVT Recalled in China

    Southeast Motor, a Chinese partner of Mitsubishi Motors, will recall 734 Lancer saloons equipped with the CVT (continuously variable transmission) gearbox to fix a bolt safety issue, Xinhua News reported via Gasgoo, citing China’s quality regulator.

The Mitsubishi Lancers will have to pay a visit to service centers because the brittle lever bolt was manufactured using an improperly modified process, the General Administration of Quality Supervision, Inspection and Quarantine said in a statement… (read more)

  • Aardvark Publishes A Research Paper Offering Unprecedented Insights Into Social Search

In 1998, Larry Page and Sergey Brin published a paper [PDF] titled The Anatomy of a Large-Scale Hypertextual Web Search Engine, in which they outlined the core technology behind Google and the theory behind PageRank. Now, twelve years after that paper was published, the team behind social search engine Aardvark has drafted its own research paper that looks at the social side of search. Dubbed Anatomy of a Large-Scale Social Search Engine, the paper has just been accepted to WWW2010, the same conference where the classic Google paper was published.

    Aardvark will be posting the paper in its entirety on its official blog at 9 AM PST, and they gave us the chance to take a sneak peek at it. It’s an interesting read to say the least, outlining some of the fundamental principles that could turn Aardvark and other social search engines into powerful complements to Google and its ilk. The paper likens Aardvark to a ‘Village’ search model, where answers come from the people in your social network; Google is part of ‘Library’ search, where the answers lie in already-written texts. The paper is well worth reading in its entirety (and most of it is pretty accessible), but here are some key points:

    • On traditional search engines like Google, the ‘long-tail’ of information can be acquired with the use of very thorough crawlers. With Aardvark, a breadth of knowledge is totally reliant on how many knowledgeable users are on the service. This leads Aardvark to conclude that “the strategy for increasing the knowledge base of Aardvark crucially involves creating a good experience for users so that they remain active and are inclined to invite their friends”. This will likely be one of Aardvark’s greatest challenges.
    • Beyond asking you about the topics you’re most familiar with, Aardvark will actually look at your past blog posts, existing online profiles, and tweets to identify what topics you know about.
    • If you seem to know about a topic and your friends do too, the system assumes you’re more knowledgeable than if you were the only one in a group of friends to know about that topic.
• Aardvark concludes that while the amount of trust users place in information on engines like Google is related to a source website’s authority, the amount they trust a source on Aardvark is based on intimacy and how they’re connected to the person giving them the information.
    • Some parts of the search process are actually easier for Aardvark’s technology than they are for traditional search engines. On Google, when you type in a query, the engine has to pair you up with exact websites that hold the answer to your query. On Aardvark, it only has to pair you with a person who knows about the topic — it doesn’t have to worry about actually finding the answer, and can be more flexible with how the query is worded.
    • As of October 2009, Aardvark had 90,361 users, of whom 55.9% had created content (asked or answered a question). The site’s average query volume was 3,167.2 questions per day, with the median active user asking 3.1 questions per month. Interestingly, mobile users are more active than desktop users. The Aardvark team attributes this to users wanting quick, short answers on their phones without having to dig for anything. They also think people are more used to using more natural language patterns on their phones.
    • The average query length was 18.6 words (median of 13) versus 2.2-2.9 words on a standard search engine.  Some of this difference comes from the more natural language people use (with words like “a”, “the”, and “if”).  It’s also because people tend to add more context to their queries, with the knowledge that it will be read by a human and will likely lead to a better answer.
    • 98.1% of questions asked on Aardvark were unique, compared with between 57 and 63% on traditional search engines.
• 87.7% of questions submitted were answered, and nearly 60% of them were answered within 10 minutes. The median answering time was 6 minutes and 37 seconds, with the average question receiving two answers. 70.4% of answers were deemed ‘good’, 14.1% ‘OK’, and 15.5% ‘bad’.
    • 86.7% of Aardvark users had been asked by Aardvark to answer a question, of whom 70% actually looked at the question and 38% could answer.  50% of all members had answered a question (including 75% of all users who had ever actually interacted with the site), though 20% of users accounted for 85% of answers.
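The “village” routing described in the bullets above (pair a question with a person who knows the topic, weighted by social closeness and by whether that person’s friends share the expertise) can be sketched roughly as follows; the scoring formula and weights are our own illustration, not the one in Aardvark’s paper:

```python
# Rough sketch of "village" query routing: score each candidate answerer by
# topic expertise times social proximity, as the bullets above describe.
# The weights and formula here are illustrative, not Aardvark's actual model.

def route_question(topic, users, asker_friends):
    """Return candidate answerers for a topic, best match first."""
    scored = []
    for name, info in users.items():
        expertise = info["topics"].get(topic, 0.0)
        if expertise == 0.0:
            continue  # user knows nothing about this topic
        # Friends of the asker get a proximity boost (the "intimacy" signal).
        proximity = 1.5 if name in asker_friends else 1.0
        # Shared expertise among a user's friends raises confidence in it.
        friend_backup = sum(
            users[f]["topics"].get(topic, 0.0) for f in info["friends"]
        )
        score = expertise * proximity * (1.0 + 0.1 * friend_backup)
        scored.append((score, name))
    return [name for _, name in sorted(scored, reverse=True)]

users = {
    "ann": {"topics": {"espresso": 0.9}, "friends": ["bob"]},
    "bob": {"topics": {"espresso": 0.4}, "friends": ["ann"]},
    "cat": {"topics": {"hiking": 0.8}, "friends": []},
}
ranking = route_question("espresso", users, asker_friends={"bob"})
```

Note that, as the bullets point out, the router never has to find the answer itself; ranking plausible answerers is the whole job.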



  • Paffett Says New Tires Will Be a Challenge in 2010

Gary Paffett conducted the first test drive of the McLaren Mercedes MP4-25 in Valencia yesterday and revealed that the biggest challenge of driving the new car – as compared to its 2009 predecessor – will be getting used to the new tires for the 2010 season.

Bridgestone confirmed in late 2009 that it will launch a new tire for the upcoming Formula One season, as the FIA has banned refueling in the sport. Consequently, the tire … (read more)

  • Enova to Supply All-Electric Vans to the US

Have you ever pictured an all-electric SWAT van? Those guys don’t exactly look like tree huggers, but without the sound of an internal combustion engine such a vehicle would be genuinely stealthy. In any case, it won’t be long before we see one, as the government has signed a contract for some all-electric, walk-in step vans.

    Enova Systems, a hybrid and electric drive systems producer, just signed a contract with the General Services Administration (GSA), which prov… (read more)

  • Daimler to Assemble Trucks in Iraq

    A recent partnership agreement between Daimler and the State Company for Automotive Industry (SCAI) will allow the German automotive group to supply the complete Mercedes-Benz trucks and assembly kits to Iraq.

    The agreement, which was signed in the German embassy in Baghdad yesterday, is the result of a Memorandum of Understanding (MoU) that Daimler signed with the Iraqi government in July 2008. Daimler will therefore supply tools and equipment for truck assembly and provide technical support… (read more)

  • Two More Toyota Accidents Reported

The huge recall announced by Japanese carmaker Toyota was bound to give rise, sooner or later, to more and more accident claims. While it is not our place to judge the truth behind those claims, for the sake of objectivity we have to have our say on at least the ones that seem far-fetched.

Below we tell two short stories about two Toyota owners who crashed their cars and then blamed it on a malfunctioning accelerator pedal. One of the stories seems to be backed by pr… (read more)

  • Biochar Collection for Industrial Agriculture

The one problem I have had with biochar was how to adapt its production to the needs of industrial farming.  I think we can now make some progress.
First off, biochar is elemental carbon produced by applying heat to plant material, usually drawn from crop wastes.  The waste should not include woody material, because wood retains structural integrity and preserves difficult-to-reduce gross structure.  We have determined that the likeliest feedstock is corn stover, for several good reasons:
1. It produces at least ten tons per acre of material and often much more.
2. Unless converted to silage while green, it is unsuitable for feed or for plowing back into the soil and is normally burned.
3. Once ripened, the water is drawn back down the stalks, leaving the stover fairly dry.
4. It can be chopped or baled easily enough for handling, and its coarse nature encourages further drying under cover.
5. I have reason to believe it was the primary crop used by the Amazonian Indians to produce terra preta over a two-thousand-year span.
    In short, we do not need to promote a new crop in order to produce biochar.
    A lot has already been discussed about collecting plant waste for some form of power plant type facility.  Let us cover the handling problem first.
    I think it makes a lot of sense for a facility to accept chopped corn waste in exchange for a one for ten biochar load.  A farmer would at his expense truck in typically chopped corn stover to the processing facility and receive back a chit for a load of biochar on a one in ten basis.  A water content measure would be conducted and a penalty applied.  Some time later (perhaps weeks) the farmer would return to pick up a load of biochar in the form of powdered elemental carbon.  This process integrates smoothly into a farming operation without incurring significant costs but clearly defraying haulage and direct purchase costs for the processor.
    The farmer has disposed of his waste for the cost of trucking it to the plant and he receives in return the produced biochar which he stores in fertilizer tanks.  Before using the biochar, he can blend in fertilizer and apply the combined blend as he would fertilizer.  It should be far gentler on the machinery also.
    We know that a field of corn should produce at least a ton of biochar per acre, so this is a significant contribution even in its first year of operation.  Obviously over many years, the carbon content of the soils will become dominant and worked deep into the soils.
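The exchange arithmetic above can be checked with a back-of-envelope sketch; the ten-ton stover yield and the one-in-ten credit come from the text, while the moisture-penalty rate and baseline moisture are assumed figures for illustration only:

```python
# Back-of-envelope arithmetic for the one-for-ten exchange described above.
# The 10 t/acre stover yield and 1:10 biochar credit come from the text;
# the moisture baseline and penalty rate are assumed, for illustration.

def biochar_credit(stover_tons, water_fraction, penalty_per_point=0.02,
                   baseline_water=0.15, exchange_ratio=0.10):
    """Tons of biochar credited for one delivered load of corn stover."""
    # Penalize each percentage point of moisture above the assumed baseline.
    excess = max(0.0, water_fraction - baseline_water)
    penalty = penalty_per_point * (excess * 100)
    effective_tons = stover_tons * max(0.0, 1.0 - penalty)
    return effective_tons * exchange_ratio

# One acre's worth of stover (10 tons) delivered at the baseline moisture
# earns the one ton of biochar per acre the text mentions:
credit_dry = biochar_credit(10.0, 0.15)
# The same load delivered wetter takes the moisture haircut:
credit_wet = biochar_credit(10.0, 0.25)
```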
Even at this initial level of amendment, the crops will respond because fertilizer is being retained.  Over several years, fertilizer needs will continue to decline, eventually freeing the farm from the need for heavy fertilization at all. Recall that in the tropical rainforest, such terra preta soils have been observed cropped continuously for over sixty years without any amendment beyond the return of waste to the soil.
    Thus we have shed the collection costs for our processing plant and have stored huge amounts of plant waste under cover so that it will continue to air dry.  The next step is the plant itself.
  • NVIDIA’s first two Fermi cards to be known as GeForce GTX 470 and GTX 480

    Don’t get too excited, we don’t have specs or release windows yet, but we do have hilariously inflated model names to share with you. NVIDIA’s all-new graphics architecture, commonly known as Fermi and recently re-coded as the GF100, has its first two commercial product names — the GeForce GTX 470 and GTX 480 — which as you’ll have noticed skip right past the 300s and nearly double the model numbers of the company’s current gen offerings. Let’s just hope the performance lives up to such a blusterous naming scheme.

NVIDIA’s first two Fermi cards to be known as GeForce GTX 470 and GTX 480 originally appeared on Engadget on Tue, 02 Feb 2010 04:06:00 EST.



  • America’s Concern Troll

If one thing can be said of the nation’s most consistently underwhelming editorial columnist, Richard Cohen, it is that he goes about reaffirming his moniker without delay:

    There is almost nothing the Obama administration does regarding terrorism that makes me feel safer.

    Of course there isn’t, that’s why you’re America’s Concern Troll!

Obama hasn’t invaded any new countries. New invasions always make Dicky Cohen feel better. He aspires to close down Guantanamo and ignore crimes instead of using it to facilitate new ones. Obama is merely inconsistent in applying the Constitution, as opposed to completely pretending it doesn’t exist. Thank goodness the Administration decided to let Bybee and Yoo off the hook; I don’t think Cohen’s bladder could have handled a different result, and his kidneys almost shut down over Scooter Libby.

    Your ‘Liberal’ Washington Post editorial page ladies and gentlemen, as defined by Howard Kurtz.

    Boy I hope Cohen doesn’t see his shadow today.

  • CONFIRMED: Facebook Gets Faster, Debuts Homegrown PHP Compiler

The rumors have been flying over what’s going on at Facebook headquarters. Word has been that a PHP team was brought in and made to sign non-disclosure agreements before discussing a PHP project that has been in development for the past two years. Alex Handy, senior editor of the Software Development Times Blog, predicted last Saturday that Facebook “has rewritten the PHP runtime from scratch,” and several sources have confirmed for us tonight that Facebook has indeed been making changes to the basic PHP runtime environment.

According to our sources, Facebook has been working on a PHP compiler that will increase speed by around 80% and offer a just-in-time (JIT) compilation engine with a number of advantages. The project is very similar to Google’s Unladen Swallow project, which rebuilt the Python compiler, boosting speed fivefold and opening the door for multi-language integration.


Richard Crowley, an engineer at OpenDNS who is familiar with the project, told us that David Recordon, an engineer at Facebook, invited him to come to Facebook’s headquarters Tuesday morning but wouldn’t give a reason. Crowley mused that it was likely his PHP skills and personal connections that got him through the doors, and clued us in on what he thinks will be going on behind them.

PHP is normally an interpreted language, which means that every time a user accesses a PHP page, the server has to take the code and interpret it to produce the final page. A compiler makes this process much quicker, because the code is translated into an executable form before the user ever asks for the page. The trade-off is that any time the page needs to change, the code must be recompiled.

    Crowley explained to us that the JIT compiler Facebook is introducing occupies a middle ground that not only retains the flexibility of PHP as an interpreted language, but offers the speed of compiled languages like C.

    “Compiling PHP to code a CPU can directly execute certainly has performance implications. It would be silly to alter the workflow to be more like C or C++ by doing all parsing and compilation ahead of time. At the other end of the spectrum, it’s slow and out-of-fashion to interpret every statement within the runtime. A JIT (Just In Time) compiler compiles frequently-executed portions of the program to machine code for speed while maintaining the flexibility of interpreted code.”

    Essentially, the closer a coding language comes to bare metal, the faster the program will run. Facebook’s programs are reported to run around 80% faster than before now that the runtime – the code-compiler-program sequence – has been restructured and rewritten.
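The hot-path idea Crowley describes can be sketched in miniature. This toy is purely illustrative and assumes nothing about Facebook’s actual compiler (which operates on PHP, not Python): interpret a function until it becomes “hot,” then swap in a compiled fast path.

```python
# Toy illustration of the JIT idea: keep interpreting a function until it
# runs often enough to be "hot", then compile it once and reuse the result.
# (Illustrative only -- a real JIT emits machine code, not a Python lambda.)

HOT_THRESHOLD = 3

class ToyJIT:
    def __init__(self):
        self.counts = {}    # how many times each function was interpreted
        self.compiled = {}  # compiled fast paths, keyed by function name

    def run(self, name, interpreted_fn, compile_fn, *args):
        # Use the compiled version if we already have one.
        if name in self.compiled:
            return self.compiled[name](*args)
        self.counts[name] = self.counts.get(name, 0) + 1
        if self.counts[name] >= HOT_THRESHOLD:
            # "Compile" the hot function once; later calls skip interpretation.
            self.compiled[name] = compile_fn()
        return interpreted_fn(*args)

# Interpreted path: simulate per-call parsing overhead with eval().
def interp_square(x):
    return eval("x * x", {"x": x})

# "Compilation": produce a direct callable with no parsing step.
def compile_square():
    return lambda x: x * x

jit = ToyJIT()
results = [jit.run("square", interp_square, compile_square, n) for n in range(6)]
```

The flexibility point from the quote survives here: until a function crosses the threshold it is still interpreted, so it can change freely without a recompile step.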

Crowley continued, saying that he expected Facebook to announce a JIT compiler based on the Low Level Virtual Machine (LLVM), which lies at the heart of Google’s Unladen Swallow project. Crowley’s suppositions were confirmed by anonymous sources tonight.

    When we asked what he thought of this style of release – years of secretive development by two lonely coders, likely locked in a deep, dark vault – Crowley said it was typical.

    “[Facebook] tends to do giant code dumps,” he said. “Facebook tends to build something big, use it, and open source it.”

    We wish Facebook had made this project open, as we’re sure many PHP developers would concur that the efforts would have been much swifter and more beneficial to the public had more folks and a larger team been involved from the outset. And we do wonder about possible duplication of effort from others who may have been working on the same issue.

    There are a number of us out here running PHP (ahem!) that could certainly have benefited from the speed boost.

Nonetheless, we’re looking forward to seeing what this does for our PHP browsing experience and how the open-source community reacts to the news. Will the community rally around the new compiler and push PHP into a new realm? Let us know your thoughts in the comments.




  • The Nirvana Phone: Citrix and OK Labs Extend The Convenience of the Smart Phone

Today’s smartphones are useful for messaging and some collaboration applications.

    But connecting to applications on a desktop with a smartphone is impractical. It’s far easier to see your desktop using a laptop with virtualization software.

Citrix Systems and Open Kernel Labs (OK Labs) are teaming up to give the smartphone its own virtualization software, allowing it to access any application on a virtual desktop.


    The partnership marries Citrix software with the virtual machine technology provided by OK Labs. Citrix Receiver on the mobile device calls the virtual machine through a wireless connection. The software on the device then translates what is being called from the virtual desktop so it may be viewed, for instance, on a desktop monitor. The two companies call it the Nirvana Phone:

What this is really about is extending the convenience of the smartphone. These devices are proving that the mobile web is here to stay. Laptops are increasingly viewed as great for the home or office but cumbersome on the road; they’re still necessary for any extended amount of work. Smartphone technology like what we see with Nirvana makes it easier for people to access documents and applications stored remotely.

    The service is expected to be integrated into mobile devices that come out within the next 12-18 months.

    Mobile devices will become virtual machines with the ability to access any application or document. The Nirvana Phone is a glimpse of what that future looks like.




  • Schumacher Joined by Doctors at Valencia

Michael Schumacher showed some impressive pace during the first day of testing on Valencia’s Ricardo Tormo Circuit, but the journalists present at the Spanish track did wonder at one point whether the German returnee would complete his first running of the W01 without any medical problems.

    And that’s because the Mercedes driver did not come to Valencia alone, but joined by his personal doctor Johannes Peil. Additionally, Schumacher also brought his physiotherapist Kai Schnapka to the Span… (read more)

  • Climate Hoax

What was published under the IPCC’s auspices is clearly falling under vigorous scrutiny, and it turns out there is plenty to work with.  Outright garbage was packed into the full report, and this is now coming back in full force to haunt them.  We can see from this article that the unraveling is well advanced and that calling global warming a hoax is well supported.
    This is unfair of course, but considering how unfairly critics were handled over the past couple of years, it is perhaps rough justice.  There is still plenty of good science published in the IPCC reports, except they should no longer be deemed as peer reviewed.
    Some of the claims that found their way into the IPCC reports were obviously silly gross exaggerations and recognized as such at the time by informed observers.  Except that IPCC cared little for informed observers, whom they had largely silenced.  They were advocates and not terribly polished ones either.  They made too many people angry.
    As I have already posted, the past decade of temperature moderation and decline has now countered the preceding decade of warming and in the process neutralized the theory of CO2 linkage to global warming at least at present magnitudes. CO2 content has continued to climb in the meantime and that is still not likely a good thing.
    It is still too early to refocus attention back to CO2 management but I think we have to go there.
    The Hottest Hoax in the World
    30 January 2010
    It was presented as fact. The UN’s Intergovernmental Panel on Climate Change, led by India’s very own RK Pachauri, even announced a consensus on it. The world was heating up and humans were to blame. A pack of lies, it turns out.
…would be what it isn’t. And contrariwise, what is, it wouldn’t be. And what it wouldn’t be, it would. You see? —Alice in Wonderland
    The climate change fraud that is now unravelling is unprecedented in its deceit, unmatched in scope—and for the liberal elite, akin to 9 on the Richter scale. Never have so few fooled so many for so long, ever.
    The entire world was being asked to change the way it lives on the basis of pure hyperbole. Propriety, probity and transparency were routinely sacrificed.
    The truth is: the world is not heating up in any significant way. Neither are the Himalayan glaciers going to melt as claimed by 2035. Nor is there any link at all between natural disasters such as Hurricane Katrina and global warming. All that was pure nonsense, or if you like, ‘no-science’! 

    The climate change mafia, led by Dr Rajendra K Pachauri, chairperson of the Intergovernmental Panel on Climate Change (IPCC), almost pulled off the heist of the century through fraudulent data and suppression of procedure. All the while, they were cornering millions of dollars in research grants that heaped one convenient untruth upon another. And as if the money wasn’t enough, the Nobel Committee decided they should have the coveted Peace Prize.
    But let’s begin at the beginning. Mr Pachauri has no training whatsoever in climate science. This was known all the time, yet he heads the pontification panel which proliferates the new gospel of a hotter world. How come? Why did the United Nations not choose someone who was competent? After all, this man is presumably incapable of differentiating between ocean sediments and coral terrestrial deposits, nor can he go about analysing tree ring records and so on. That’s not jargon; these are essential elements of a syllabus in any basic course on climatology.
    You cannot blame him. His degree and training is in railroad engineering. You read it right. This man was educated to make railroads from point A to point B.
    THE GATHERING STORM
    There are many casualties in this sad story of greed and hubris. The big victim is the scientific method. This was pointed out in great detail by John P Costella of the Virginia-based Science and Public Policy Institute. Science is based on three fundamental pillars. The first is fallibility. The fact that you can be wrong, and if so proven by experimental input, any hypothesis can be—indeed, must be—corrected.
This was systematically stymied as early as 2004 by the scientist in charge of the University of East Anglia’s Climatic Research Unit. This university was at the epicentre of the ‘research’ on global warming. It is here that Professor Phil Jones kept inconvenient details that contradicted climate change claims out of reports.
    The second pillar of science is that by its very nature, science is impersonal. There is no ‘us’, there is no ‘them’. There is only the quest. However, in the entire murky non-scientific global warming episode, if anyone was a sceptic he was labelled as one of ‘them’. At the very apex, before his humiliating retraction, Pachauri had dismissed a report by Indian scientists on glaciers as “voodoo science”.
    The third pillar of science is peer group assessment. This allows for validation of your thesis by fellow scientists and is usually done in confidence. However, the entire process was set aside by the IPCC while preparing the report. Thus, it has zero scientific value.
The fact that there was dissent within the climate science teams, that some people objected to the very basis of the grand claims of global warming, did not come out through due process. It came to light when emails at the Climatic Research Unit at East Anglia were hacked in November 2009. It is from the hacked conversations that a pattern of conspiracy and deceit emerges. It is a peek into the world of global warming scaremongering: amplify the impact of CO2, stick to dramatic timelines on destruction of forests, and never ask for a referral or raise a contrary point. You were either a believer in a hotter world or not welcome in this ‘scientific fold’.
    HOUSE OF CARDS AND COLOUR OF CASH
    So we have the fact that a non-expert heads the IPCC. We have the fact that glaciers are not melting by 2035; this major scaremongering is now being defended as a minor error (it was originally meant to be 2350, some have clarified). The date was spouted first by Syed Hasnain, an Indian glacier expert, in an interview to a magazine. It had no scientific validity, and, as Hasnain has himself said, was speculative.
    On the basis of that assertion, The Energy and Resources Institute (Teri) that Pachauri heads and where Hasnain works in the glaciology team, got two massive chunks of funding. The first was estimated to be a $300,000 grant from Carnegie Corporation and the second was a part of the $2 million funding from the European Union. So you write a report that is false on glaciers melting and get millions to study the impact of a meltdown which will not be happening in the first place. Now if this is not a neat one, what is?
    The same goes for dire predictions on Amazon rain forests. The IPCC maintained that there would be a huge depletion in Amazon rain forests because of lack of precipitation. Needless to add, no Amazon rain forest expert could be trusted to back this claim. They depended on a report by a freelance journalist and activist, instead, and now it has blown up in their faces.
    There’s plenty more in this sordid tale. For one thing, there is no scientific consensus at all that man-made CO2 emissions cause global warming, as claimed by the IPCC. In a recent paper, Lord Monckton of Brenchley, who has worked extensively on climate change models, argues: ‘There is no scientific consensus on how much the world has warmed or will warm; how much of the warming is natural; how much impact greenhouse gases have had or will have on temperature; how sea level, storms, droughts, floods, flora, and fauna will respond to warmer temperature; what mitigative steps—if any—we should take; whether (if at all) such steps would have sufficient (or any) climatic effect; or even whether we should take any steps at all.’
    An investigation by Dr Benny Peiser, director, Global Warming Policy Foundation, has revealed that only 13 of the 1,117, or a mere 1 per cent of the scientific papers crosschecked by him, explicitly endorse the consensus as defined by the IPCC. Thus the very basis of the claim of consensus on global warming is false. And so deeply entrenched is the global warming lobby, the prestigious journal Science did not publish a letter that Dr Peiser wrote pointing out the lack of consensus.
Speaking to Open, Dr Peiser says, “The IPCC process by which it arrives at its conclusions lacks balance, transparency and due diligence. It is controlled by a tightly knit group of individuals who are completely convinced that they are right. As a result, conflicting data and evidence, even if published in peer-reviewed journals, are regularly ignored, while exaggerated claims, even if contentious or not peer-reviewed, are often highlighted in IPCC reports. Not surprisingly, the IPCC has lost a lot of credibility in recent years. It is also losing the trust of more and more governments who are no longer following its advice. Until it agrees to undergo a root and branch reform, it will continue to haemorrhage credibility and trust. The time has come for a complete overhaul of its structure and workings.”
    Another fraud is in the very chart central to Pachauri’s speech at the Copenhagen summit. As Lord Monckton has pointed out, ‘The graph is bogus not only because it relies on made-up data from the Climate Research Unit at the University of East Anglia, but also because it is overlain by four separate trendlines, each with a start-date carefully selected to give the entirely false impression that the rate of warming over the past 150 years has itself been accelerating, especially between 1975 and 1998. The truth, however—neatly obscured by an ingenious rescaling of the graph and the superimposition of the four bogus trend lines on it—is that from 1860 to 1880 and again from 1910 to 1940 the warming rate was exactly the same as the warming rate from 1975 to 1998.’
    PACHAURI’S WRONG NUMBERS
     — omitted chart
    This chart, tracking mean global temperature over the past 150 years, was central to the presentation that IPCC Chairman Rajendra K. Pachauri made at the Copenhagen environment summit. Many scientists believe that the graph is fraudulent. First, there are strong allegations that the data, collected from the Climate Research Unit at the University of East Anglia, is a tissue of lies. Plus, as British climate change expert Lord Christopher Monckton puts it: “(The main graph, in darker blue) is overlain by four separate lines, each carefully selected to give the entirely false impression that the rate of warming over the past 150 years has itself been accelerating, especially between 1975 and 1998. The truth, however… is that from 1860 to 1880 and again from 1910 to 1940, the warming rate was exactly the same as the warming rate from 1975 to 1998.” In other words, the graph has been drawn with a motive to prove one’s point, and not to show the truth.
    Thus the earth has warmed at this rate at least twice in the last 150 years, and no major catastrophe has occurred. What is more, the earth cooled after each of those warmings. Why is the IPCC not willing to explore this startling point?
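    The cherry-picking mechanism Monckton describes can be illustrated with a short sketch. The data below is synthetic, not the actual temperature record: a constant underlying trend plus a multidecadal oscillation. Fitting an ordinary least-squares line over the whole series recovers the true trend, while a line fitted over a window whose start date sits in a trough of the oscillation looks far steeper, even though nothing about the underlying trend has changed.

    ```python
    import math

    def slope(xs, ys):
        # Ordinary least-squares slope of ys against xs.
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    # Synthetic "temperature" series (illustrative only): a constant
    # trend of 0.005 deg/yr plus a 60-year oscillation.
    years = list(range(1860, 2001))
    temps = [0.005 * (y - 1860) + 0.1 * math.sin(2 * math.pi * (y - 1860) / 60)
             for y in years]

    def window_slope(start, end):
        xs = [y for y in years if start <= y <= end]
        ys = [temps[y - 1860] for y in xs]
        return slope(xs, ys)

    # Fitted over the full series, the slope is close to the true 0.005...
    print(window_slope(1860, 2000))
    # ...but a carefully chosen late-starting window appears far steeper,
    # with no change whatsoever in the underlying trend.
    print(window_slope(1975, 2000))
    ```

    The point of the sketch is that trendline slopes over a cyclical series are highly sensitive to the choice of start date, which is exactly the selection effect being alleged.
    
    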
    Another total lie has been that the Sundarbans delta in Bangladesh is sinking on account of the rise in sea level. The IPCC claimed that one-fifth of Bangladesh will be under water by 2050. Well, it turns out this is an absurd, unscientific and outrageous claim. According to scientists at the Centre for Environmental and Geographical Information Services (Cegis) in Dhaka, the delta’s surface area appears to be growing by 20 sq km annually, a conclusion Cegis has based on more than 30 years of satellite imagery. The IPCC has not retracted its claim. As far as it is concerned, Bangladesh is a goner by 2050, submerged forever in the Bay of Bengal.
    THE COOKIE CRUMBLES
    The fallout of Climategate is slowly but surely unfolding right where it hurts a large number of special interests—in the field of business. Yes, the carbon trading business is now in the line of fire. Under a cap-and-trade system, a government authority first sets a limit on emissions, deciding how much pollution will be allowed in all. Next, companies are issued credits, essentially licences to pollute, based on how large they are, and what industries they work in. If a company comes in below its cap, it has extra credits which it may trade with other companies, globally.
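    The cap-and-trade accounting described above can be sketched in a few lines. The firms, caps and prices below are entirely hypothetical: a regulator issues each firm an emissions cap, and a firm that comes in under its cap holds surplus credits it can sell to one that overshoots.

    ```python
    # Illustrative cap-and-trade ledger (hypothetical firms and figures).
    caps = {"SteelCo": 100, "PowerCo": 80}       # allowances issued (tonnes CO2)
    emissions = {"SteelCo": 120, "PowerCo": 50}  # actual emissions (tonnes CO2)

    # Surplus (positive) or shortfall (negative) for each firm.
    balance = {firm: caps[firm] - emissions[firm] for firm in caps}

    # The firm under its cap sells credits to the one over its cap.
    price_per_tonne = 15.0                       # assumed market price
    needed = -balance["SteelCo"]                 # tonnes SteelCo is short
    available = balance["PowerCo"]               # tonnes PowerCo has spare
    traded = min(needed, available)
    cost = traded * price_per_tonne

    print(f"SteelCo buys {traded} credits from PowerCo for ${cost:.0f}")
    ```

    The cap gives pollution a hard ceiling; the trade gives the cleaner firm a revenue incentive, which is why a collapse in credit prices, as described below, hits both sides of the market.
    
    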
    Post-Climategate, this worldwide trade, estimated at about $30 billion in 2006, is finding few takers. It is under attack following the renewed uncertainty over the role of human-generated CO2 in global warming. In the US, which never adopted any of this to begin with, there is now a serious move to finish off the cap-and-trade regime globally. It’s a revolt of sorts. Six leading Democrats in the US Congress have joined hands with many Republicans to urge the Obama Administration to back off from the regime.
    The collapse of the international market for carbon credits, a direct fallout of Climategate, has already sent shudders down many spines in parts of the world that were looking forward to making gains from it. It was big business, after all, and Indian businesses were eyeing it as well. In fact, Indian firms were expected to trade some $1 billion worth of carbon credits this year, and with the market going poof, they stand to lose a good deal of money (notional or otherwise).
    Besides the commercial aspect, there is also the issue of wider public credibility. There have been signs of scepticism all along. In a 2009 Gallup poll, a record 41 per cent of respondents said that global warming was an exaggerated threat. This slackening of public support is in sync with a coordinated political movement that is seeking to re-examine the entire issue of global warming from scratch. The movement is led by increasingly vocal Republicans in the US Senate and packs considerable political power.
    Pachauri’s position is also becoming increasingly untenable, with demands for his resignation growing louder by the day. In an interview with Open, Pat Michaels of the Cato Institute, a noted US think-tank, who has followed the debate for years, says, “Dr Pachauri should resign because he has a consistent record of mixing his political views with climate science, because of his intolerance of legitimate scientific views that he does not agree with, because of his disparagement of India’s glacier scientists as practising ‘voodoo science’, and because of his incomprehension of the serious nature of what was in the East Anglia emails.”
    Richard North, the professor who, in a write-up co-authored with Christopher Booker, brought the financial irregularities to light, has also said in a TV interview: “If Dr Pachauri does not resign voluntarily, he will be forced to do so.”
    GLOBAL STORMING AHEAD
    The world awaits answers based not on the writings of sundry freelance journalists and non-experts, but on actual verifiable data: is the globe warming at all, and if so, by how much? Only then can policy options be calibrated. As things stand, there is little doubt that the IPCC will need to be reconstituted with a limited mandate. This mess needs investigation, and questions must be answered as to why absurd claims were taken as gospel truth. The future of everything we know as ‘normal’ depends on this. The real danger is that the general public is now weary of the whole thing, a little tired of the debate, and may not really care for the truth, convenient or otherwise.
  • This Groundhog Day, Punxsutawney Phil Goes High-Tech [Voices]

    By Jennifer Valentino, Reporter, The Wall Street Journal

    He might not be using sophisticated technology in his weather forecasts, but famous groundhog prognosticator Punxsutawney Phil will take a step into the 21st century Tuesday morning when he sends his annual prediction by text.

    Phil, the most widely known Groundhog Day mascot, is set to emerge from his burrow in the early hours of the morning. According to the Groundhog Day legend, if he sees his shadow, there will be six more weeks of winter. If he doesn’t, there will be an early spring. Eager Phil followers can sign up to receive the news by texting “Groundhog” to 247365 ahead of time.

    “Punxsutawney Phil holds the fate of winter close to the vest and in his stump until daybreak on Feb. 2, but the moment he emerges you can be among the first to learn of the forecast on your mobile phone,” the Pennsylvania tourism office said in a press release.

    Read the rest of this post on the original site

  • Fiat to Launch New Panda in 2011

    Italian carmaker Fiat will roll out the new Panda in 2011, after CEO Sergio Marchionne decided to suspend new car launches between October 2008 and December 2009 in a move to counteract the economic recession. Marchionne thus delayed the introduction of several models, including the new Panda and the long-awaited Alfa Romeo Giulietta.

    But according to autocar.co.uk, the new Panda will arrive in 2011, the first year when Marchionne expects the auto sector to show more significant signs of reco… (read more)

  • Google Chrome Gets 40,000 New Extensions with Greasemonkey

    Google Chrome introduced official extension support in December and things kicked off to a good start. There are a few thousand extensions in the online gallery now, some with hundreds of thousands of installs. And with support coming to the stable version of Chrome 4, things looked promising. Yet, Firefox still had the upper hand with… (read more)