Author: Serkadis

  • Mental Health in the Aftermath of Conflict

    Published: December 9, 2009
    Paper Released: November 2009
    Authors: Quy-Toan Do and Lakshmi Iyer

    Executive Summary:

    Wars are detrimental to the populations and the economy of affected countries. Over and above the human cost caused by deaths and suffering during a time of conflict, survivors of conflict are often left in poor economic circumstances and mental-health distress even after the conflict ends. How large are these costs? How long does it take for conflict-affected populations to recover from the mental stress of conflict? What policies are appropriate to assist mental health recovery? While considerable attention has been paid to post-war policies with regard to recovery in physical and human capital, mental health has received relatively less attention. The World Bank’s Quy-Toan Do and HBS professor Lakshmi Iyer review the nascent literature on mental health in the aftermath of conflict, discuss the potential mechanisms through which conflict might affect mental health, and illustrate the findings from their study of mental health in a specific post-conflict setting: Bosnia and Herzegovina. Key concepts include:

    • Mental health is an outcome that deserves greater attention from scholars and policymakers alike.
    • Mental health is an important dimension of human capital. Mental health distress, while a matter of concern in and of itself, might also have adverse consequences for individuals’ labor force participation and labor productivity in the post-conflict period, thereby delaying economic recovery after the conflict ends.
    • Quantifying the effect of conflict on mental health is likely to be important for designing appropriate post-conflict policies for recovery.
    • Somewhat surprisingly, findings showed no significant differences in overall mental health across people who experienced different levels of exposure to the conflict.
    • People with more education, as well as those who move to a different locality after the conflict, suffer fewer conflict-related mental health consequences.

    Abstract

    We survey the recent literature on the mental health effects of conflict. We highlight the methodological challenges faced in this literature, which include the lack of validated mental health scales in a survey context, the difficulties in measuring individual exposure to conflict, and the issues related to making causal inferences from observed correlations. We illustrate how some of these issues can be overcome in a study of mental health in post-conflict Bosnia and Herzegovina. Mental health is measured using a clinically validated scale; conflict exposure is proxied by administrative data on war casualties instead of being self-reported. We find that there are no significant differences in overall mental health across areas which are affected by ethnic conflict to a greater or lesser degree. 27 pages.


  • Trading The Dollar/Stocks Correlation Is Complete Bunk

    This should throw some cold water on the 'weak dollar explains the market' camp. That camp includes even many professional traders, who frequently just adopt market memes and start repeating them, with short memories and little investigative effort. As long as they're making money, and it seems to be due to their skill, nobody complains.

    Yet here is a long-term correlation between the U.S. dollar and stocks, from Morgan Stanley.

    Actually, stocks have usually rallied while the U.S. dollar was strengthening. More importantly, it's clear that overall the correlation reverses pretty often, and suddenly. Thus the dollar seems pretty undependable as a signal for stocks. It looks pretty right for a bit... and then it's completely wrong.

    Morgan Stanley: First, there is now a tight inverse correlation between the dollar and equity markets. This is unusual: Exhibit 2 shows that equity markets and the dollar usually are positively correlated (so dollar strength goes hand-in-hand with rising equities). Consequently, this correlation could weaken if the dollar strengthens because US growth and rates are expected to rise.

    [Chart: long-term dollar/equity correlation]

    (Via Morgan Stanley, 'Dollar: Ouch', Gerard Minack, 9 December 2009)
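    The sign-flipping the chart shows is easy to reproduce. Below is a toy rolling-correlation computation on synthetic data (not Morgan Stanley's series); it just illustrates how a relationship that looks dependable over one window can reverse completely in the next.

```python
# Synthetic illustration: stocks track the "dollar" positively in the first
# half of the sample and negatively in the second, so a rolling correlation
# flips sign partway through.
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rolling_corr(xs, ys, window):
    """Correlation over each sliding window of the two series."""
    return [pearson(xs[i:i + window], ys[i:i + window])
            for i in range(len(xs) - window + 1)]

random.seed(0)
n = 200
dollar = [random.gauss(0, 1) for _ in range(n)]
stocks = [d + random.gauss(0, 0.5) for d in dollar[:n // 2]] + \
         [-d + random.gauss(0, 0.5) for d in dollar[n // 2:]]

corr = rolling_corr(dollar, stocks, window=30)
print(f"early window: {corr[0]:+.2f}, late window: {corr[-1]:+.2f}")
```

    The takeaway matches the post: a 30-day window that reads strongly positive early on reads strongly negative later, so the correlation's current level says little about its persistence as a trading signal.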


  • Chrysler UK says “Change is coming” on New Year’s Day


    There’s a Chrysler UK site with the moody, Twilight-esque image you see above called “CHANGE is coming….” It heralds the date of January 1, 2010, and since it has a picture of the Chrysler 300 we’d like to think there will be something new announced for the six-year-old sedan. Perhaps that high zoot dash teased at the New York Auto Show will greet us along with the new year. However, since we haven’t seen anything of it or the car since, it could be something more generic.

    Which would be a shame, because we have a long list of wishes for the 300, and we’d be thrilled to discover the Pentastar folks had worked through even a quarter of it. The Detroit Auto Show‘s press days begin on January 11, so we’ll be keen for whatever preview Chrysler has coming. But we do wonder why the UK gets the announcement duties while Chrysler’s U.S. site is silent. Regardless, stay tuned… Hat tip to Aaron

    [Source: Chrysler UK]

    Chrysler UK says “Change is coming” on New Year’s Day originally appeared on Autoblog on Wed, 09 Dec 2009 08:31:00 EST. Please see our terms for use of feeds.


  • Finally, A Smart Way To Fix The Banking System

    [Graphic: CoCo bonds]

    The problem that brought the banking system to its knees, you’ll recall, was that loan losses wiped out the banks’ equity. 

    To remain solvent, banks are required to maintain certain ratios of equity to debt, and the losses on the banks’ boneheaded loans knocked these ratios below the required level.

    Normally, when a company’s equity gets wiped out, the company’s bondholders start picking up the tab.  The bondholders are higher in the capital-structure pecking order than the equity holders, so they get their money first, but, normally, this money isn’t guaranteed.   If the bank is so colossally stupid that it wipes out all its equity, then bondholders (usually) have to pay the price.

    But when the US government rescued the banks by pumping them full of taxpayer money, they invested the taxpayer money in the form of equity.  As they did so, they protected the bondholders to the tune of 100 cents on the dollar.

    This was outrageous, morally and otherwise.  The bondholders were free-willed adults who had voluntarily loaned money to the banks so the banks could lend it to anything with a pulse.  These (extremely well compensated) free-willed adults were saved from paying the price for that stupidity because they scared Tim Geithner and the rest of the government into thinking that the world would end if they lost money.

    Specifically, the bondholders and bankers persuaded the government that they were TOO BIG TO FAIL.  And this concept–the taxpayer must bail out any bank big enough to make headlines–has basically become written into the Constitution.

    This is a disaster for everyone but banks and bondholders.  It makes a mockery of the whole premise of capital markets and capitalism–that a good balance of incentives and risk will cause people to behave in a reasonably intelligent manner most of the time (and pay the price if they don’t).  It exacerbates the “moral hazard” that academics and Fed chairmen are always mumbling about (“heads we win, tails the taxpayer loses, so roll away…”).

    Banks have understandably fought tooth and nail to prevent any reform of the current system.  For example, they have argued vociferously that a calamity will befall the country if banks are forced to increase the amount of equity they are required to hold relative to their debt. (What calamity?  Less profit and smaller bonuses.)  And now that the stock market has somewhat recovered, no one cares about financial reform anymore, so bankers stand a good chance of preserving the status quo right up until the next financial crisis.

    At this point, the odds that we’ll get any reform at all seem next to nil, but there’s a simple solution that could fix the whole problem.  Shockingly, this solution has actually begun to be implemented in Europe.

    What’s the solution?

    Debt that automatically converts to equity when a bank’s capital ratio falls below the required level.

    What does that mean?  It means that equity holders will still get hit first if the bank makes dumb-ass loans.  But it also means that if the bank makes so many dumb-ass loans that its equity gets wiped out, bondholders, not taxpayers, will pick up the rest of the tab. 

    How does it work?

    When the bank’s equity falls too far, some of the convertible bonds convert to equity, thus restoring the bank’s capital ratio.
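    A minimal numerical sketch of that trigger (illustrative only; real CoCo issues differ in trigger levels, conversion terms, and how the regulatory capital ratio is defined):

```python
# Hypothetical balance-sheet numbers; the "capital ratio" here is simply
# equity / total assets, a stand-in for real regulatory ratios.
def convert_cocos(equity, coco_debt, other_debt, required_ratio):
    """Return (equity, coco_debt) after any automatic conversion."""
    assets = equity + coco_debt + other_debt  # conversion doesn't change assets
    if equity / assets >= required_ratio:
        return equity, coco_debt              # above the trigger: nothing happens
    shortfall = required_ratio * assets - equity
    converted = min(shortfall, coco_debt)     # can only convert what's outstanding
    return equity + converted, coco_debt - converted

# Loan losses have left the bank with 2 of equity against 100 of assets:
equity, cocos = convert_cocos(equity=2.0, coco_debt=10.0, other_debt=88.0,
                              required_ratio=0.08)
print(equity, cocos)  # 8.0 4.0 -- ratio restored to 8% with no bailout
```

    In the example, losses have pushed the equity ratio to 2% against an assumed 8% requirement, so 6 units of CoCo debt convert into equity and the ratio is restored without any outside capital.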

    This happens automatically, without bankruptcy or fuss.  It happens without surprise.  It happens without threatening to bring the whole economy to its knees.  It happens without Congressional moaning and hand-wringing and without Treasury secretaries dropping to their knees to beg and plead. 

    Bondholders who buy these bonds–now called CoCos, or “contingent convertibles”–know full well what they are buying, and the bonds are priced to reflect the equity conversion risk. Lloyds just sold a bunch of these in the UK, and there was a market for them.

    To fix the banking system, all regulators would have to do is require banks to issue enough CoCos that they could withstand financial Armageddon without the taxpayer getting involved.  The banks’ ability to make huge bets (and huge bonuses) with small amounts of equity would be preserved, so perhaps the bank lobbyists would agree to stand down for a while.  The world could rest assured that SOMETHING had been done to prevent the same mess from happening all over again.  And we could all return to peace, happiness, and prosperity.


    The FT has an excellent interactive tutorial on bank capital and CoCos here >

     

    Graphic: FT


  • Apparently The Marching Blondes Didn’t Work, As The Latvian Economy Shrinks 19%

    [Photo: Latvian blondes march]

    Remember earlier this year when Latvian blondes decided to hold a march to lift the spirits of Latvia and rescue its economy? It didn’t work.

    —–

    RIGA, Latvia (AP) — Latvia’s economy shed 19 percent of its value in the third quarter compared with a year earlier, the country’s statistics agency said Wednesday, highlighting the woes in the European Union’s worst economy.

    Latvian Statistics said that the third quarter fall in gross domestic product was led by a severe drop in retail trade, down 28.7 percent year-on-year, and construction, down 36 percent.

    From January to September Latvia, which is suffering its worst recession on record, saw its economy shrink a total 18.6 percent, the agency said.

    The Baltic state is undergoing a drastic correction after four years of stellar growth that followed the country’s accession to the EU. In 2006 the economy grew a dizzying 12.2 percent.

    In the past year, however, Latvia’s government has been forced to slash expenditures, raise taxes, and sign onto an emergency bailout loan worth euro7.5 billion ($11 billion) with international lenders in order to prevent bankruptcy.

    Unemployment, meanwhile, continues to climb, and last week Eurostat, the EU’s statistics agency, said Latvia has the highest rate of joblessness in the bloc, at 20.9 percent.

    Government officials and economists have said the economy would hit the lowest point sometime this winter, with economic growth likely to resume in 2011.

    Latvia’s GDP is expected to fall approximately 18 percent this year, and another 4 percent in 2010.


  • This Is A Test

    I want to see if I can put HTML commands in a post:

    Red
    Blue Underlined

    Thanks for your patience.

    NO.
    Maybe.
    Yes, success.

  • Mozilla Finally Releases Thunderbird 3

    Mozilla has finally released the long overdue update to its desktop email client Thunderbird. After a few years in development and a long beta stage, the final build of Mozilla Thunderbird 3 is now available for download. The latest version is a major upgrade and packs a lot of new or improved features, like tabbed emails and an overhauled search tool, which should be greatly appreciated by those who haven’t been using the beta version.

    “Thunderbird 3 represents more than two years of development from hundreds of developers, security experts, testers and localization and support communities from around the world,” David Ascher, CEO of Mozilla Messaging said. “Thunderbird 3 continues its history of giving users the most flexibility and control to get through their email faster and have it simply work the way they want it to plus many of the new features take the best of web mail and bring it to the desktop.”

    So, what have two years of development brought for Thunderbird? As you might expect, there are quite a lot of features, though not nearly as many as the two-year schedule would have you believe. The biggest, and the ones most likely to make a difference for users, are the tabbed email capabilities, which allow users to switch between different emails and different sections of the client, and the improved … (read more)

  • Mad Money host Jim Cramer calls Whitacre ‘assassin of the bad’


    We’ve spilled some ink about CNBC Mad Money host Jim Cramer before, the financial expert who – as The Daily Show pointed out – missed the whole Great Depression 2.0 thing. Of course, it’s not fair to single out Mr. Cramer, as 99 out of 100 economists thought credit default swaps based upon ill-conceived home loans to people with lousy credit were a sure-fire, long-term money-making bonanza.

    Regardless, Jim Cramer is very bullish on General Motors’ new interim CEO and former Southwestern Bell exec, Ed Whitacre. Says Cramer of Ed, “He was fierce and proud and was going to do whatever was necessary to rebuild the greatness of this once amazing company.” Sounds good to us. Especially if Mr. Whitacre applies that devil-may-care acumen to doing-better-but-still-ailing General Motors. GM’s problem is, for every Cadillac CTS-V, there’s a Chevrolet Aveo.

    Put another way, for every Ed Whitacre, there’s a Fritz Henderson. Or ten. Still, if Jim Cramer is to be believed, Whitacre is an “assassin of the bad.” Unions? Watch your butts, as “he’s going to take it to them.” Senior management might even be in worse trouble, as Cramer says, “He doesn’t need you. He likes to run the place with a shoestring administration.” Again, we hope Cramer’s right. But we also know that those ten Fritz Hendersons are dug into GM’s corporate structure like Alabama ticks.

    [Source: Blogging Stocks | Image: Rusty Jarrett/Getty]

    Mad Money host Jim Cramer calls Whitacre ‘assassin of the bad’ originally appeared on Autoblog on Wed, 09 Dec 2009 08:01:00 EST. Please see our terms for use of feeds.


  • Inside clamping beveling machine type PROTEM US80

    The US80 is a heavy duty portable pipe bevelling tool designed for high quality portable bevelling, facing and machining on pipe and tube from 3.5” ID to 14” OD (80 to 360 mm). Each standard tool plate accepts multiple tool bits allowing up to four simultaneous machining operations. Such operations may include bevelling, facing, counterboring, compound bevelling and OD chamfering on heavy wall materials.
    The US80 features a self-accepting torque system and an integral drive motor. The robust US80 is capable of performing repeatable high quality weld preparations on all machinable alloy pipes, including stainless steel and other high nickel alloys.

    Machining capacity: 80 mm ID – 360 mm OD
    Clamping capacity: 80 mm ID – 355 mm ID
    Machine functions:
    bevel, face, counterbore or multiple angle bevel (I, V, J bevels)
    Clamping: manual, with key
    Expansion: 25 mm
    Feed: 60 mm travel, manual
    Weight: ca. 40 kg (with pneumatic drive)

    CONTACT AND INFORMATION

    PROTEM SAS
    ZI LES BOSSES
    F-26800 ETOILE SUR RHONE
    FRANCE

    TEL +33 4 75574141
    email : [email protected]

  • New Dosing Pump from Verder

    At this year’s Achema, Verder introduced the new dosing pump Verderflex Aura. This innovative tube pump is extremely quiet, self-priming and prevents gas entrapments within the pump head. The technical features allow for very precise, timer-controlled dosing. The lock-out function also ensures that the correct dose is always supplied, with no possibility of overdosing. An easy tube load mechanism eliminates the possibility of trapped fingers when loading the tube onto the rollers.
    The Verderflex Aura dosing pump delivers flow rates up to 5 l/h with a maximum pressure of up to 4 bar. Different tube materials offer the best possible chemical resistance to a wide variety of fluids. Designed to offer high performance in the smallest of packages, this pump is ideal for dosing chlorine in swimming pools, leisure centres and spas.

  • Oriel IQE 200 measurement system

    Newport’s Oriel IQE 200 measurement system allows researchers to measure External Quantum Efficiency (EQE), also known as Incident Photon to Charge Carrier Efficiency (IPCE), as well as Internal Quantum Efficiency (IQE) for solar cells, detectors, or any other photon-to-charge converting device.

    The system utilizes industry standard, durable Oriel components for the light management engine. Each model of the IQE-200 system provides a “turnkey” solution by providing the light source, monochromator, detectors, related electronics, software and PC in a preconfigured, assembled and calibrated format. A variety of accessory modules are available to provide positive sample positioning, temperature control, electrical probing capabilities and light bias. The IQE-200 incorporates a novel detector geometry which splits the beam allowing for simultaneous measurement of EQE and the reflective losses to quantify IQE. In addition, an accessory detector can be mounted which allows for measurement of Transmission through the cell for those samples on a transparent substrate. The unique design of the AC system meets the requirements outlined in ASTM Method E 1021-06.
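    The relationship the split-beam geometry exploits can be written down in simplified form (a sketch, not Newport's calibration procedure; real instruments apply this per wavelength with corrections):

```python
# IQE relates EQE to the fraction of incident photons the cell actually
# absorbs: photons that are reflected (R) or, on transparent substrates,
# transmitted (T) never had a chance to generate carriers.
def internal_qe(eqe, reflectance, transmittance=0.0):
    absorbed = 1.0 - reflectance - transmittance
    if absorbed <= 0:
        raise ValueError("no light absorbed; check R and T")
    return eqe / absorbed

# E.g. a cell measuring 60% EQE while reflecting 10% of incident light:
print(round(internal_qe(0.60, 0.10), 3))  # 0.667, i.e. ~66.7% IQE
```

    This is why the simultaneous reflectance (and transmission) measurement matters: without R and T, only the external efficiency is recoverable.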

    The IQE-200 accessories customize the system to perform QE measurements incorporating temperature control, light biasing, and even motorized mapping capabilities all controlled by our new QE Commander ™ software system. Adding a source meter also allows for point measurements of the IV response of the cell. In conjunction with motorized mapping, it allows the user to “map” the IV performance of the cell under test in a user definable pattern.

    All Oriel components are from Newport Corporation, an industry leader in light sources, spectroscopy products, precision motion control, and continuous wave solar simulators.

  • Low Cost Industrial PC boasts PCI Expansion

    PBOX 100 is a low cost Fan-less Embedded Computer System which is based on the Intel Atom N270 CPU and 945GSE Chipset. With DVI/VGA output and expansion via PCI and Mini PCI, PBOX 100 is a great low-cost computing solution for applications which require long-term CPU support, superb reliability and flexibility.
    Designed as a low cost multi-purpose Industrial Computing platform, PBOX 100 is equipped with an on-board 1.6GHz Intel Atom CPU and GMA 950 Graphic Engine. To increase functionality, PBOX 100 is equipped with one PCI slot and one Mini-PCI slot enabling connection to a variety of peripheral devices and expansion cards.
    To enable a wide variety of applications to be addressed, PBOX 100 measures just 300mm x 194mm x 51mm and boasts DVI and VGA outputs. Storage is available via a 2.5” HDD, SATADOM and IDE/SATA interfaces. PBOX 100 will find applications within a wide variety of environments including VOIP, automation, digital signage, security, gaming/entertainment and communication.
    The PBOX 100 is just part of the Nexcom range of Industrial Computing Solutions which boast long-term support, high-performance and exceptional reliability. Other products in the Nexcom range include Single Board Computers, Passive Backplanes, Embedded CPU Boards, ETX and Fan-less Embedded Systems.

  • Tetra:3 CO2 Wins Innovation Award at Sicurtech 2008 Security And Safety Awards

    Crowcon’s Tetra:3 CO2 portable multi-gas detector has won the Safety Innovation “Top Selection” Award at the Security and Safety Awards held at the Sicurtech 2008 Expo in Italy. Of the many entries for the Safety Awards, the Tetra:3 CO2 was chosen by respected independent judges from seven finalists for the Innovation Award.

    Photo: http://halmapr.com/crowcon/ssa.jpg (580kb)
    (Photo caption. Left–right: Fabio Dadati, President of Fiera Milano; Professor Oliviero Tronconi, Jury President; Tim Wilkes, Crowcon’s Marketing Manager; Fabio Binelli, CEO of Fiera Milano)

    “We are delighted that the innovative design and functionality of the Tetra:3 CO2 has been recognised with this Award,” said Crowcon’s Marketing Manager Tim Wilkes. “The product was introduced in response to customer requests for a compact, multi-gas monitor that offered protection from carbon dioxide alongside that of other gases. It is based on the popular Tetra:3 platform, acknowledged as one of the most easy to use products on the market.”

    Like the standard Tetra:3, the new CO2 variant is designed for use in demanding industrial environments, particularly where wet conditions might be experienced, and with its large, top display and simple, single button operation, requires minimal user training. Initial customers have included many breweries and wineries, where the recognition of carbon dioxide as a toxic threat rather than just a potential asphyxiant is growing. However, the company is also receiving enquiries from other users in areas at risk from toxic gases.

    Now in its second year, the Security and Safety Awards recognises products that demonstrate excellence in the fields of safety and security. The Safety Innovation Award considers features such as functionality, performance and technology and market standards. The jury of six leading scientists and industrialists was chaired by Professor Oliviero Tronconi from the Politecnico di Milano.

    Crowcon, a Halma company, is a world leader in portable and fixed gas detection instruments. For more news stories about Crowcon and to subscribe to the company’s RSS News Feed please visit the News Blog at: http://halmapr.com/news/crowcon/ .

  • US Futures Keep Head Above Water Despite Overnight Selloff And Fears About Greece

    So far at least — with over an hour to go before trading commences in New York — US futures are pointing up; this is in spite of an overnight selloff in Asia, and a decline in Europe due to fears of a Greece blowup and the demolition of the Euro.



  • Officially Official: VW and Suzuki tie the knot

    Filed under: , ,

    As you can see from the image above depicting Osamu Suzuki, chairman and CEO of Suzuki Motor Corp., and Martin Winterkorn, his counterpart at Volkswagen AG, the two auto companies have agreed to link arms. Volkswagen has properly announced that it has “reached a common understanding to establish a close long-term strategic partnership” with Suzuki Motor Corporation. The union means VW will buy 19.9% of Suzuki, and Suzuki will then spend half the money it just received from Volkswagen reciprocating by buying VW shares.

    Beyond that, Suzuki gains access to VW technology and the comforting warmth of a hugely profitable big brother. Volkswagen gets a nearly turn-key solution to boosting its small car development and piggybacks on Suzuki’s presence in India and Southeast Asia. Suzuki and, of course, VW will remain independent, but unofficial word suggests that VW might increase its stake to 33% or more at some point down the road, which could shift the balance of power.

    VW’s press release is after the jump, and there will be a press conference and webcast (you’ll need a user name and password to watch) at 5 p.m. with more details.

    [Source: Volkswagen | Image: Kazuhiro Nogi/AFP/Getty]

    Continue reading Officially Official: VW and Suzuki tie the knot

    Officially Official: VW and Suzuki tie the knot originally appeared on Autoblog on Wed, 09 Dec 2009 07:32:00 EST. Please see our terms for use of feeds.

    Permalink | Email this | Comments

  • So what is metadata, anyway?

    Writing about metadata is risky business, since every post and every tweet potentially starts the same discussion: what exactly is metadata, anyway? So here’s my ambitious attempt to cut to the chase, and open the can of worms again.

    Why would you care, anyway? Isn’t this just some highly technical or theoretical debate? Well, to some extent it is, but the fact remains that for any content technology, metadata is essential. Metadata is what allows us to use a system to manage content in the first place. Even if you take the brute force approach of using enterprise search, rather than meticulously tagging all your content with metadata, you’ll find results will be disappointing, at best. (In fact, if there’s no useful metadata available, search engines will have to create it themselves.) Metadata is so important that we now even get court rulings to define it.

    Of course, the essence is easily defined. Metadata is data about data. The problem is that, in the end, you can’t really define the distinction between data and metadata.

    The examples are abundant: a document’s author, the date content was created or published, the name of a database column, even the filename is metadata. You can see it in any system dealing with content, and often, helpfully, it will actually be marked as "metadata." There are standards for what metadata you could have (like Dublin Core, or EXIF) or how to store it in a document itself (like XMP). If that’s all you want to know, now might be a good time to stop reading. Because from there, it starts getting tricky.
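    As a concrete (and hypothetical) sketch of "data about data": dc:title, dc:creator and dc:date are real Dublin Core elements, while the Document class and the sample values here are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    body: str                                      # the data
    metadata: dict = field(default_factory=dict)   # the data about the data

doc = Document(
    body="Writing about metadata is risky business...",
    metadata={"dc:title": "So what is metadata, anyway?",
              "dc:creator": "unknown",
              "dc:date": "2009-12-09"},
)

# A CMS uses such fields as "hooks", e.g. sorting a listing by date:
listing = sorted([doc], key=lambda d: d.metadata["dc:date"])
print(listing[0].metadata["dc:title"])  # So what is metadata, anyway?
```

    Note that nothing in the code makes these fields "metadata" except how the system uses them, which is precisely the use-or-purpose point made below.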

    Some argue that the concept of metadata is just not very intuitive, because it’s artificial, something we’re not used to "in real life." I doubt it. (You need to look no further than the cover of a book to understand why.) In fact, we’re quite used to those meta-levels of looking at things. We need them to communicate. ("The color of my car is green.") So used to them, in fact, you could argue that any kind of content is metadata, since it always describes something else. (Even a picture of a chair is not really a chair, just a reference to it; and this blog post is not just text — it’s about…)

    In content management, we tend to define metadata by content’s use or purpose, rather than its nature. Something is metadata, because we want to use it as metadata. A CMS will use that metadata as a "hook," to instigate an action, such as displaying content on a particular page in a certain way. A developer may want to sort based on date, an information architect or knowledge manager may want to display content based on how it’s classified, or users need facets to refine the results in their search interface. Those uses are quite different, and sometimes at odds with each other.

    Your records manager may want to keep all the metadata together with the data, as one "document." A developer would often prefer a system to treat metadata just as it does any data (because then it’s accessible through the APIs in a uniform way, and the developer doesn’t need to jump through hoops to get to it). On the other hand, for performance purposes, you might want to keep metadata and data separate (store the "about" stuff in the database, and the huge video itself on the filesystem – as DAM systems often do). But a web editor will often wonder why some important fields (their distinction will often seem entirely arbitrary) are marked "metadata" and hidden two tabs and several clicks away.

    You’re unlikely to resolve those conflicts by arguing who’s right. Some of these particular debates have been raging for thousands of years. Plato would say that you should consider metadata to be external to what it describes. Aristotle would tell you that these are inherent attributes of a file or record. A point excellently illustrated by Raphael’s painting in the Vatican, with Plato, at left, pointing to The Cloud, obviously, and Aristotle controlling the files.

    [Image: Plato and Aristotle]

    You may want to hire several expert philosophers to argue on your behalf, while you get on with the job of actually managing content. Because in the end, everybody is going to disagree on what metadata is, and nobody is going to be "right".  For any content management project, you’ll want to be clear on what everybody needs, and how the system needs to use content. That’s what should define your metadata.

    (And by the way, if you completely disagree with me on this — have your philosopher contact my philosopher, and they can work out the epistemological and ontological fine print.)

  • Bill Ackman: Here’s Why I’m Super-Bullish On Shopping Malls And The US Consumer

    Last year around this time, several publications (including us) ran stories about the death of the American mall.

    They were supposed to be a relic from the days of cheap gas, ever-expanding suburbs, and an American consumer with unlimited access to credit.

    But that was a premature obituary. Like many other things expected to die during the crisis, malls have actually come back nicely.

    Investor Bill Ackman — who profited enormously on the way down — remains a super-bull on malls.

    The Investment Linebacker blog got a hold of his amazing presentation at the International Council of Shopping Centers on the state of mall REITs. We highly recommend checking it out.

    Check out our favorite charts and slides from the presentation >>


  • Game Developer Won’t Edit ‘Aliens vs. Predator’ To Appease Australian Censors

    Rose M. Welch alerts us to the news that game developer Rebellion has decided not to resubmit an edited version of its game Aliens vs. Predator after it was rejected by the Australian Classification Board for being too violent. The company stated that it agrees the game is not suitable for children:


    “We agree strongly that our game is not suitable for game players who are not adults… it is bloody and frightening, that was our intent.”

    But Australia apparently doesn’t have an option for such “mature” content, and Rebellion seems to recognize how ridiculous that is:


    “We will not be releasing a sanitized or cut down version for territories where adults are not considered by their governments to be able to make their own entertainment choices.”

    Hopefully, things like this will make Australia reconsider its censorship of such content.






  • We Need 140,000 New Jobs A Month To Keep Unemployment From Going Higher

    [Chart: Civilian Labor Force]

    Asha Bangalore of Northern Trust estimates that the economy will need to add 140,000 jobs a month next year to keep unemployment flat (let alone bring it down).  Under a more conservative assumption about labor-force growth, the number may be a slightly lower but still high 86,600.

    Why?

    Because the “civilian labor force” grows about 1.2% a year (slightly slower in recovery years).  (See chart above.) As a result, we need an impressive number of new jobs each month just to stay even.

    Here’s Asha:

    The average growth of the civilian labor force in the last twenty years (1989-2008) was 1.2%, and the median also works out to be 1.2%. During the business expansion of November 2001 – December 2007, the labor force grew at an average pace of 1.1% (median = 1.1%).

    For a steady unemployment rate, the rate of increase in employment should match the rate of growth of the labor force. Based on the average growth of the labor force in the last 20 years, it appears that roughly 140,000 jobs have to be created each month in 2010 to meet the increase in the labor force. We computed this number by using the level of employment in the household survey for November 2009 (138.502 million) as the starting point and raising the reading by 1.2% (growth of the labor force). [Implied employment level = 138.502 * 1.012 = 140.164 million; (140.164 - 138.502) million / 12 ≈ 138,500 jobs per month.]

    The labor force of the nation has recorded gains during the entire post-war period, with the exception of 1951. Data from the first eleven months of 2009 suggest that the labor force most likely held steady in 2009. The labor force typically grows at a sluggish pace compared with the historical average in the first year of an economic recovery (see chart 1), which leads us to consider a slightly lower growth rate for 2010. By implication, the prediction of the necessary increase in employment to match the growth of the labor force will be smaller than our estimate of approximately 140,000 per month. Considering a 0.75% increase in the growth of the labor force (one standard deviation below the 1.2% growth rate) would roughly require 86,600 jobs per month. The main result of this exercise is that employment has to increase roughly 86,600 each month in order for the unemployment rate to hold steady in the near term, and a decline in the jobless rate would entail a much larger gain.
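    The quoted arithmetic is easy to reproduce (figures from the note above; the formula is just the base employment level times the assumed labor-force growth rate, spread over twelve months):

```python
def required_monthly_jobs(employment_millions, labor_force_growth):
    """Monthly job gains needed to keep pace with labor-force growth for a year."""
    yearly_increase = employment_millions * labor_force_growth   # in millions
    return yearly_increase * 1_000_000 / 12                      # in jobs/month

base = 138.502  # household-survey employment, November 2009, in millions

print(round(required_monthly_jobs(base, 0.012)))    # 138502 -- the "roughly 140,000"
print(round(required_monthly_jobs(base, 0.0075)))   # 86564  -- the "roughly 86,600"
```

    The two scenarios differ only in the growth-rate assumption (1.2% historical average versus 0.75%, one standard deviation lower), which is why the required monthly gain drops so sharply in the second case.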


  • ZenShell, new Windows Mobile shell looks pretty nice

    We have just stumbled over this demo of a new Windows Mobile today screen launcher/shell for Windows Mobile, and it looks pretty nice.

    The main feature which differentiates it from others appears to be its interesting use of screen real estate, which should suit our ever-enlarging screens quite well. It also seems to borrow some ideas from our favourite UI design company TAT, which can never be a bad idea.

    The application is currently at version 0.82 and does not seem to have a downloadable version yet, but we look forward to further development of the software.
