Blog

  • Reuters – Michael Dell Ponies Up $750M in Cash for Deal

    Michael Dell and his investment firm are ponying up $750 million in cash toward the $24.4 billion purchase of Dell Inc. to help bankroll the largest private equity-backed buyout since the financial crisis, Reuters wrote. The Dell founder and CEO this week struck a deal to take private the company he created out of a college dorm room in 1984, partnering with private equity house Silver Lake and Microsoft Corp. Michael Dell will contribute $500 million of his own cash, and MSDC Management – an affiliate of his investment vehicle, MSD Capital – will contribute another $250 million, according to a company filing on Wednesday.

    Dell Inc also said it is targeting the repatriation of $7.4 billion of cash now parked abroad to help finance the deal. That may dismay some shareholders, as a hefty tax is usually levied on cash brought back from overseas.

    The deal, which ends Dell’s rocky 24-year run on the Nasdaq just as the once-dominant PC maker struggles to revive growth, is contingent on approval by a majority of shareholders — excluding Michael Dell himself.

    Several shareholders, including prominent investor Frederick “Shad” Rowe of Greenbrier Partners, have spoken out against the deal, protesting a lack of specifics as well as a potential conflict of interest with Michael Dell being the company’s single largest shareholder with a roughly 16 percent stake.

    “Some shareholders are glad. But there are others who feel it’s a raw deal,” said Shaw Wu, an analyst with Sterne Agee, who has spoken with several Dell shareholders since the announcement but declined to provide further details.

    AND SO IT BEGINS

    Dell was regarded as a model of innovation as recently as the early 2000s, pioneering online ordering of custom PCs and working closely with Asian suppliers and manufacturers to assure rock-bottom production costs. But it missed the big industry shift to tablet computers, smartphones and high-powered consumer electronics such as music players and gaming consoles.

Executives said on Tuesday the company will stick to a strategy of expanding its software and services offerings for large companies, with the goal of becoming a provider of corporate computing services – like the highly profitable IBM. They played down speculation the company may spin off the low-margin PC business on which it made its name.

    The company has not given many specifics on what it would do differently as a private entity, angering some shareholders who said they needed more information to determine whether the $13.65-a-share deal price – a 25 percent premium to Dell’s stock price before buyout talks leaked in January – was adequate.

On Wednesday, an individual shareholder filed the first lawsuit, in Delaware, attempting to stop the buyout. The lawsuit – which is seeking class-action status – maintains that the $13.65 per share offered sharply undervalues the company’s long-term prospects.

    “By engaging in the going private transaction now – in the midst of the company’s transition from a PC vendor to full service software and enterprise solution provider – the board is allowing defendants M. Dell and Silver Lake to obtain Dell on the cheap,” read the lawsuit filed by Catherine Christner.

    Dell, the world’s No. 3 personal computer maker, broke down details of the equity and debt financing secured for the buyout in Wednesday’s filing.

    Silver Lake is putting up $1.4 billion, while banks including Bank of America, Barclays, Credit Suisse and RBC will provide roughly $16 billion in term loans and other forms of financing.

    Wednesday’s filing also disclosed that under certain circumstances if the merger cannot be completed, Michael Dell and Silver Lake could have to pay a termination fee of up to $750 million to the company.

    The post Reuters – Michael Dell Ponies Up $750M in Cash for Deal appeared first on peHUB.

  • Reuters – Kazakhstan Sovereign Wealth Fund Buys Stake in Kazzinc

    Kazakhstan’s sovereign wealth fund, Samruk-Kazyna, has acquired a 29 percent stake in Glencore-controlled zinc producer Kazzinc, the fund’s deputy head said on Thursday, without disclosing the price, Reuters reported. Glencore, which owns 69.61 percent in Kazzinc, had earlier said it intended to boost its stake in the company to 93 percent for a total of $3.2 billion, including $2.2 billion in cash and $1 billion in equity.

    “The Kazzinc deal is closed, and today we own 29 percent in this enterprise,” Kuandyk Bishimbayev told reporters. “These were borrowed funds,” he added without giving further detail.

    He said the shares had been bought from Kazakh company Verny Capital.

  • Reuters – Apollo-Backed Evertec Files for IPO

    Apollo Global Management LLC-backed payment processor Evertec Inc. filed with U.S. regulators to raise up to $100 million in an initial public offering of its common shares. In a regulatory filing with the U.S. Securities and Exchange Commission, the San Juan, Puerto Rico-based company named Goldman Sachs and JPMorgan Securities as the underwriters to the offering.

The company did not reveal the number of shares it planned to sell or their expected price, and it has yet to decide which stock exchange it will list on.

    Apollo Global Management, which acquired Evertec from Puerto Rican lender Popular Inc in September 2010, owns a 51 percent stake in the company. Popular retains 49 percent and is Evertec’s largest customer.

    The company processes over 1.2 billion transactions annually, and manages the electronic payment network for over 4,900 automated teller machines and over 107,000 point-of-sale payment terminals, Evertec said in the filing.

    The company earned $3.8 million on revenue of $250.7 million in the nine months ended Sept. 30.

    The amount of money a company says it plans to raise in its first IPO filing is used to calculate registration fees. The final size of the IPO could be different. (Reporting by Ashutosh Pandey in Bangalore; Editing by Maju Samuel)

  • Microsoft launches ‘Don’t Get Scroogled by Gmail’ campaign to stop Google ‘going through personal emails’

Microsoft’s efforts to downplay Google’s Gmail over its own Outlook.com service are well known amongst the tech crowd. In late-November the Redmond, Wash.-based corporation claimed that a third of new Outlook.com signups were people switching from Google’s email service, and after the web giant dropped support for EAS, Microsoft quickly advised Gmail users to make the same switch. Now Microsoft is at it again, launching a new crusade titled “Don’t Get Scroogled by Gmail”.

The purpose of the campaign, according to the software firm, is to “educate consumers that Google goes through their personal emails to sell ads”. Don’t Get Scroogled by Gmail is aimed at American Gmail users and is supported by a GfK Roper study, commissioned by Microsoft, which found that “70 percent of consumers don’t know that major email providers routinely engage in the practice of reading through their personal email to sell ads”; a vast majority of people, 88 percent, disapproved of the practice once it was brought to their attention.

Fair enough, but you might ask: why target only Gmail? Surely Microsoft could have taken various other email services to task as well. The fact is, this is not a “for the greater good” plan but a targeted effort to save Gmail users from the “evil” Google by getting them to switch to Outlook.com.

Microsoft’s senior director of Online Services says: “Emails are personal — and people feel that reading through their emails to sell ads is out of bounds. We honor the privacy of our Outlook.com users, and we are concerned that Google violates that privacy every time an Outlook.com user exchanges messages with someone on Gmail. This campaign is as much about protecting Outlook.com users from Gmail as it is about making sure Gmail users know what Google’s doing.”

    Microsoft has a solution to this problem, as you might imagine, and has launched a petition on Scroogled.com to help “consumers have their voices heard” and “tell Google to stop going through their emails to sell ads”. Microsoft also states that Gmail users should “prioritize their privacy by switching to Outlook.com”.

    But what the Redmond, Wash.-based corporation seems to fail to realize is that being a Gmail user is both a choice and a necessity.

    In my case I use various other Google services connected to my Gmail account and I’m quite sure millions of other people are in a similar position. If I were to, hypothetically, switch to Outlook.com I’d still have to use my Gmail account to log into Google+ and check for notification emails, for instance. At the same time using Gmail with all the targeted ads is a personal choice, one that I favor over Outlook.com when it comes to features and adjacent functionality.

    I can also chat with my Google+ friends straight from Gmail, whereas I’d be stuck with Facebook Messenger on Outlook.com. By implication I’d be an even more active Facebook user and we all know how much Zuckerberg’s social network values our privacy. To me that’s a huge no-go.

  • Hewlett-Packard and the Art of Finance

    Sometimes big companies engage in transactions that make you wonder what on earth they were thinking. Hewlett-Packard’s 2011 acquisition of the British software company Autonomy certainly falls in that category. Even at the time, news of the acquisition contributed to a 20% drop in HP’s share price. And last fall HP wrote down the value of the acquired company by $8.8 billion, blaming $5 billion of the writedown on improper accounting.

    Right now, regulators are investigating whether Autonomy actually engaged in improper accounting. But what’s interesting is that almost no one raised a red flag at the time. KPMG, the big accounting firm, was helping HP with its due diligence by inspecting Autonomy’s books. The accountants apparently gave the company a clean financial bill of health, even though there had been rumors of accounting improprieties.

    It’s all an astonishing story — except that similar things happen on a lesser scale all the time. Investors and lenders find themselves misled by companies’ financials. Acquirers pay too much for a target company because they have misread or misinterpreted the books. Sometimes the problem is outright fraud. More often it’s a problem related to what we call the Art of Finance.

    The idea of finance as an art sometimes puzzles people. Financial reporting is about numbers, they assume, and numbers are either accurate or they are not. Financial numbers, though, are different. Many of the entries on a company’s income statement and balance sheet reflect estimates, assumptions, and procedural rules. Companies can treat these estimates, assumptions, and rules very differently, and so can wind up with wholly different accounting numbers for similar sets of transactions.

    For example, imagine that two companies are launching new lines of medical imaging equipment, complete with maintenance and service contracts. One company might decide that most of its costs are incurred in manufacturing and so will “recognize” most of the revenue from a sale — that is, record the revenue on its books — as soon as the equipment is delivered. The other might want to allow for major costs on the service end, and so will wait until the service contract expires to record much of the revenue. Result: the two companies’ books would look quite different, even though their business was very similar. As long as both companies are consistent, it is all quite legal.
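To make the divergence concrete, here is a toy sketch in Python; the dollar figures and recognition percentages are invented for illustration, not taken from any actual filing:

```python
# Two hypothetical companies sell the same $1M imaging system bundled
# with a service contract, but follow different revenue-recognition policies.
sale_price = 1_000_000

# Company A attributes most cost (and so most revenue) to manufacturing,
# recognizing 80% of the sale as soon as the equipment is delivered.
year_one_revenue_a = 0.80 * sale_price

# Company B allows for major costs on the service end, recognizing only
# 30% upfront and deferring the rest until the service contract expires.
year_one_revenue_b = 0.30 * sale_price

print(year_one_revenue_a)  # 800000.0
print(year_one_revenue_b)  # 300000.0
# Same transaction, same economics – yet year-one revenue differs by $500,000.
```

Both sets of books can be perfectly legal; the gap comes entirely from the estimates and assumptions each company chose.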

    Autonomy, by many reports, engaged in “aggressive” accounting, which means that it used the estimates, assumptions, and rules to make its books look as good as possible. HP and KPMG were remiss in not tracking down exactly how aggressive its procedures were. But Autonomy is hardly alone in this practice; many companies treat their financial statements in this way. Former Sunbeam CEO “Chainsaw” Al Dunlap was even said to view his finance department as a profit center. By manipulating the books, the company could polish up its results until they glistened.

    When you look at a company’s books, you’re really looking at the integrity of the executive team responsible for them. You would want to check out their past practices and reputations, and you would want to sit down and go over the estimates, assumptions, and rules in great detail. But HP was so eager to get the transaction done that it failed to examine the financials with sufficient care.

    Why so? According to a recent Wall Street Journal article, then-CEO Leo Apotheker was the deal’s chief advocate, and lobbied the board hard to approve it. The late Peter Drucker may have the best explanation. “I will tell you a secret,” said Drucker: “Dealmaking beats working. Dealmaking is exciting and fun, and working is grubby. Running anything is primarily an enormous amount of grubby detail work — dealmaking is romantic, sexy. That’s why you have deals that make no sense.”

    Apotheker is long gone from HP. But the company, unfortunately, is paying a steep price — $8.8 billion in writeoffs — for this senseless deal.

  • 7 major energy trends to watch for in 2013, via DOE bigwig David Sandalow

David Sandalow, the acting U.S. Under Secretary of Energy, says the Department of Energy’s programs to invest in energy innovation are about “trying to replicate the rate of IT innovation for energy.” He made the remarks at the Cleantech Investor Summit on Wednesday to a few hundred entrepreneurs and investors who no doubt wished the technologies they’ve been supporting would get cheaper and more powerful at the same rate as Moore’s Law.

Alas, the cleantech sector has yet to see its own Moore’s Law, though the closest might be that prices of solar cells and panels have dropped dramatically over the past 18 months. But even if Sandalow couldn’t promise a Moore’s Law for energy, he laid out some of the most important trends that the DOE is paying very close attention to in the energy sector in 2013.

1). Grid resiliency and modernization: Both the Super Bowl blackout and Hurricane Katrina have highlighted how important it is to make the grid much more resilient to blackouts as well as cyber events. The threat of cyber attacks “is real,” and it will mostly be the private sector that leads the response, said Sandalow. A much more robust grid will also be needed as utilities add more clean power, like wind and solar.

2). Low-cost natural gas: Cheap U.S. natural gas, which has emerged through horizontal drilling and hydraulic fracturing, is the “hottest topic in the energy area,” said Sandalow. The DOE is accepting comments right now on whether the U.S. should export liquefied natural gas; there are 16 applications from companies that want to export it, said Sandalow.

    3). The dropping cost of solar: The DOE has its SunShot program, which looks to lower the cost of solar panels, but there’s a lot more work left to do. Germany has a 50 percent lower cost to install solar panels because it removed a lot of the red tape, said Sandalow. He explained, “I want to know when solar will become viral in the way that cell phones did, and what will it take, energy storage?”

4). Electric vehicles: The DOE has done a lot of work with electric vehicles and charging infrastructure, and plans to do a lot more, said Sandalow.

    5). High performance computing and big data: The trend of big data analytics and the most powerful computers in the world will no doubt help crack the problems with energy innovation. They are already being used heavily in the energy and climate change monitoring sectors.

6). Clean energy financing: There’s a lot more work to be done to finance clean power projects, though some milestones were passed recently, like the tax credits for wind projects. For startups, clean power financing is actually a pretty hot area for investment.

    7). China: Sandalow says he’s been to China 13 times while he’s been in office. The relationship between the U.S. and China over energy has at times been challenging, says Sandalow, but the trend of Chinese investments being made into cleantech companies in the U.S. is really interesting, and “I expect to see more of it.”


  • OUYA Android Game Console To Get Annual Hardware Updates, Founder Says

The OUYA Android-based gaming console will get hardware refreshes on an annual basis, founder and CEO Julie Uhrman revealed in an interview with Engadget. Uhrman was at DICE, an annual summit that focuses on video games, where she also announced new game publisher partners for the OUYA platform. The refresh cycle will more closely resemble those of smartphones than those of traditional consoles, which generally enjoy multi-year lifespans, sometimes extending into double digits.

“There will be a new OUYA every year. There will be an OUYA 2 and an OUYA 3,” Uhrman told Engadget. That’s a pretty bold declaration of intent from a company that, while immensely successful in its Kickstarter crowdfunding campaign, has yet to ship production-ready OUYA 1 devices to the general public, though it has already secured retail partners.

There are a few reasons why current big-name consoles have the long lives that they do. A lot of money goes into their initial development, for one, meaning that manufacturers like Sony and Microsoft often sell them at a loss for years before they begin to turn a profit on hardware. And there are advantages to this model for the consumer, too: users don’t have to worry about their hardware and software library becoming obsolete all that quickly when there’s a dependable, multi-year upgrade cycle.

Uhrman explains that all games on OUYA will be backwards compatible, at least insofar as they’ll be tied to user accounts independent of hardware rather than linked to the hardware itself. All-digital delivery makes this easier to accomplish, since there are no messy disc formats to worry about.

    Plans for future versions of the console include faster processors, and potentially expanding storage beyond the current 8 GB included. At CES this year, Qualcomm and Nvidia both unveiled next-gen processors, so those are likely candidates for future updates, since the emphasis will be on eking out as much graphics performance as possible from the diminutive OUYA box. The current generation OUYA, when it ships, will have a Tegra 3 on board running at 1.6GHz, which should serve it well, at least compared to current generation smartphones.

While it’s somewhat refreshing to see a consumer electronics maker talk in concrete terms about its future product pipeline, you have to wonder whether it’s the right move. Uhrman is now essentially committed to an annual update cycle, which puts pressure on the company to deliver, and which also means consumers are well aware that if they just wait a little while, they can get hardware with better specs. Plus, if the market turns out to be competitive, there’s no mystery for potential rivals about what the upgrade strategy is.

It’s not necessarily surprising that OUYA wants to update annually; the platform it’s creating is well-suited to a frequent update cycle, and that could be one good way to make inroads against the major players, whose hardware remains relatively constant for around a decade. But whatever the company’s plans at this point, it still has to ship and then win over consumers before it can put any of them into action.

  • More bad news about broadband caps: Many meters are inaccurate

For the 64 percent of Americans whose internet service provider imposes a broadband cap, and for those lucky enough to have a meter, I have some bad news. The president of the firm that audits many of the country’s broadband meters says he can’t certify the measurements produced by five out of seven of his clients’ meters, because they don’t count your bits correctly.

    Peter Sevcik, president of NetForecast, told GigaOM that seven clients have hired his firm to audit their broadband meters over the last few years, but of those seven only one — Comcast — has published a report on the NetForecast certification. Sevcik is only willing to certify one other client in a public report.

    Meters are a black box

The other five clients — which Sevcik would not name — have meters that Sevcik views as inaccurate, although not all of them have publicly rolled out their meters, and not all of those clients impose a broadband cap. Sevcik usually expects meter accuracy of plus or minus one percent, but so far these don’t measure up.

“They are wrong by missing numbers by one way or another — sometimes it’s over reporting, but more frequently the error is under reporting,” he said. Under-reporting should be a relief to those facing overage charges or service termination for exceeding their caps, but if the meters aren’t counting the data properly, it is still a problem.

    Also disturbing is the attitude that Sevcik has encountered at some clients with malfunctioning meters. “There’s a general sense by some people, ‘Eh, we under report so we give them a free pass, so why worry about that?’” Sevcik says. “I think one does need to worry because it ruins the overall veracity of the meter. It derails trust in the meter.”

    Broadband caps have grown to cover more Americans. They often come with meters.

    Sevcik wouldn’t name those clients, but his website lists Time Warner Cable, Cox, Comcast, AT&T, Bell Canada, Verizon and France Telecom as customers. Time Warner Cable and Cox have both confirmed to me that they have used NetForecast to certify their meters. AT&T’s spokesman says it has a team of engineers that certifies the accuracy of its meters but that it hasn’t worked with NetForecast to certify its wireline meters. Sevcik clarified that the seven clients he’s speaking of are all U.S.-based and all are testing wireline meters.

    Last November, AT&T customer Ken Stox drew attention to AT&T’s meters when he couldn’t replicate the ISP’s byte count with his own home testing. For Stox, who is technically astute, the questions he had about the meter were less about fairness and more about understanding what, when and how AT&T was counting.

    Building a broadband meter is tough

Those same questions are ones that Sevcik hopes ISPs will answer as part of an overall effort to improve their meters. He notes that whatever you think about the fairness of data caps, if meters are to serve some kind of public purpose, the public has to understand what the ISPs are counting and how they are counting it.

As for the problems that lead to inaccurate meters, there are several. The first is that many of these meters are bolt-on afterthoughts. A telco or a cable company often uses measurement gear that sits on the subscriber side of the network. The ISP has to allocate enough resources at that point to track the bits properly, but networks become congested. Then the ISP faces a choice: does it count all the bits and risk slowing down the network, or does it let the bit count slide and wave the rush of packets through?

Most ISPs err on the side of a better user experience and let the packets rush through uncounted. To solve the problem they could dedicate more resources to the counters so they can keep up with peak traffic. More resources would also solve the next problem ISPs face — once they have the bit counts, they need to add them up. As Sevcik describes it, many of these counters drop the bits into an Internet Protocol Detail Record (IPDR) format, and those reports are generated every 15 minutes.

    Spread that across 10 million subscribers with a goal of doing hourly updates, and suddenly you have 40 million records to process in that hour. That takes servers — in some cases more than the ISP anticipated.
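The arithmetic above can be sketched in a couple of lines (the subscriber count and reporting interval are the ones from the example in the text):

```python
# Each subscriber's counter emits one IPDR-style record every 15 minutes.
subscribers = 10_000_000
reports_per_hour = 60 // 15   # 4 records per subscriber per hour

records_per_hour = subscribers * reports_per_hour
print(records_per_hour)  # 40000000 – 40 million records to aggregate each hour
```

At that volume the bottleneck shifts from counting bits on the wire to the back-end servers summing the records, which is exactly the under-provisioning Sevcik describes.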

Sevcik said that while some ISPs had used decimal counting of bytes as opposed to binary counting in the past, most use binary counting today. That’s good, because a binary count adds about 7 percent to the total number of bytes. But as Sevcik points out, if a consumer streams three HD movies on a Saturday night and expects to see that jump in data consumed on his usage meter, then it needs to be there — or the consumer needs to know why.
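Presumably the roughly 7 percent figure comes from the gap between a binary gigabyte (2^30 bytes) and a decimal gigabyte (10^9 bytes); a quick check:

```python
binary_gb = 2 ** 30    # 1 GiB = 1,073,741,824 bytes
decimal_gb = 10 ** 9   # 1 GB  = 1,000,000,000 bytes

# The same byte total reads about 7.4% differently depending on
# which unit the meter (or the consumer) is thinking in.
discrepancy = binary_gb / decimal_gb - 1
print(f"{discrepancy:.1%}")  # 7.4%
```

So even a perfectly functioning counter can look "wrong" to a subscriber if the ISP never says which kind of gigabyte the meter reports.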

    Are meters worth it?

    It’s clear that building a meter takes work. A Time Warner Cable spokesman notes that development of its meter took several years — other ISPs said it took at least a year of effort from multiple engineering teams.

    My broadband consumption courtesy of Time Warner Cable. Not sure how I consumed 44GB in only 6 days.

If building a meter is so much work and consumes so many resources, why have them? Comcast, which delayed the rollout of its meter while it struggled to get the numbers right (and still runs monthly accuracy checks), has defended its meter as a customer education tool and as a means to manage network consumption. Critics point out that at 300 GB, Comcast’s cap is suspiciously close to the 288 GB figure that Comcast has said would be the amount of data consumed by someone using their broadband to replace cable. Those critics generally call caps a way for ISPs to protect their pay-TV businesses.

What we do know is that Comcast has spent a lot of money and effort making sure its meter is accurate, because, as Comcast spokesman Charlie Douglas notes, “We knew it would be in the spotlight.” I imagine it is also because Comcast knew a meter would be the first step in changing the pricing dynamic from all-you-can-eat to something a little more metered. And as I have pointed out in previous articles, if meters become the basis for charging subscribers overage fees or even terminating their service, then someone needs to monitor those meters to ensure they are accurate.

I’ve called on the FCC to wake up and start gathering more data on how meters affect consumers and whether or not they are accurate, but the agency has so far been content to let this experiment in caps play out without much oversight. With these findings, maybe the FCC will finally step up. Clearly, as a country, we’re moving toward capped and metered broadband.

    Sevcik, whose experience goes back to the days of the ARPANET and the first routing systems, believes if that’s the case, then those meters should be accurate.

“I’ve been in the internet business for quite some time … and in that time I’ve had my hand in the design of more than 100 networks and seen a lot in network technology. And what I’ve realized is, as the industry has matured, there is an awful lot of talk and decisions made by people — consumers, policy advocates in DC and big companies — that is often based on hype,” he said. “My goal, in a small way in this world of hype, is to shed a light of real data and make a little piece of it really right.”

In short: if we’re going to accept meters on our broadband, then let’s make sure they are accurate.


  • Scarab Darkroom lets you view and edit RAW images

    Take a photo with most digital cameras and by default you’ll get a JPG file, which is great for compatibility purposes, but does involve some compromises in image quality. And that’s because your picture will go through various processes before the final JPG is produced — sharpening, adjusting colors and contrast, compressing the results — and each step results in the loss of some information.

Taking pictures in a camera’s RAW format, though (if it has one), will give you access to the full, unprocessed image data, and you can then apply any tweaks you like on a case-by-case basis for the best possible results. You’ll probably need a specialist tool to access the RAW images, but that may not be a problem: Scarab Darkroom, for instance, is a very capable RAW converter with support for cameras from Canon, Nikon, Olympus, Panasonic, Pentax, Samsung, and Sony, and you can download and use it entirely for free.

After a quick and easy installation, the program presents a straightforward interface. A tabbed sidebar on the right displays the drives and folders on your PC; navigate to a folder containing your RAW images and their thumbnails will appear at the bottom of the program window, making it easy to spot and view whatever picture you need.

    There are a few small buttons above the thumbnails with various viewing-related options: rotate, crop, zoom and so on. And you can also drag the image with the mouse, and zoom in and out with the mouse wheel, so even if you’re viewing a very high resolution photo, it’s quick and easy to find and examine a particular detail.

    Life gets more interesting when you click the Adjustments tab, though, where Scarab Darkroom provides tweaks for Exposure (Brightness, Contrast, Recovery, Blacks, Fill Light), Colors (Temperature, Tint, Hue, Saturation, Vibrance), Tone Curve (Highlights, Midtones, Shadows) and Sharpness. Drag a particular slider and the picture will update accordingly, giving you immediate feedback. And it’s easy to copy your settings to the clipboard, and restore them later, so once you’ve found a configuration which delivers good results then you can quickly apply it to all your other shots.

    There’s also a Metadata tab, although this is relatively basic by comparison. It displays a few of the key image details — exposure, aperture, focal length, ISO speed, flash used, date taken, camera, owner — and allows you to set the image rating, but that’s about it.

    And when you’re happy, and it’s finally time to abandon RAW for an image format you can actually use elsewhere, then the program can save your pictures as JPG or TIF files.

    There are a few small gaps in functionality here, mostly because this is the free version of the program. The developer is currently working on a commercial build, and so extras like noise filtering are going to be reserved for that edition.

    It seems unreasonable to complain, though, because otherwise Scarab Darkroom is an excellent tool. There’s no adware, no marketing annoyances; it’s fast, easy to use, supports a lot of RAW formats and is still being regularly updated to add more. Go grab a copy immediately.

    Photo Credit: diez artwork / Shutterstock

  • GIMP 2.8.4 adds improvements and polish to the popular image editor

    Popular open-source image editor GIMP 2.8.4 FINAL has been released for Linux and Windows, with a Mac binary build due for release imminently. Version 2.8.4 is a minor stability release, but does contain a number of interesting improvements, including more responsive drawing — particularly with the brush outline tool — plus better names for the default filters when saving or exporting.

    GIMP 2.8.4 will also be the second OS X release that runs natively on the Mac — 2.8.0 and earlier required X11, and many improvements in this release are aimed specifically at that platform.

    Aside from the improved responsiveness and filter tweaks, other user interface changes include GIMP now remembering the maximized state of windows across sessions, the ability to use the text tool on an image without layers, and various other fixes for text style attribute handling.

    Plug-in improvements include better default values being set in the Drop Shadow script and a major round of bug fixes for the BMP plug-in.

    Platform-specific improvements concentrate on the OS X platform — the gimpdir has been moved to the user’s Library/Application Support folder, while the system screenshot tool is now used when creating a new image from a screenshot. Plug-in windows should now automatically appear on top, and users can now select their chosen language via GIMP’s Preferences dialog.

    The Windows installer also gains a Brazilian Portuguese translation.

    The stability update builds on the major 2.8.0 release from last year, which saw GIMP radically overhaul its user interface with such elements as a new ‘single-window’ mode, multi-column Dock windows, a brand new Cage Transform tool and the ability to organise layers into groups.

    GIMP 2.8.4 FINAL is available now as a free, open-source download for Windows, Linux and — coming soon — Mac OS X.

     

  • DataXu Raises $27M From Thomvest, Flybridge, Atlas, Menlo

    DataXu said it raised $27 million in a round of financing led by Thomvest Ventures and joined by existing investors Atlas Venture, Flybridge Capital Partners and Menlo Ventures. Thomvest Managing Director Stefan Clulow joins DataXu’s board. Thomvest Ventures is a venture capital group headed by Peter Thomson, a director of Thomson Reuters, publisher of this blog.

    PRESS RELEASE

    DataXu Closes $27 Million of Growth Capital to Solidify its Lead
    in Programmatic Marketing

    Thomvest Leads Financing In New Transformational Software Market

    Boston – February 7, 2013 – DataXu, a provider of digital marketing software for the enterprise, today announced that it has closed $27 million in funding. This new round of financing will reinforce the company’s leadership position, support ongoing global expansion and accelerate additional investment in technology that keeps its customers at the forefront of innovation.

    The investment was led by Thomvest Ventures, a venture capital group headed by Founder and Managing Director Peter Thomson, a director of Thomson Reuters. All three current major investors – Atlas Venture, Flybridge Capital Partners and Menlo Ventures – participated in the round.  Stefan Clulow, Managing Director of Thomvest, joins DataXu’s board of directors with this investment.

    “DataXu is using data science to redefine the way marketers build brands in a digital world,” said Mike Baker, CEO of DataXu. “In the era of digital and big data, it’s increasingly obvious that legacy ways of working in the advertising industry – including faith-based thinking, lack of transparency, and an inefficient supply chain – no longer make sense. Our software helps marketers reduce the cost and complexity of marketing across multiple digital channels, while at the same time enabling them to build a much deeper and more actionable understanding of today’s empowered consumer. This is the catalyst for our record-breaking growth.”

    This funding round comes on the heels of a number of milestones:
    ·       Created a new software category – Programmatic Marketing. (LINK)
    ·       Grew direct-to-enterprise platform sales revenue more than 700% year over year, now comprising the majority of the company’s revenue.
    ·       Introduced the first multi-channel Demand Side Platform to include display, tablet, smartphone, video and Facebook Exchange, with 80 percent of revenue coming from customers running multichannel.
    ·       Tripled staff and customer base, and increased the company’s global footprint with 11 offices in eight countries.
    ·       Developed the only real-time multivariate decision system that “learns” how consumers engage across channels and manages media investments for optimal return.(LINK)
    ·       Extended the DataXu platform to optimize media investments in real time based on consumer sentiment shift (LINK) and beyond exchanges on guaranteed media investments. (LINK)
    ·       Launched a suite of integrated analytics for customer intelligence and media investment analysis.  (LINK)
    ·       Founded the OpenRTB consortium whose standard for programmatic media was adopted by the IAB and is in use in more than 80 companies.

    “Our strategy is to find and partner with entrepreneurs building transformational companies,” said Peter Thomson. “The rise of programmatic marketing is reminiscent of how automation and software also revolutionized the financial services industry. We believe DataXu is in the vanguard of an even bigger opportunity and on a trajectory to become the next great enterprise software company.”

    About DataXu

    DataXu is transforming the way companies build their brands in a digital world through the industry’s only fully integrated programmatic marketing solution. The DataXu Platform offers cloud-based software that leverages data and analytics to help enterprise marketers better understand and engage consumers, and optimally manage marketing investments for more efficient and effective customer acquisition strategies. With 11 offices in eight countries, DataXu services more than 700 brands across the globe. For more information, visit www.dataxu.com or follow us at twitter.com/dataxu.

    About Thomvest

    Thomvest is a venture capital firm committed to the success of our entrepreneur partners. We primarily focus on investments in areas where we have deep expertise and experience, including software, technology-enabled services, and hardware businesses. The capital we invest is our own, enabling us to be more creative, flexible and patient than most venture investors. It takes time to build great companies and we’re committed to supporting our entrepreneurs throughout their journey. That’s why more than two-thirds of the companies that we have funded in the last decade have either gone public, been acquired, or continue to grow as independent businesses. To learn more about Thomvest, please visit www.thomvest.com.

    The post DataXu Raises $27M From Thomvest, Flybridge, Atlas, Menlo appeared first on peHUB.

  • Employees frequently steal (and use) confidential data when switching jobs

    According to Symantec, businesses are increasingly at risk of insider IP theft, with staff moving, sharing and exposing sensitive data on a daily basis and, worse still, taking confidential information with them when they change employers.

    A new survey conducted by The Ponemon Institute, and based on responses from 3,317 individuals in the United States, United Kingdom, France, Brazil, China and Korea, shows that half of employees admit to taking corporate data when they leave a job, with 40 percent saying they intend to use the data in their new position.

    In a lot of cases, their actions aren’t intended as malicious, Symantec says. They often just don’t know it’s wrong to take data like this, and frequently fail to understand who owns the IP in the first place (many employees wrongly attribute ownership to the person who created it).

    The survey also found that 56 percent of employees don’t think it’s a crime to use trade secrets taken from a previous employer and 62 percent think “it’s acceptable to transfer corporate data to their personal computers, tablets, smartphones and cloud file-sharing apps”. Once the data is there, it’s rarely deleted.

    It’s a big problem, because as Symantec explains, it “means valuable intelligence is falling into the hands of competitors. Ultimately, this puts everyone at risk — the employee who takes the IP, the organization that invested in it and the new employer who unwittingly receives it. Everyone can be held accountable, and no one wins”.

    Based on the survey results Symantec recommends organizations take action in three ways — educate their employees about IP theft, enforce non-disclosure agreements (NDAs), and monitor inappropriate access and use of IP (something which is often easier said than done).

    Symantec’s complete report, What’s Yours Is Mine: How Employees are Putting Your Intellectual Property at Risk, is available for download.

    Photo Credit: BruceParrott/Shutterstock

  • Clearing a way to Russian bonds

    Russian debt finally became Euroclearable today.

    What that means is that foreign investors buying Russian domestic rouble bonds will be able to process them through Belgian clearing house Euroclear, which transfers securities from the seller’s securities account to the buyer’s, while moving cash from the buyer’s account to the seller’s. Euroclear’s links with correspondent banks in more than 40 countries mean buying Russian bonds suddenly becomes easier. And safer too, in theory, because title to the security receives asset protection under Belgian law. That should bring a massive torrent of cash into the OFZs, as Russian rouble government bonds are known.

    In a wide-ranging note entitled “License to Clear” sent yesterday, Barclays reckons previous predictions of some $20 billion in inflows from overseas to OFZs could be understated — it now estimates that $25 billion to $40 billion could flow into Russian OFZs during 2013-2014. Around $9 billion already came last year ahead of the actual move, Barclays analysts say, but more conservative asset managers will have waited for the Euroclear signal before actually committing cash.

    Foreigners’ increased interest will have several consequences. Their share of Russian local bond markets, currently only 14 percent, should go up. The inflows are also likely to significantly drive down yields, cutting borrowing costs for the sovereign and, ultimately, corporates. Already, falling OFZ yields have been driving local bank investment out of that market and into corporate bonds (Barclays estimates their share of the OFZ market has dropped more than 15 percentage points since early 2011). And the increased foreign inflows should act as a catalyst for rouble appreciation.

    Each of these points in a bit more detail:

    a) Foreigners’ share of the Russian bond market is among the lowest of major emerging markets.  Compare that to Hungary, where non-residents own over 40 percent, or South Africa and Mexico, where foreigners’ share of local paper is over 30 percent.

    b) Foreign buying last year compressed Russian yields sharply, eventually pushing down 10-year yields by 130 basis points over the year as foreigners moved further along an increasingly flattening curve.  But Russian 10-year yields around 6.5 percent will remain attractive to foreigners, comparing favourably with most other emerging markets.  And at home, falling government bond yields will benefit the economy as a whole as local banks change their focus. Barclays write:

    (Falling yield) provides cushion to the (finance ministry) which plans to borrow 1.2 trillion roubles internally, versus 0.9 trillion last year. This also accommodates the reallocation of Russian banks’ portfolios from OFZ into retail lending and corporate and municipal bonds driven by higher returns.

    c) Barclays advises clients to buy 10-year OFZs in anticipation of further gains, and suggests doing this on an unhedged basis to take advantage of potential rouble appreciation.  While the rouble has outperformed other emerging currencies in the past year, Barclays expects the outperformance to continue.

     

  • Reuters – Blackstone Buys Indian Business Park

    U.S. private equity firm Blackstone Group, along with two other companies, has agreed to buy a business park in south India for 19.5 billion rupees ($367 million), two sources with direct knowledge told Reuters. The deal, which is expected to be concluded within two to three months according to the sources, would be the largest private equity investment by value in India’s real estate sector since 2008.

    (Reuters) – U.S. private equity firm Blackstone Group (BX.N), along with two other companies, has agreed to buy a business park in south India for 19.5 billion rupees ($367 million), two sources with direct knowledge told Reuters.

    The deal, which is expected to be concluded within two to three months according to the sources, would be the largest private equity investment by value in India’s real estate sector since 2008.

    Blackstone, a property fund founded by Housing Development Finance Corporation (HDFC.NS), and unlisted real estate developer Embassy Group plan to invest equal amounts to buy Vrindavan Tech Village, a special economic zone on the outskirts of Bangalore in the southern state of Karnataka, one source said on condition of anonymity as the deal is not yet finalized.

    The facility, built by Singapore-based developer Assetz Property Group, is spread across 106 acres of which about 20 acres have been developed into 1.9 million square feet of offices occupied by companies that include Cisco (CSCO.O), Sony Corp (6758.T) and Nokia (NOK1V.HE).

    Of the remaining land, Embassy plans to build homes on 30 acres and about 5 million to 6 million square feet of offices on the rest, said the source.

    Real estate made up about a quarter of Blackstone’s total global assets under management of $210 billion at the end of December, and is its most profitable business.

    In India, Blackstone has invested nearly $600 million in commercial assets over the past two years, making it one of the largest private equity investors in the country.

    Blackstone, Embassy and Assetz declined to comment. HDFC did not respond to messages. ($1 = 53.1250 Indian rupees)

    (Editing by Ranjit Gangadharan)

    The post Reuters – Blackstone Buys Indian Business Park appeared first on peHUB.

  • Wunderlist 2 arrives on Android tablets, with iPad version hot on its heels

    Wunderlist 2, the newly native redesign of 6Wunderkinder’s popular task management app, is finally hitting tablets, more than a month after it became available for smartphones.

    Wunderlist was originally designed cross-platform using Titanium, but the need to create native versions became a focus last year for Berlin-based 6Wunderkinder, leading the company to abandon its second product, Wunderkit.

    Unusually, the first tablet-optimized native version to be released is for Android. Wanna see some frustrated Apple users? Check out the comment thread in the release blog post.

    But those with iPads needn’t fear. Fact is, both the Android and iPad versions were developed and finished at the same time. As CEO Christian Reber tweeted today, the iPad version is simply somewhere in Apple’s approval process. So, while those users won’t get to take advantage of Android-specific features such as homescreen task widgets and inter-app sharing, they should be catered for soon enough.


  • U.S. Nonfuel Mineral Production Increases for Third Straight Year

    Nonfuel mineral production values increased in the United States for the third consecutive year, up $1.7 billion since 2011, the U.S. Geological Survey announced today in its Mineral Commodity Summaries 2013. 

    The estimated value of mineral raw materials produced at mines in the United States in 2012 was $76.5 billion, a slight increase from $74.8 billion in 2011. Net exports of mineral raw materials and old scrap contributed an additional $21 billion to the U.S. economy.

    The annual report from the USGS National Minerals Information Center is the earliest comprehensive source of 2012 mineral production data for the world. It includes statistics on about 90 mineral commodities essential to the U.S. economy and national security, and addresses events, trends, and issues in the domestic and international minerals industries.

    “Minerals are the raw materials for construction, manufacturing, high technology, new industries, jobs, and ultimately economic expansion,” said USGS Director Marcia McNutt. “These summaries are where Geology meets Economics, to create the complex tapestry of variations in mineral production over time and space.”

    The United States continues to rely on foreign sources for raw and processed mineral materials but, for the first time since 2002, the United States was not 100% import reliant for rare earths as rare earth mining resumed at Mountain Pass, California.

    Minerals remained fundamental to the U.S. economy, contributing to the real gross domestic product (GDP) at several levels, including mining, processing, and manufacturing finished products. Minerals’ contribution to the GDP increased for the second consecutive year.

    “Decision makers and policy makers in the private and public sectors rely on the Mineral Commodity Summaries and other USGS minerals information publications as consistent and unbiased sources of information to make business decisions and national policy,” said John DeYoung, Director of the USGS National Minerals Information Center.

    Production and prices increased for most industrial mineral commodities mined in the United States in 2012, but production and prices for nearly all metals declined. Industrial mineral commodities include things like limestone, silica, sand and gravel, and are used for industrial purposes like building and road construction, plastics, glass, and paper.

    Domestic raw materials and domestically recycled materials were used to process mineral materials worth $704 billion. These mineral materials, including aluminum, brick, copper, fertilizers, and steel, and net imports of processed materials (worth about $27 billion) were, in turn, consumed by downstream industries with a value added of an estimated $2.4 trillion in 2012.

    The construction industry began to show signs of improvement during 2012, with increased production and consumption of cement, construction sand and gravel, and gypsum, mineral commodities that are used almost exclusively in construction. Crushed stone production, however, continued to decline.

    The nonmetallic mineral products industry was boosted by the rebound in construction activity in 2012, with more than half of its output going to the construction sector. The recovery in the U.S. housing industry is fueling demand for industrial minerals and products.

    Mine production of 15 mineral commodities was worth more than $1 billion each in the United States in 2012. These were, in decreasing order of value, gold, crushed stone, copper, cement, construction sand and gravel, iron ore (shipped), molybdenum concentrates, phosphate rock, lime, industrial sand and gravel, soda ash, clays (all types), salt, zinc, and silver.

    Eleven states each produced more than $2 billion worth of nonfuel mineral commodities in 2012. These states include Alaska, Arizona, California, Florida, Michigan, Minnesota, Missouri, Nevada, Texas, Utah and Wyoming. Nevada produced the largest value at $11.2 billion. The mineral production of these states accounted for 64 percent of the U.S. total output value.

    The USGS Mineral Resources Program delivers unbiased science and information to understand mineral resource potential, production, consumption, and how minerals interact with the environment. The USGS National Minerals Information Center collects, analyzes, and disseminates current information on the supply of and the demand for minerals and materials in the United States and about 180 other countries.

    The USGS report “Mineral Commodity Summaries 2013” is available online. Hardcopies will be available in February from the Government Printing Office, Superintendent of Documents. For ordering information, please call (202) 512-1800 or (866) 512-1800 or go online.

    For more information on this report and individual mineral commodities, please visit the USGS National Minerals Information Center.

  • Updated evasi0n iOS 6.x jailbreak now available

    Three days ago evad3rs released the first public iOS 6 jailbreak tool, opening up iPads, iPhones and iPod touch devices to the world of underground modding. But as is the case with most early jailbreak-related releases, it also brought along a series of bugs, which the team behind the project now claims to have fixed in the latest update.

    On Twitter, planetbeing, one of the three members behind evad3rs, announced the release of evasi0n 1.1. The second iteration of the popular jailbreaking tool brings along “the latest fixes”, which are supposed to sort the Weather app and “long boot” time issues. The latter problem is also referred to by the team as the “reboots getting stuck” bug.

    planetbeing also says that users may have to clear the browser cache in order to “see” the evasi0n 1.1 tool on the website. For those who already run jailbroken devices, installing the latest release is not necessary, as the same result can be achieved by using Cydia to update the existing packages.

    First-time users may want to download the evasi0n 1.1 jailbreak tool once it can be “seen” on evad3rs’ website. Since only the links for the original release are currently available, impatient users might find installing the first release and then updating via Cydia a quicker “fix” for their incurable modding needs. Keep in mind that it’s advised to perform a backup before jailbreaking the device.

    Photo Credit: Sura Nualpradid/Shutterstock

  • Aviva Strengthens Leadership Team

    Aviva has made a number of appointments to strengthen its leadership team. David McMillan has been appointed CEO of Aviva Europe. Nick Amin is joining Aviva as group transformation director and Jason Windsor will join the group executive as chief strategy and development officer.

    PRESS RELEASE

    As Aviva moves into the next stage of its transformation, Mark Wilson, Group Chief Executive Officer, has made a number of appointments to strengthen his leadership team. These changes have three clear aims: first, to ensure Aviva has strong business leaders in all its key markets; second, to drive outstanding execution; and third, to enhance some of Aviva’s core insurance centres of excellence globally.

    David McMillan has been appointed CEO of Aviva Europe. David will be the member of the Group Executive accountable for our businesses in Spain, Italy, Turkey, Poland, Lithuania and Russia, and he will become Chair of Aviva’s French Board. Most recently he was Group Transformation Director and before that was CEO of Aviva’s UK General Insurance business. This appointment recognises David’s positive impact on the business during his time as Transformation Director.

    Nick Amin is joining Aviva as Group Transformation Director and will become a member of the Group Executive, reporting to Mark Wilson. Nick has a very strong background in driving change across multiple cultures and geographies in the insurance sector, at both Cigna and AIA. Nick was instrumental in the transformation of AIA and in preparing the company for IPO.

    Jason Windsor will join the Group Executive as Chief Strategy and Development Officer. This appointment recognises the success and ability Jason demonstrated over the past 12 months leading Aviva’s strategic review and disposal programme.

    Aviva Investors is a core asset of the Group and Jason Windsor will take on the additional executive responsibility of Aviva Investors, reporting to Mark Wilson. Jason will work with Paul Abberley, Aviva Investors’ interim CEO, to ensure the business is positioned to perform to its potential. Pat Regan will continue as Chairman of Aviva Investors in addition to his other responsibilities.

    David Angulo will broaden his remit to take on global responsibility for driving the development of our bancassurance distribution across Aviva. He will work alongside our business leaders to improve value from existing relationships and support the development of new relationships. David will report to Nick Amin.

    These changes take immediate effect, subject to appropriate regulatory approval. They follow the announcement on 28 January 2013 of the appointment of Khor Hock Seng as CEO of Aviva Asia. He will assume the role on 8 March, and his appointment reaffirms our commitment to selected markets in Asia where we can build scale and deliver consistent returns.

    As a result of these changes, Trevor Matthews will not stand for re-election at the 2013 Annual General Meeting and will step down from the Board on the day prior to this year’s AGM. Trevor has broad insurance experience and has added stability to the Group’s developed markets during a period of business change. Trevor will continue in an advisory capacity for a number of months to ensure a smooth transition of his responsibilities.

    Mark Wilson, Chief Executive Officer of Aviva plc, said: “These changes are about ensuring we have the right people in the right jobs and that we have the best possible leadership team so Aviva can achieve its undoubted potential. David McMillan’s and Jason Windsor’s new roles recognise the strength of their achievements during 2012. Nick Amin and Khor Hock Seng are exceptional additions to the team. I would also like to thank Trevor Matthews for his considerable contribution to Aviva and wish him well for his future.”

    Enquiries:

    Media

    Nigel Prideaux +44 (0)20 7662 0215
    Andrew Reid +44 (0)20 7662 3131

    Analysts
    Charles Barrows +44 (0)20 7662 8115
    David Elliot +44 (0)20 7662 8048

    Notes to editors:

    * Nick Amin was previously Executive Vice President and Group Chief Administration Officer at AIA. Nick was responsible for the execution of AIA’s transformation strategy, which resulted in AIA’s successful IPO. Prior to AIA, Nick was President and Chief Operating Officer at Cigna Asia Pacific, where he improved the profitability of the individual country businesses and successfully reduced the expense base.

    * Aviva provides 43 million customers with insurance, savings and investment products.

    * We are the UK’s largest insurer and one of Europe’s leading providers of life and general insurance.

    * We combine strong life insurance, general insurance and asset management businesses under one powerful brand.

    * We are committed to serving our customers well in order to build a stronger, sustainable business, which makes a positive contribution to society, and for which our people are proud to work.

    * The Aviva media centre at www.aviva.com/media includes images, company and product information and a news release archive.

    * For broadcast-standard video, please visit http://www.aviva.com/media/video/.

    * Follow us on twitter: www.twitter.com/avivaplc

    This information is provided by RNS, the company news service from the London Stock Exchange.

    END

    The post Aviva Strengthens Leadership Team appeared first on peHUB.

  • ecoATM Secures Mezz Debt

    San Diego start-up ecoATM has secured $40 million in mezzanine debt financing from Falcon Investment Advisors. To date, ecoATM has 300 kiosks nationally and plans to use this new capital to continue its goal of providing a portable electronics recycling solution across the US.

    PRESS RELEASE

    ecoATM, the award-winning San Diego start-up known for its innovative kiosks that fully automate the buy-back of used consumer electronics, announced today that it has secured $40 million in mezzanine debt financing from Falcon Investment Advisors, LLC. To date, ecoATM has 300 kiosks nationally and plans to use this new capital to continue its goal of providing a convenient portable electronics recycling solution to everyone in America.
    “In 2012, we grew from 45 ecoATMs primarily in California to a network of 300 ecoATMs in 20 states,” said Tom Tullie, chairman and CEO of ecoATM. “There’s still a large percentage of the country that doesn’t have access to a convenient recycling solution for their mobile phones and other personal portable electronic devices. We raised this money to help us deploy ecoATMs nationwide and help people recycle their old phones, tablets, or MP3 players, regardless of where they live.”

    “Falcon is a great partner for ecoATM and their experience helping high-growth companies achieve success makes them the perfect partner for our next phase of expansion,” said Bob Genthert, CFO of ecoATM.

    Currently, most ecoATMs are located in large shopping malls in large cities. This financing will help ecoATM reach smaller cities and other types of high-foot-traffic areas. “We’re excited to be a part of ecoATM’s growth,” said Rafael Fogel, Partner at Falcon Investment Advisors, LLC. “It is not often that an opportunity comes along that combines such a compelling investment thesis with an invitation to consumers to help heal our planet.”

    Previous investors include Claremont Creek Ventures and Coinstar, Inc.

    The new capital has already allowed ecoATM to add to its kiosks the ability to accept tablets, in addition to cell phones and MP3 players. Since being founded in 2008, ecoATM has paid out millions of dollars to hundreds of thousands of customers and in the process saved landfills from hundreds of thousands of potentially toxic devices. ecoATM finds a second life for 60 percent of the devices it collects and responsibly recycles the rest. ecoATM is an R2-certified ewaste recycler and is ISO14001 compliant.
    About ecoATM

    Based in San Diego, Calif., ecoATM (www.ecoatm.com) is the first company to create an automated self-serve kiosk system to buy back old phones, tablets or MP3 players for cash. ecoATM uses patented, advanced machine vision, electronic diagnostics, and artificial intelligence to evaluate electronics. ecoATM’s eCycling stations provide a convenient trade-in solution that:

    ·       pays consumers immediately in cash
    ·       connects consumers in real time to broad worldwide secondary markets, ensuring the best possible pricing
    ·       incorporates features that validate sellers’ identities and deter the sale of stolen phones, and works closely with local law enforcement (http://ecoatm.com/law-enforcement.html)

    ecoATM holds both Responsible Recycling (R2) and ISO14001 certification, confirming the company’s commitment to maintaining the highest standards of electronics recycling.

    About Falcon Investment Advisors, LLC
    Falcon Investment Advisors, LLC is a private equity firm with offices in Boston and New York, specializing in subordinated debt and other junior capital investments. Falcon provides innovative capital solutions in amounts of $10 million to $75 million to middle market companies in a variety of industries throughout North America. Since its founding in 2000, Falcon has raised $1.7 billion and invested in over 50 companies in a broad range of industries to support acquisitions, recapitalizations, buyouts and organic growth.

    Media contact:
    Lizzie Younkin
    [email protected]
    619-295-8232
    SOURCE ecoATM


    The post ecoATM Secures Mezz Debt appeared first on peHUB.

  • The Global Fund’s New Funding Model

I mentioned in my last blog that I would be attending a meeting in Geneva to discuss the New Funding Model (NFM) of the Global Fund to Fight AIDS, Tuberculosis and Malaria. Well, I am just back from a fascinating couple of days getting to understand the new model. The meeting was a great opportunity to mix with a wide range of individuals, agencies and community representatives who are committed to ensuring that the Global Fund remains a major source of strategic investment in the fight against malaria, tuberculosis and AIDS.

    For those not familiar with the Global Fund, the website (here) provides a huge amount of useful information, including links which allow you to see how money has been used to good effect in the past. Since its launch, there have been 10 rounds of funding which have allowed countries to bid for resources in support of their national response. There was a high level of concern when Round 11 funding was cancelled in November 2011 (see here).

The Partners Consultation meeting was an opportunity to hear more about how the Global Fund will operate in future and to understand better the transition from the previous funding mechanism to the new one. The NFM is intended to ensure that the poorest countries and those with the greatest burden of disease have a better chance of securing crucial funds to support the fight against the three diseases.

    Abigail and Mark at Partners Consultation. Picture: Neil Squires/DFID

The Board of the Global Fund has been discussing new ways of providing funding which respond to past criticisms of the round-based system of grant allocation. A particular concern with the previous funding mechanism was the huge amount of effort and time put into developing funding bids which, if they did not meet the required standard, would fail to secure needed funding. The process could be a major distraction for hard-pressed health planners struggling to use limited resources to provide a wide range of health services. Another criticism was that the very high level of ambition expressed by some countries could lead to significant funding and a welcome scale-up of some services, but with negative consequences for other parts of the health service: for example, staff might be drawn away from maternity and child health services to staff HIV services, or to attend training on malaria or TB. With many of the countries most affected by AIDS, TB and malaria having limited numbers of doctors, nurses and other health workers (see here and here), increasing activity in one area can easily lead to a decrease of activity in another key service. This opportunity cost of different programmes competing for limited human resources was sometimes overlooked.

Mark Edington and Abigail Moreland (pictured above) are two of the key members of the Global Fund team working to ensure that the New Funding Model addresses those concerns. They did a great job fielding questions about the new model and noting down ideas which could help strengthen the approach.

The Partner Consultation drew together a number of individuals and agencies who are equally keen to ensure that the new model works. There was very strong representation from the communities affected by the three diseases, and the meeting opened with a statement from civil society groups, who had met the previous week in Amsterdam (here) and had developed a clear list of asks for Mark Dybul, the new executive director of the Global Fund. The Global Fund has significantly improved the lives of many poor and marginalised groups, and these communities want to protect the gains made and ensure further progress.

    There is a very tight timeline for rolling out the new funding model, and the production line of new guidance documents is only just beginning to deliver the first papers that will guide the process. The draft documents shared at the meeting give an early indication of some of the key elements of the new approach.

I have tried to summarise my views on the New Funding Model (as I understand it) in terms of the good news and the potential challenges, and have set these out below:

    The Good News

    Those countries facing funding shortfalls for their national response to the three diseases in the period 2013-2014 will be able to apply for new funds or to re-programme existing commitments. This could mean new money for up to 50-60 countries.

A maximum of nine countries will test out the new funding model (these countries will be known as the early applicants). These countries will be able to bid for a set level of funding, indicated at the outset, but will also be able to express what their full demand for funding would be if they could secure more than is initially on the table. This element of the approach is intended to keep levels of ambition high.

In addition, a set of 40 to 50 ‘interim applicant’ countries will be identified who will be able to apply for bridging funding to cover anticipated shortfalls in current Global Fund-financed programmes. Additional funding might include things like replacement insecticide-treated bed nets for malaria prevention, when nets previously provided by the Global Fund are nearing the end of their functional life.

For other countries not in these groups, there will be no new funding until after the next replenishment of the Global Fund; however, there will be scope to negotiate reprogramming of existing funds, and they will be encouraged to develop national strategic plans to address the three diseases in preparation for future bids.

Each country will be given an ‘indicative funding level’, which is the volume of funds it might reasonably expect to apply for based on its level of need (disease burden) and its capacity to fund a national response. There will be more money available for poorer countries. This is really good news: it promotes greater equity in access to funds to fight the three diseases and will favour poorer countries with higher disease burdens.

Indicative funding will provide a guaranteed minimum level of funding for a three-year period, which countries know should be approved subject to a sufficiently robust application. This will remove the risk of significant time being wasted on grant applications, which has been a problem in the past.

The move away from the round-based application process should allow more time for grant applications and allow them to be synchronised with national planning cycles. Having said that, however, for the ‘early applicant’ countries, the tight deadlines set for the completion of the concept notes are likely to mean that the application process will still feel highly pressured and demanding in this early phase.

The concept note application process is intended to lead to ‘grant ready’ funding, avoiding the problem of funding being approved subject to lots of conditions, which was a fault of the previous system. If this can be achieved, it would mark significant progress.

    But there will inevitably be challenges with any new funding mechanism. The ones that I identified include the following:

The concept note process still seems to encourage single-disease applications, although combined applications for all three diseases will certainly be possible. The problem with lots of single-disease applications, each requiring its own application process, is the same as with the previous funding mechanism: the opportunity may be lost to address key system challenges, such as the shortage of doctors, nurses and health workers and the limitations of drug purchasing and distribution systems, in ways which would benefit the whole health service. This is a problem that it should be possible to resolve, and the Global Fund team are working on solutions.

The requirement that the concept note include a ‘Full Expression of Demand’, in addition to the bid for the indicative funding, is intended to maintain a high level of ambition in countries’ responses to the three diseases. Whilst ambition is good if we want to develop and expand services, it is also important to plan realistically, bearing in mind the limited resources available to deliver the wider range of health services that populations need. We do not want countries to bid for so much funding that it draws health workers away from other critical areas of healthcare. Any tendency to encourage countries to bid for more than they can realistically and effectively spend is something to guard against.

    There will inevitably be teething problems with the new funding model. However, the Global Fund is consulting and listening, and this is a real opportunity to improve on what has been a hugely important funding instrument for tackling the scourges of AIDS, TB and malaria. I, like many others at the meeting, will be keen to find out which countries volunteer and are selected for the early implementer phase. I hope that these countries can demonstrate how the new model can help build stronger health services which tackle the three diseases but also strengthen capacity to deal with the many different health challenges that every country faces.