The Fed suckered IBM into a failing cloud strategy? (cringely.com)
76 points by lmg643 on July 25, 2014 | 76 comments


Cutting costs isn't IBM's strategy to get out of the recession. Cutting costs is IBM's strategy. I interned there in 2003, and it was already apparent that they preferred to hire people overseas and let attrition reduce the ranks of folks locally in order to reduce costs. From all the complaints, it was clear at the time that was causing serious problems with R&D.

The joke around the office was that we'd replace one person locally with three people overseas and lose the output of two, since the three overseas hires were so clueless they'd need a local person holding their hands full-time. I was doing microprocessor design at the time, and the community is small enough that everyone knew Intel was getting good folks overseas. But even at the reduced wages outside the U.S., IBM was cutting corners and not hiring the best people.

Even locally, they try to reduce costs. A friend of mine who stuck around long enough to make it into management told me that they try to keep salaries at about the 40th percentile to save money. On finding that out (as well as a few other gems), he left. Until I heard that, I couldn't figure out why brilliant friends of mine often got raises that didn't even cover inflation. Don't they know that people will leave because of that? They know, and that's their strategy.

The thing about Bernanke comes out of left field. IBM didn't use low interest rates to invest, therefore "the companies that were expected to spend us back to better economic health didn’t do so," therefore low interest rates didn't make the recession less severe than it otherwise would have been? No comment on whether those last two statements are actually true (I'm not an economist and haven't studied the issue), but Cringely certainly doesn't make a case for them.


My wife worked at IBM for many years and every year, like clockwork, right before the quarter was up, they'd shut down the power to entire IBM sites for a day or two to save money. Everyone would be told to work from home. That's how important cost-cutting is to IBM.


Pound-wise, penny-foolish, no? (Or is it penny-wise, pound-foolish?)

If productivity doesn't drop, why not telecommute 50%?


I interned there too, in 2011, and can testify that employee morale is ridiculously low. No one loved the company; the only people working hard were the lifers who'd joined after college and were still there at 40 or 50.


This is why I left IBM (SWG) a few months ago. When I turned in my notice after 1.5 years at IBM, people, including very senior people and managers, cheered me on saying things like "I wish I could leave" or "I put my notice in 6 months ago but they gave me a huge raise...".

When the financial gimmicks run out, IBM is absolutely screwed because they've been cannibalizing their most valuable asset for short term shareholder value.


I know a bunch of people who are only hanging around because they think they'll get laid off anyway, and IBM severance + unemployment isn't so bad.


Isn't it easier to look for a job when you're not competing with 8,000 other laid off people?


IBM is a slow trickle kind of lay-off place. So there are always lay-offs happening.


Perhaps, but if you've been at IBM for 10 years, 6 months of doing nothing but collecting your paycheck probably sounds really good.


A good talk on one of IBM's big problems was submitted this morning but gained no traction. I wish HN'ers would learn to identify big names in finance so this stuff wouldn't be missed.

https://news.ycombinator.com/item?id=8084640


It's a good point. But he's making the same point as Stockman (see below), and Cringely's story links to that clip as well. https://news.ycombinator.com/item?id=8085628


Yes, but Druckenmiller "broke the Bank of England".

http://en.wikipedia.org/wiki/Stanley_Druckenmiller

When he speaks, he's probably worth listening to.


Point taken. Thanks for the link.


I was a manager at IBM at the beginning of off-shoring. I too left, because I disagreed with the model in principle. However, the statistics did show that the company could save ~60% for the same quality of service, immediately. What was unclear and hotly debated at the time was how sustainable that saving was and how developing economies would affect those decisions.


>IBM didn't use low interest rates to invest therefore "the companies that were expected to spend us back to better economic health didn’t do so" therefore low interest rates didn't make the recession less severe than it otherwise would have been?

Investing in your own shares doesn't lead to growth the way capital investment does; it's just more money at rest. Low interest rates eased the recession by propping up the stock market, i.e. through 'trickle-down'. That's why the stock market can be raging while the larger economy is in the crapper.

So low interest rates can simultaneously make the recession less severe than it otherwise would have been, yet make the growth potential of the economy worse.


First of all, Cringely is right on the first part: IBM, and many other companies, are taking advantage of the low-rate environment to borrow money and buy back stock. This is capital structure optimization, and many companies do it. He's being tongue in cheek about the problems being the Fed's fault.

There are consequences to this policy. By borrowing to buy back equity, you increase the chance of distress: you don't increase your cash position, but you do increase the interest payments you have to make (albeit by less than usual, thanks to the low-rate environment). This makes it harder to bet on anything that isn't a sure thing.

That said, perhaps this is the best way for shareholders to wind down a company that can no longer innovate. At some point, large companies can cease to be innovative, and cease to profitably handle acquisitions. Then what? Especially if you're too big to be bought out yourself.

There are two options... One: split yourself up and sell the parts to the highest bidders. Two: buy back the shares. Shrinking the number of shares outstanding will increase the per-share value even if the total corporate value is flat. Rather than waste money on innovation (if you can't) or M&A (if you overpay and under-integrate), it's better to give the money back to the shareholders and let them invest the capital in companies that can grow.

Borrowing to buy back shares is just taking option two to the extreme, and ultimately passing the buck to the long-term bondholders.


See The Fed's Financial Repression At Work: How Big Blue Was Turned Into A Wall Street Slush Fund, by David Stockman at Seeking Alpha http://seekingalpha.com/article/2324305-the-feds-financial-r...

It's a good piece, even if you don't share Stockman's view of the US economy and its impending doom http://seekingalpha.com/article/2324315-the-implosion-is-nea...


That first article is really good (the second reads like the breathless ranting of a conspiracy theorist).

If other major companies are doing the same thing as IBM (seems likely), then it's the best explanation I've seen for the stock market boom over the last couple years. There has to be a way to incentivise these companies to create real, long-term value over short-term stock price manipulation.


After a while they just get too big to do it. But that's ok. As long as someone is willing to buy the bonds, there's nothing wrong with shuffling the capital structure.

In theory, some people need fixed income instruments (say insurance companies) so there is a market for it.

If the people with risk capital want to place it on smaller more innovative companies, why not?


Or, you know, just pay a special dividend. Share buybacks (outwith IT's controlling discount) always struck me as a way to line the pockets of the advisors and not the actual shareholders.


Generally share buybacks help the shareholders quite a bit. Unless I'm wrong, the advisors don't skim as much off of buybacks as they do IPOs. (And I am wrong more than I care to admit!)


But a dividend is a quantifiable amount, rather than the company buying shares and hoping the market follows rationally.


The problem with a dividend is that it's immediately taxed. That said, if you set taxes aside, it's actually the same either way. This is the crux of the Modigliani-Miller theorem. [0]

An oversimplified way to look at it: say a company has 1 million shares worth $100 each, $50 of which is just cash sitting on the balance sheet, and it wants to return half the value to the shareholders.

One way to do this is via a dividend. Each share is now worth $50, and each shareholder has $50 in cash. (No net gain or loss, but the shareholders can do what they want with the money - the capital is freed)

Alternatively, they can buy back 500,000 shares for $50M. Each of the remaining shares is still worth $100, and the $50M goes to the selling shareholders to put to new uses. Same net value as above. No new value is created or left behind; the dividend gives a partial payment on every share, while the buyback gives a full payment on some of the shares.
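
To make the equivalence concrete, here's a minimal sketch of that arithmetic in Python. The numbers are the hypothetical ones from the example above, and it assumes the frictionless, tax-free world of the MM theorem:

    # Dividend vs. buyback with the hypothetical numbers above
    # (frictionless market, taxes ignored, per the MM assumptions).
    shares = 1_000_000          # shares outstanding
    price = 100.0               # $ per share
    cash_per_share = 50.0       # portion of the price that is just cash on hand

    firm_value = shares * price              # $100M in total
    cash_returned = shares * cash_per_share  # $50M to hand back either way

    # Option 1: pay a $50 dividend on every share.
    div_price_after = price - cash_per_share             # each share drops to $50
    div_total = shares * div_price_after + cash_returned

    # Option 2: buy back 500,000 shares at $100 each.
    bought_back = int(cash_returned / price)              # 500,000 shares retired
    buy_price_after = (firm_value - cash_returned) / (shares - bought_back)
    buy_total = (shares - bought_back) * buy_price_after + cash_returned

    print(div_price_after, buy_price_after)   # 50.0 per share vs. 100.0 per share
    print(div_total == buy_total)             # True: shareholders hold $100M either way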

What IBM does is borrow money rather than pay cash, because it can deduct the interest payments on the debt. (This violates one of the assumptions of the MM theorem.)

[0] http://en.wikipedia.org/wiki/Modigliani%E2%80%93Miller_theor...


My wife is an ex-IBMer, and the cut-to-get-growth story rings true.

But it overlooks a core problem that IBM faces. The cloud is a competitive platform for problems that previously required mainframes. They successfully defended (kinda) against Oracle/Sun in areas like running large banks and stock exchanges.

What happens to IBM when you can solve hard real-time transaction problems on 1000 loosely-coupled distributed computers instead of 1 large mainframe (made up of a 1000 processors with shared state)? That day is either here or close.

My take is that they can't ignore the cloud but they can't win it either. Tough place to be.


IBM can milk the mainframe for years to come, because the cloud doesn't compete for its legacy enterprise customers. Large enterprises have chosen to retain many of their legacy mainframe applications; previous attempts at rewrites have too often run over budget or failed outright. So the safe choice for a CIO is to avoid rewriting legacy systems (many have been around since the '60s or '70s). IBM customers will keep paying the premium to run on a mainframe, and in some cases will leverage their existing skills and systems to deploy even more onto their IBM systems.

This mainframe strategy will keep many of IBM's legacy customers, but as Cringely and others point out, it fails to capture new growth.


Here's another viewpoint: I recruited for mainframe developers a year ago. It's an increasingly rare skillset to find, because developers:

- move on to cloud or other newer technologies to stay relevant

- don't want to move back to mainframe work, since they see fewer jobs in it

- and there are almost no junior-level candidates for mainframe roles. Most candidates have 10-20+ years of experience; it's extremely rare to find anyone with less than 5.

It is true that legacy systems aren't going anywhere anytime soon. But as time goes on, there's a high cost in finding people who can maintain those systems.


This is true, but they've been milking it for 40 years, and it's in permanent decline. Mainframe sales are not worth anything like what they were in the 1980s.

Further, a big chunk of IBM's server business is running its own Unix. That market is also in decline, and easier to switch away from. See http://www.itjungle.com/tfh/tfh030314-story06.html


I remember back in the Sam Palmisano days (mid to late 2000s) watching him present at an internal all-hands meeting (I'm a past Software Group IBM employee). He spoke briefly about mainframe margins; he didn't state numbers, but boy, he made it clear that the margins were significant and that they were seeing growth.

So, I don't know about the last few years, but I think IBM can ride the mainframe money train for years to come. My point was simply that it's not going to drive any growth.


Agreed, but the numbers have been going down for decades, and are still going down. In 1985, mainframes brought in $14bn of IBM's $50bn turnover. Today it's less than $5bn of IBM's $100bn turnover. Factor in inflation and the decline is even more dramatic.

IBM used to be able to charge $1 million just to install its top-level mainframe OS (before rental charges) and C cost $1,500 a day. I expect those days have gone ;-)


In some ways, I think Cringely underestimates IBM's potential. (Or maybe he correctly estimates IBM management's inability to see IBM's potential.)

One way Google kicked the ass of their competition was to radically cut their cost of computing through smart use of commodity hardware. It seemed like a small thing, but it gave them a lasting advantage, because nobody else could afford to deliver the same search features.

Given IBM's deep pockets and deep experience with hardware, I'd think there must be some strategy there that would let them create cloud computing infrastructure with radically lower cost than the competition. We regular people are stuck with off-the-shelf colocation setups, off-the-shelf hardware, off-the-shelf processors, and off-the-shelf software. IBM can afford to change any of that from scratch. Any of it.

I just can't believe that, less than 10 years into the cloud computing era, we have already happened upon the optimum approach.


IBM no longer has deep pockets: it's spent more than it has earned on buying back its own shares, and now has a ton of debt.

IBM has been the high-cost supplier for almost 100 years, so the idea that it will out-innovate and undercut Amazon, Microsoft and Google in the cloud is an interesting one. Especially as both Microsoft and Google have much deeper pockets and are approaching IBM in annual turnover. (IBM used to have 70% of the IT business. Fairly soon, it won't even be in the top 5.)


What deep experience with hardware? Yes, there was a company called IBM that had a lot of hardware experience, and there is currently a company called IBM. But most of the parts of IBM that have deep experience with hardware got sold off to Lenovo.


Z-series (mainframes) and Power systems (high-end UNIX servers). Yes, both are in a shrinking market, but they've got significantly higher margins on both sales and support than commodity x86. They're also what's keeping IBM's domestic hardware manufacturing going.


Google was innovating new products with the money they didn't spend on hardware. And, AFAIK, they didn't so much reduce existing costs as reduce the future cost of new investment, i.e. increase the bang for the buck.

That's not the same as removing every other light bulb in the building (which is what Lucent did at Bell Labs!).


It isn't just going to come from cloud/distributed systems.

You can scale up to a huge extent just with average x86 servers now and these are competing with what used to be done on a mainframe. So even many companies who don't or can't use distributed systems don't need to use IBM any more.


And it's not just hitting in obvious places.

Consider Hadoop. I know of at least two major DB2 customers who are investing in Hadoop rather than expanding their DB2 capabilities to handle problems IBM would love to solve with its own solutions.


IBM does distributed computing; their selling point will be that the firm's computers are on location, rather than away in a warehouse in another state.


Latency and concurrency. Mainframes still have an advantage there over the more big-data- or multi-task-friendly cluster.


The article says that because of expansionary monetary policy, "[Businesses] tended to borrow money and invest in their own shares." To me, this statement seems nonsensical. Here's why:

Fed policy affects things by changing the aggregates. But stock ownership can never change in the aggregate. If you have extra money and you buy some stock, then for every dollar you spent buying stocks, someone else now has a dollar from selling you the stocks. A common myth is that money can go "into" an asset class. It cannot. It's impossible for extra money to end up invested in stocks in aggregate, because for every buyer there must be a seller. Individually, you may have less money, but that's balanced by someone else having more money now. Stock sales will never change the aggregate money supply, aggregate investment, or total stock ownership. So what is Cringely saying here?

I'd appreciate it if anyone could explain the logic. I don't want to pre-judge, but my suspicion is that monetary policy is a subtle issue and that blaming the Fed is fun, both of which lead to mistakes such as this one.

(In fairness, perhaps the author means to say that specifically IBM used low interest rates to buy stock and not that businesses generally did this. That is a charitable interpretation that makes more sense and deviates only a little from what was actually written.)


About this:

"It's impossible for extra money to end up invested in stocks in aggregate, because for every buyer there must be a seller."

If I buy a share for $1, and then a year later sell it for $10, then the new buyer is putting $9 extra into that asset class.


If person A buys a share for $1, then another person must sell a share for $1.

A year later, if person A sells a share for $10, then another person must buy a share for $10.

At all times, someone is always holding the money created by the Fed. The money never "went in" to stocks. Sure, the valuation of the stocks changed, but the aggregate money supply and the aggregate stock ownership never changed.


The money supply is constantly changed by the Fed in response to demand for cash, to maintain consistent inflation.

In 2008 when everyone started selling their stocks, the Fed had to create massive amounts of money to match the appreciation of the stocks that were being sold. Thus, because the money supply grew in that situation, people said money was "coming out of" stocks. In reality it was value coming out of stocks, and the money was being created by the Fed. So think of it as shorthand for what's really happening.


Ah, but you're committing the very same fallacy! It's not possible that "everyone started selling their stocks" in 2008. That statement makes zero logical sense. Every seller was matched by a buyer. We might just as well say that everyone starting buying stocks in 2008.

I think what often links stock prices with demand for money is risk preferences. If stocks become more volatile (which they tend to do while falling), investors deleverage. This deleveraging reduces the velocity of money and thereby increases the demand for cash. If unchecked, this will cause deflation and requires the Fed to create more money in response.


Every seller was matched by a buyer but not at the same price. If a stock starts selling at $100/share and after 1,000,000 transactions is selling at $50/share, then some money came out of the stock despite the number of shares not changing.


No, no, no. :)

Every seller is matched by a buyer at the same price!

I think you are making claims without thinking them all the way through. Let's go through an example:

Imagine there is a company with 1,000 shares. Alice buys a share from Bob for $100. This trade implicitly values the company at $100,000. Later, Carol buys a share from Dave for $50. This new trade implicitly values the company at $50,000.

The value of the company fluctuated by $50,000, but if you look at every trade, an equal number of dollars and shares were exchanged on each side. Alice lost $100 of money while Bob gained $100 of money. Carol lost $50 of money while Dave gained $50 of money.

No money "came out" of the stock.

The aggregate stock ownership stayed the same: 1,000 shares were owned both before and after.

The aggregate money supply stayed the same: $150 were in people's pockets both before and after.

The only thing that changed was the market consensus of the price of the company. The bottom line is that price changes of an asset class don't mean money is being soaked or released. Individual ownership can change, but aggregate ownership cannot.
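
For anyone who prefers to see the bookkeeping spelled out, here's a toy sketch of that example in Python (the names and numbers are the hypothetical ones above):

    # Toy bookkeeping for the Alice/Bob/Carol example: every trade moves cash
    # and shares between pockets, but the aggregates never change.
    cash = {"Alice": 100, "Bob": 0, "Carol": 50, "Dave": 0}
    shares = {"Alice": 0, "Bob": 1, "Carol": 0, "Dave": 1}  # the other 998 shares omitted

    def trade(buyer, seller, price):
        """Buyer pays `price` for one share; a pure transfer, nothing is created."""
        cash[buyer] -= price
        cash[seller] += price
        shares[buyer] += 1
        shares[seller] -= 1

    trade("Alice", "Bob", 100)   # implies a $100,000 valuation for the company
    trade("Carol", "Dave", 50)   # implies a $50,000 valuation for the company

    print(sum(cash.values()))    # 150: same total money before and after
    print(sum(shares.values()))  # 2: same total shares before and after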

Does that sound reasonable? I hope this conversation is helpful for people reading this comment thread.


You're missing a few pieces of the stock market. The total number of shares increases over time, and some shares pay dividends.

Edit to add: My point is that the total value of the stock market fluctuates over time. If every sale is zero-sum, how are such fluctuations possible?


He's channeling someone else, which is why the first three words in Cringely's article are "Economist David Stockman". See https://news.ycombinator.com/item?id=8085628


> The result of all this is that IBM management has lost touch with reality.

I see that the author needed some connective tissue between the bit about the Fed and IBM's cloud plans, but any IBMer/ex-IBMer can tell you they're out of touch. The disagreement is about when they lost it.

Most, including this author, will point to whenever the domain they're experts in was mismanaged to hell. But IBM will keep on trucking right past its foray into "the cloud," constantly chasing the latest fads and letting older divisions wither and die, because nobody in charge understands the tech domains well enough to salvage them.


The headline got me to click, but I fail to see how cheap interest rates make IBM's potential cloud failure the fault of the Fed.

You could argue low rates force the hand of some firms; the stock buyback argument in the article made sense and had me following. But how can a strategic decision like "cloud is where we're going as a company" be traced to the Fed? That's where I lose the thread. What if they had decided on another investment area? Would you be writing the same article? Would anyone write a "The Fed suckered Amazon into a successful cloud strategy" article?


I don't think it's the fault of the Fed. If somebody making $16 million a year can't be held responsible for their choices, it's a sad world we live in.

But I do think cheap money has let IBM coast a little. Our business culture has a theory, one I think mainly dumb, that if we tie executive rewards to stock prices, they'll do good things for their companies. In this case, cheap money lets them get paid without doing anything useful.


The article makes two good points, about IBM buying back shares and about IBM not being able to compete in the cloud market. However, I fail to see how the Fed directly caused IBM to fail in its cloud strategy. It seems that IBM management failed to spend/borrow money wisely.


I read it as Cringely being sarcastic, especially when you read the last paragraph taking a potshot at Ginni.


I dunno.

What I do know is that IBM seems to be the #2 advertiser on TV and in magazines after that stupid gecko.

It is trying more and more to be like Indian outsourcers such as Wipro and Infosys, but the difference is that Wipro and Infosys pay their CEOs at Indian rates and spend customer money on solving customer problems rather than on sponsoring tennis and golf tournaments.


Why exactly does Cringely have such a hate on for IBM?


He seems like a sensationalist who has certain vendettas against specific companies and entities. I'm not sure why he's taken seriously at all...

(As in, I really don't understand why he keeps getting posted on HN, not just some throwaway insult)


Because he wrote the best book ever written on the rise of the PC industry, Accidental Empires (http://www.amazon.com/Accidental-Empires-Silicon-Millions-Co...). And then turned it into a TV documentary miniseries, Triumph of the Nerds (http://www.amazon.com/Triumph-Nerds-Bob-Cringely/dp/B00006FX...), which was best in class as well.


Hmm, perhaps you're right. But he seems to me to be in that category of authors who take small bits of evidence and draw much too large conclusions from them.

(And yes, I'm doing the same thing with this very comment)


Sensationalist as he is, he's usually on the money about IBM. He had a huge following on Slashdot too, and has a huge following among IBM employees, with many wondering whether or not you get shitlisted for visiting cringely.com.


It seems common among ex-employees.


IBM probably deserves it, but they have already stopped being relevant to most of us.


...in the myopic startup world.

I work for an enterprise doing WebSphere, and I'm actually scared to see it go one day. It would be cool if it cheapened to the point where it's equivalent to an open-source tech, but I think it's the proprietary nature that got it to where it is. Pretty big learning curve, but it is rock-solid and would be quite an upgrade from a lot of the scripting-language stacks if it were to hit the world at large.

Coming from RoR it feels like moving from a toy plane to a Harrier jet.


WebSphere may be an upgrade from a lot of the scripting-language stacks, but it is still horrible crap. I think open-source J2EE-compliant alternatives exist and are much, much better.


'...but it's rock solid...' As someone who did a lot of robustness testing against WebSphere in the course of writing a JMS adapter: you have got to be kidding?


I think the clarification "watwut" makes is fitting -- it's not like it's complete perfection in every situation, but I use WebSphere 8.0 running a ton of apps written in a pretty wide variety of techs (Portal, Commerce, Message Broker, services). It even runs crappy legacy projects with minimal changes (literally 1 or 2 quick fixes suggested by IDE).

Obviously it is a bit opinionated and works best when you code specifically for the APIs/standards it is currently trying to embrace, but all the open-source scripting tools I've used are way more opinionated. I like WAS's take because I often find it exposes services I don't even know about until I eventually need them. And it's not a lot of bloaty junk; it's essential Java EE interfaces/protocols.

I'll admit it took a while to grow on me and we have fantastic hardware...


The solid parts are actually the JRE / JVM...


Eh, yeah, but it doesn't hurt to have an admin GUI and a ton of default stack config done for you; that's all I mean. WebSphere is like a one-click install for a massive stack. The open source community seems to like Dropwizard; WebSphere is just like that on steroids.

I'm sure a comparable stack can be made from open source components, though, and I know there are other companies that do the pre-packaged stack approach. But WebSphere is convenient and nicely integrated, is all I mean. After spending a ton of time chasing down gem dependency conflict/upgrade issues with package managers in Ruby, I'm kind of impressed with the prescribed full stack. It's like having a corporation do all the dependency/interop checks so you don't have to. Limiting in a way, but it reduces a lot of friction.


If you've ever read "IBM And The Holocaust" [1], you might say they deserve much more than this.

1. http://www.amazon.com/IBM-Holocaust-Strategic-Alliance-Corpo...


Not only can they not make a profit running cloud services, they can't make a profit selling hardware to the people who do, either. Something... something... selling the division to Lenovo...


They couldn't do that before either.

The people buying IBM x86 servers were people who were already snookered into buying POWER or Mainframe systems.


IBM makes its money on the consulting anyway, they don't need to make money off the cloud services. They just need to sell lots of $200+/hr consulting services putting people on them.


Consulting (and services in general) is a low-margin business. You can only increase revenue by tricking young college grads to work 80-hour weeks at low pay.


Margins on services are much higher than margins on hardware. While they're much lower than margins on software, margins on consulting services are also much harder to attack.


When they're paying someone $40 per hour and billing that person out at $300 per hour, there has to be a nice margin in there.


Sure, but software gross margins are typically above 80%. (And should be close to 100% in a serve-yourself online software business, where the manufacturing cost is zero.)


IBM has stopped targeting excellence and started targeting EPS. You cannot operate a business without looking up from your spreadsheets once in a while, no matter how big that business is.

Earnings per share should be a natural byproduct of excelling in your market.


Cringely should stick to commenting on tech as opposed to the validity of raising debt to do share buybacks in a low interest rate environment.



