Pacific Fibre, the company, has closed down. We had a great contract to build and solid customer contracts for launch, but failed to build an investor book to the required US$300m. The cable was to be 12-15 Tb/s, and use technology that was far more upgradable than the existing cable. Demand and the business model were never an issue. Someone plonking $1-300m on the table was, even though they'd likely have multiplied that amount significantly. Our competitor, for example, had their cable build cost paid for by launch date, and their build cost was several times larger.
Kim Dotcom managed to stir up a lot of people. The primary issue for him, assuming he raised the money, would be landing rights into the USA. But even as it becomes increasingly apparent that he's the victim here, he is still wanted for extradition by the same US authorities who froze his assets.
Kim has friends at the tier 1 level; that should be clear from the Mega days. Those same commodity-bandwidth friends would likely be willing to expand into a market ripe for disruption like NZ. Those players would likely be willing and capable of terminating a cable AND providing transit.
I've always wondered why big tech companies haven't used New Zealand as an experimental place for their "plans".
What I mean by that is, we have 4 million people, and we're not a large place by any means (around 2,000km from end to end, something like that).
If Google wanted to show the world their "vision", they could simply come to NZ, buy a mobile carrier (for millions, not billions of dollars), give everyone free plans (or whatever their vision is), and lay fibre to everyone in the country (much faster than the stupid government would, imo). They'd even be able to get their awesome self-driving cars on the roads fairly easily, without having to spend so much damn money lobbying.
As for the NZ public, we love new stuff. Right now, we're ripped off in every aspect (consumer-wise). Some massive corporation could change this country in a heartbeat, and for the better (I'd hope, anyway).
New Zealand, in my opinion, has an awesome "sample" size (population-wise). We were one of the first countries in the world to move from cash to EFTPOS (http://en.wikipedia.org/wiki/EFTPOS), and we did it extremely fast.
>> "EFTPOS is highly popular in New Zealand, and being used for about 60% of all retail transactions.[23] In 2009, there were 200 EFTPOS transactions per person"
To me, it makes so much sense for massive corporations to come to NZ and trial their stuff _easily_, at half the cost (made-up number) of doing it in the US.
Once people see how awesome NZ is with all these self-driving cars, cheap/free internet, cheap phones, and cheap/free data/calls/sms, surely the rest of the world would want to be just like us? :]
I definitely agree with you, I'd love to see more tech companies entering the NZ market. Sadly we are lagging behind other countries when it comes to technology.
One of my own goals is to help build a better environment for technology in NZ, hopefully encouraging more people to start their own startups and base their operations here (I'm planning on basing my startup in NZ).
I assume you mean using NZ as an experimental place with a view to then justifying expansion into the US on a larger scale, since you brought up Google specifically. As opposed to the EU or something.
I don't think that would work. Look at health care. There is a multitude of good systems and mediocre systems all around the developed world (and the developing, for that matter). There are plenty of examples of what to do, what not to do, etc. And yet, you don't have to look far to find people who are totally convinced that nationalized health care is a mistake, that it can't work, that it will bankrupt society, all of which flies in the face not just of basic policy research and mathematics, but of fucking real-world examples.
And that is because American society does not give a shit what the rest of the world thinks or does, and they do not believe that the same rules apply to them, or that the experience of, say, the UK or Japan could ever apply to them. Certainly not NZ. You could have self-driving flying cars and ubiquitous networking in NZ and still Americans could give you infinite reasons why it will never work and is too expensive and also probably communism.
Now, for going NZ -> literally anywhere else, you have a good idea.
From my layman's point of view, this article is delusional and sounds like it was written by a fan, not a journalist.
the first would see Mega funding the project itself once it became popular through purchasing bandwidth. This is the most viable option [...] Dotcom wants to commit Mega to purchasing $20 million
But who will fund the $300m for actually building it?
He’d need to find $300 million of private angel funding to get moving, something which the country has little of, but should be straightforward to find overseas
Yeah, shouldn't be that hard, like, he is Kim Dotcom!
unlike any other businessman, he’s excellent at cutting through red tape
"Dotcom says that the company will consume 2 terabits of daily bandwidth, which in perspective is more bandwidth in a day than the entire country uses right now."
I'm fairly certain that a country with more than 2.6 million people on the internet [1] will use more than 256 GiB in a day... Even if just 256,000 of those people (~10%) downloaded a single megabyte in a day, you're already at the 2 terabit figure - and something tells me that much more than 10% of the internet connected population in NZ will download more than a single megabyte in a day.
Come to think of it, I probably downloaded around a megabyte worth of stuff just opening the article to begin with.
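A quick back-of-the-envelope check (a rough sketch in Python; binary prefixes assumed, and the 2.6 million user figure is from [1]):

    TERABIT = 2**40                    # bits, using binary prefixes
    print(2 * TERABIT / 8 / 2**30)     # 256.0 -- "2 terabits" is 256 GiB

    # If ~10% of NZ's 2.6M internet users each pull 1 MiB in a day:
    users, mib = 256_000, 2**20
    print(users * mib * 8 / TERABIT)   # ~1.95 -- already at the quoted figure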
I thought that initially, but the article says "2 terabits of daily bandwidth", which I took to mean a rate (per unit time) rather than a daily total. Furthermore, I found a graph detailing internet usage per user per month for various countries [1].
If we take New Zealand to be equivalent to the North American average (higher than the European average) of 14.5 GiB/user-month, and apply that to the 2.6 million internet connected people in New Zealand, this equates to an average bandwidth of 14.5 GiB/sec, or around 116 gigabits per second.
The only way you can get close to 2 terabits per second is to assume all of those 2.6 million people use 256 gigabytes of bandwidth per month - and certainly, I don't think that the people in NZ use ten to twenty times as much bandwidth as the other countries in the world [1]. (Other assumptions: 30 days in a month, and all of the 66% internet connected population in NZ uses 14.5 GiB per month)
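Working it through (a rough sketch; assumes the 14.5 GiB/user-month figure from [1], 2.6M users, and a 30-day month):

    GIB = 2**30                              # bytes
    users = 2_600_000
    seconds = 30 * 24 * 3600                 # seconds in a 30-day month

    avg = users * 14.5 * GIB / seconds       # nationwide bytes/sec
    print(avg / GIB)                         # ~14.5 GiB/s (users ~ seconds/month)
    print(avg * 8 / GIB)                     # ~116 gigabits/s

    # Per-user usage needed to average 2 terabits/s nationwide:
    print(2 * 2**40 / 8 * seconds / users / GIB)   # ~255 GiB/month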
I'm willing to bet NZers use less than the average. Data caps are very low (30-40GB is standard; all-you-can-eat is uncommon) and speeds are firmly abysmal (3-4Mb/s here off peak, and I'm in the biggest city, Auckland, about 10-15km from the centre).
Keep in mind that all internet in NZ and Australia is metered, similar to mobile data here, except more like pay-as-you-go: you pay for a block of data and then use it till it's gone. Not the same idea as here in the States.
Just FYI, this is not strictly true. Most internet plans are metered, however you can often find a plan that includes unlimited downloads (and that does not shape bandwidth).
For example, see [1].
It is still typical for plans to be graded by the download cap, and for a time there seemed to be no unlimited plans available, but this situation is being disrupted by ISPs like TPG. They are almost certainly overselling their bandwidth, but my personal experience has been quite good, with consistently fast downloads even when downloading terabytes of data in a month (I lived in a student share house).
But with this Pacific Fibre cable the populace would probably see much cheaper bandwidth, especially if the NZ government is trying to get 100Mbps to 50%+ of the population.
I have spoken with people high up in the Ultra-Fast Broadband rollout, and they have no idea. I work for a radiology company, and we would like to be able to shuffle 1 gig files belonging to 20-ish patients across town in a reasonable time frame. The UFB guy just didn't get it: orthopedic surgeons see patients for 15 mins tops, and spending that time waiting for imaging to arrive is somewhat useless. The guy was telling me how well YouTube works on his home UFB. Great.
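To put rough numbers on it (a sketch; the speeds are illustrative, not actual UFB tiers):

    GB = 10**9
    payload_bits = 20 * 1 * GB * 8           # ~20 patients x 1 GB each
    for mbps in (10, 100, 1000):             # hypothetical link speeds
        print(mbps, payload_bits / (mbps * 10**6) / 60, "minutes")
    # 10 Mb/s  -> ~267 min: useless within a 15-minute consult
    # 100 Mb/s -> ~27 min:  still longer than the appointment
    # 1 Gb/s   -> ~2.7 min: workable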
Edit: To be clear, the people I talked to had no idea, presumably the company as a whole does as the scale of the work they are doing is pretty huge, and there haven't been big issues as far as I am aware.
They've obviously got their units messed up in that section. I used 99GiB yesterday, and I'm fairly sure I don't account for over a third of my country's bandwidth usage :)
> Dotcom wants to commit Mega to purchasing $20 million of bandwidth from the new cable company that he would resurrect, since Mega is now registered in New Zealand. According to the NBR, that would give Dotcom around half of what he needs if he paid for ten years. Even if that were to work, Mega would have to prove extremely popular, with Megaupload previously purchasing $40 million of bandwidth. Dotcom says that the company will consume 2 terabits of daily bandwidth, which in perspective is more bandwidth in a day than the entire country uses right now.
As they were talking about the total cost being $400m, and 10 years covering half, they are saying that Mega would pay $20m/year? And use 2Tb/day? Meaning ~$27,400/terabit (~$22,000 USD)?
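Checking that arithmetic (a sketch; the ~0.82 USD/NZD rate is my assumption):

    annual_nzd = 20_000_000            # $20m/year
    tb_per_year = 2 * 365              # 2 terabits/day
    per_tb = annual_nzd / tb_per_year
    print(per_tb)                      # ~27,397 NZD per terabit
    print(per_tb * 0.82)               # ~22,466 USD at an assumed 0.82 rate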
Good on him. I had a New Zealander friend explain to me that residential bandwidth gets metered into two categories: local to New Zealand (cheap or even free) and any data that comes from overseas, which I find a complete ripoff.
Why? If the two kinds of traffic cost the ISP different amounts, and they probably do, it seems much more reasonable to offer the choice to the consumer instead of averaging the costs.
Here in Portugal we used to have three tiers (international, national, and intra-ISP, the last of which was unlimited), and what happened is that some developers forked eMule to add per-ISP and national filters, which you could configure independently.
It was very useful even for large, legal content, like game demos or patches: a single person would download it from abroad and then share it locally. A single cap would be much worse for everyone.
I don't believe this is true, at least for the larger ISPs (it may be the case for universities). A friend of mine owns a small ISP in New Zealand and I've discussed this with them, as I assumed there would be huge demand for local data centres and storage/streaming services. Right now all traffic goes out of the country and back in again; that's what should feel like a rip-off.
Apparently Telecom could do this, but it would require a court order or major political pressure. If this were happening now you'd see a lot more innovation in the digital space, and a much better experience using the internet in NZ in terms of streaming media etc. Netflix, for example, declined to start up business in NZ because of these problems. http://www.stuff.co.nz/technology/digital-living/6045189/NZ-...
I live in NZ and that was true once with some ISPs. I don't know of any companies that do it now. Some stuff is unmetered, usually things like on-demand video, which they get you to pay extra for as a package.
Yes, for residential plans they abolished the National / International pricing for a fixed rate across both, but for servers in data centers they still have National vs International pricing. For example: http://www.net24.co.nz/dedicated-servers/
I don't know where you got the idea that all traffic goes out of the country and back in again. That is just false.
>I don't know where you got the idea that all traffic goes out of the country and back in again
I hope so! It's what I was told, though of course it would make no sense. I'll have to ask them to show me some hop reports. Good to hear that there is national traffic for data centres, however.
There was an interesting article on "Australia's Strategy"[1] published by Stratfor that I came across a couple of months ago, which argued that Australia has such a huge dependence on access to sea lanes and shipping that it has made a great effort to stay friendly with the dominant naval superpowers. It would seem to me that an awful lot of their conclusions apply similarly to New Zealand[2], and the risk of pissing off the US will restrict the extent to which they could become a data haven in the way some commenters here are describing.
An operation the size of this new Mega will almost certainly rile up the US over IP issues, whether there's some element of plausible deniability over its intended use or not.
If anything, Kim is certainly brilliant at PR. With such a grandiose promise, he will gain wider support for his cause within the general population, which will in turn cause more politicians to side with him. It doesn't really matter if he can pull it off or not. If he doesn't, he can always blame the FBI/MPAA for not giving him enough money (he likely won't get any at all). People will then hate the FBI/MPAA even more for "preventing" them from having free Internet, winning him even more popular support despite his failing to deliver on his promise. Brilliant!
Kim Dotcom hasn't proposed investing. He has shown interest in being the biggest customer. The risk of investing in the cable is huge; there isn't a datacenter in NZ that can utilize the proposed bandwidth increase. This is essentially a chicken-and-egg situation where someone other than Dotcom will be bearing the risk.
Ah ok, got it. This sentence was a bit misleading though:
> Dotcom took to Twitter recently with a new-found passion, promising that he would relaunch the “Pacific Fibre” project for the country and deliver “free broadband for all [New Zealanders].” How exactly does he plan to do that? By suing the pants off the American movie industry.
What is up with The Next Web and other websites like Wired that keep getting linked here? They overload the arrow keys, which screws me over big time if I mispress slightly while trying to scroll down. The article I got moved to then switched to using the arrow keys to change between pages of the article, not between articles, so I could no longer go back to the previous article. Even the back button on my browser was broken by their website and would just take me back to the place I was already at. It makes reading so damn annoying; why do they do it? Wired even had it so that if I accidentally hit, say, the right arrow and then tried to go back with the left arrow key, it would 'forget' that I had all the pages of the article open, so I could no longer see where I used to be.
Please don't make arrow keys do things! Left/right are right next to the up/down keys I use to read your article, and it's easy to not actually hit 'alt' when trying to go back or forward in history.
Why? That would just result in this story, which I found interesting and a lot of other people seemingly did too, not getting picked up and placed on the front page.
The issue isn't duplicate URLs, the issue is that the submission system is broken. This is an example of that. When a story doesn't get a single upvote the first time it is submitted, but gets 50+ upvotes when it is submitted just two hours later, something is wrong with the system.
In good upvoting systems the "winners" should be decided by how interesting they are, not by the hour they were submitted or by seemingly random factors. Creating a system like that is incredibly hard, though.
Maybe. But really, people were just asleep and didn't feel like waking up to give their upvotes out. Most people who read HN were asleep three hours ago. It's Sunday morning, many HN readers are on the west coast of the US. For them, the previous submission was at 4am. Why are you surprised it didn't get many upvotes? Even for US East readers, it was 7am on Sunday.
Timing your submission for when most people are asleep and expecting upvotes is not going to work. If you want people to read it, submit when they're awake.
The system should account for this. This is the internet, people are always awake and using the site somewhere in the world. Maybe measuring page views and adapting the algorithm to give more weight to a vote at times when there are few people on the site would help.
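Something like this, perhaps (a toy sketch; the hourly activity numbers are invented for illustration):

    # Weight each vote by the inverse of site activity at that hour,
    # so a vote cast while most readers are asleep counts for more.
    def vote_weight(hour, activity_by_hour):
        avg = sum(activity_by_hour) / len(activity_by_hour)
        return avg / activity_by_hour[hour]

    # Hypothetical pageviews per hour (UTC), quiet during the US night:
    activity = [30, 25, 20, 20, 25, 40, 60, 80, 100, 120, 130, 140,
                150, 150, 140, 130, 120, 110, 100, 90, 80, 60, 50, 40]
    print(vote_weight(13, activity))   # ~0.56 -- peak-hour votes count less
    print(vote_weight(3, activity))    # ~4.2  -- quiet-hour votes count more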
The poster of this new article added ?fromcat=all to the end of the url to evade the filter.
It's gently infuriating.
I'm not sure what the answer should be. People should submit less? Should only submit great articles from good sources? More people should follow new and flag the dross and upvote the great articles? I dunno.
Perhaps banning any filter evasion that isn't ?repost=1, ?repost=2, etc. might help. At least then people could easily see how often a story had been reposted.
If I go to the thenextweb homepage and click the link to the article, I get the url with ?fromcat=all. Banning a user because they copied a url directly is not a good solution. Many websites put extra information in the url.
At the same time you should not be penalized for actively cleaning cruft from the URL before posting.
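One option would be for the submission system to canonicalise URLs itself instead of penalising users either way. A sketch (the parameter list is illustrative, not HN's actual filter):

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    IGNORED = {"fromcat", "utm_source", "utm_medium", "utm_campaign"}

    def canonicalise(url):
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
        return urlunsplit(parts._replace(query=urlencode(kept)))

    print(canonicalise("http://thenextweb.com/article?fromcat=all"))
    # -> http://thenextweb.com/article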
Had the exact same thing happen to me: I took the time to clean up the URL, a submission a few minutes later got approved, and you could see both links to the article next to each other under new.