Yesterday Young’s story came full circle when IBM bought Red Hat for $34 billion, a 60% premium over Red Hat’s Friday closing price. IBM is hoping it, too, can come full circle: recapture Gerstner’s magic, which depended not only on his insight about services, but also on a secular shift in enterprise computing.
This is the bet: while in the 1990s the complexity of the Internet made it difficult for businesses to go online, providing an opening for IBM to sell solutions, today IBM argues the reduction of cloud computing to three centralized providers makes businesses reluctant to commit to any one of them. IBM is betting it can again provide the solution, combining with Red Hat to build products that will seamlessly bridge private data centers and all of the public clouds.
As part of our backup process, Backblaze will run a checksum against each file before uploading it. This requires the entire bzfileids.dat file to be loaded into RAM. Over a long period of time, or if you have an extraordinarily large number of files, the bzfileids.dat file can grow large, causing the Backblaze directory to appear bloated. The only way to resolve this is to repush (that is, reupload) all of your data.
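The pattern described above — checksumming each file and keeping an in-memory index of everything ever seen — can be sketched roughly as follows. This is only an illustration of why such an index grows without bound; the actual bzfileids.dat format and Backblaze's checksum algorithm are proprietary, and `register_file`/`index` are hypothetical names.

```python
import hashlib


def file_checksum(path, chunk_size=1 << 20):
    """Compute a SHA-1 checksum by streaming the file in chunks,
    so even huge files are never fully loaded into memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical in-memory index mapping file paths to checksums.
# An entry is added for every file the client has ever processed,
# which is why the on-disk equivalent only ever grows.
index = {}


def register_file(path):
    index[path] = file_checksum(path)
    return index[path]
```

Note that while each individual checksum is streamed, the index itself lives entirely in memory — the same design tradeoff the support document describes.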
The Backblaze client needed to solve a technical problem: distinguishing between when the customer’s computer is entirely offline (no network connectivity) and when Backblaze’s datacenters and servers are offline (which is unusual). The way I implemented this is that ONLY IF the client cannot reach the Backblaze servers, it attempts to fetch the homepages of these three websites without any cookies, ignoring the results that come back (other than verifying they are valid HTML)[…] to establish the difference between “the Backblaze service is down” and “you have no internet connectivity at all.”
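The fallback check described in that quote can be sketched as below. This is a minimal illustration, not Backblaze’s actual implementation: the probe URLs come from the comment thread (Google, Reddit, Wikipedia), the service URL is a placeholder, and the `probe` parameter is an assumption added here to make the logic testable without network access.

```python
import urllib.request

# Probe sites mentioned in the discussion above (assumed, not confirmed
# to be the exact list the client uses).
PROBE_URLS = [
    "https://www.google.com/",
    "https://www.reddit.com/",
    "https://www.wikipedia.org/",
]


def can_reach(url, timeout=5):
    """Fetch a page without cookies and discard the body; we only care
    whether the request succeeds at all."""
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "connectivity-probe"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:
        return False


def classify_outage(service_url="https://api.backblazeb2.com/", probe=can_reach):
    """Return 'ok', 'service_down', or 'offline'."""
    if probe(service_url):
        return "ok"
    # Only probe third-party sites when the service itself is unreachable,
    # matching the "ONLY IF" condition in the quote above.
    if any(probe(u) for u in PROBE_URLS):
        return "service_down"
    return "offline"
```

The key design point is the ordering: the third-party probes fire only after the service check fails, so in normal operation no outside traffic is generated at all.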
The reasoning actually makes a lot of sense; the issue is that the user has no way of knowing what is happening or why.
FYI, in my case this happens when I am connected to a network that allows connections to Reddit, Wikipedia, and Google but blocks Backblaze. I quickly figured out that this was some kind of connectivity detection; it is mildly annoying (but no big deal).
when I worked at Wells Fargo (before Backblaze), almost all websites were allowed for social reasons (marketing teams need access to Facebook/Twitter/LinkedIn), but Backblaze was actually blocked as well. Why? Because it’s a backup service and Wells Fargo does NOT want you backing up your company machine, so all backup companies were blocked.
ok, i get BB’s reasoning here, but spin up two or three tiny Linode servers in different locations, or one Linode and one DigitalOcean or some combination thereof, set proper dns records, and phone out to them. don’t abuse the internet at large.
How is this not a logical move by Apple? How would someone who bought an Apple Pencil 2 charge it with any other model of iPad? Or conversely, how would they sync the old one to the new iPad Pro without a Lightning port? I mean, I hate Apple shills as much as the next man but come on, use your brains here guys...
the point being: Apple does a lot of dumb shit these days and i just heard Burnie Burns of Rooster Teeth sum it up best: (paraphrasing) under Steve Jobs, Apple worked as a cohesive unit, where every product was complemented by or complemented another product in some way. after his death, it became obvious HE was the arbiter of that.
in the intervening 7 years, we have seen that complementary process go completely out the window. there is no cohesion across the product lines. it is like HP or IBM or any other corporate tech giant where no two product lines communicate or share a vision.
why drop lightning from the ipad pro at all? why not just add a fucking C port? there are dozens of examples like that where decisions were made for stupid reasons that in the end, are just user hostile, whether or not they had those intentions.
so my original point stands: this may not have been apple actively being greedy, but it was them being stupid. And Shawn King would tackle Rene Ritchie and beat him with a club in order to be the first person to post a “well of COURSE Apple is doing their best to manage the situation” commentary after we find out they have been strangling puppies in the back of Cafe Macs since 1992.
Tim Cook introduced the new Mac mini at the Brooklyn Academy of Music’s Howard Gilman Opera House by gesturing to the sky. What followed was a video titled ‘The Arrival’ depicting a Mac mini descending like a UFO from the nighttime sky into the desert, which turned out to be a nighttime wallpaper from Mojave, Apple’s latest macOS update. It was a fun introduction to a computer that was last updated in 2014 and that many Mac users had predicted would be discontinued.
As rumored, the new Mac mini is distinguishable from its predecessors by its new Space Gray color. The size is the same as previous models, but there are new ports on the rear and an entirely different set of components and layout inside. The new model is a little heavier too, coming in at 2.9 pounds, whereas prior models were either 2.6 or 2.7 pounds depending on the configuration. The new mini is also quieter, rated at 4dBA when idle compared to 12dBA for the older version, and it is made from 100% recycled aluminum.
Apple’s latest mini can be configured with 4 or 6 CPU cores. The 4-core model includes a 3.6GHz quad-core Intel Core i3 with 6MB shared L3 cache, while the 6-core model features a 3.0GHz 6-core Intel Core i5 with Turbo Boost up to 4.1GHz and 9MB shared L3 cache. Both models can be upgraded to a 3.2GHz 6-core Intel Core i7 with Turbo Boost up to 4.6GHz and 12MB shared L3 cache. According to Apple’s presentation, the new CPUs are up to 5x faster than before.
The new models also include Apple’s custom T2 chip. That adds Apple’s Secure Enclave coprocessor and allows for secure booting and encrypted storage. It also means fast HEVC video transcoding.
The new mini in action in the Apple event hands-on area.
The base models of both minis start at 8GB of 2666MHz DDR4 SO-DIMM RAM but can be configured with 16GB, 32GB, or 64GB. Storage options have been significantly increased too. The base model of the 4-core mini features 128GB of SSD storage, while the 6-core model starts at 256GB. Both SSD configurations can do 3.4GB/sec sequential reads. The 4-core model can be upgraded to 256GB, 512GB, 1TB, or 2TB of SSD storage and the 6-core model to 512GB, 1TB, or 2TB of SSD storage. Apple says the SSDs are up to 4 times faster than prior models.
Graphics for both minis are powered by an Intel UHD Graphics 630, which Apple says is up to 60% faster than before. The new mini can power two or three 4K and 5K displays, depending on the configuration. According to Apple’s technical specifications, the mini supports:
Two displays with 4096-by-2304 resolution at 60Hz connected via Thunderbolt 3 plus one display with 4096-by-2160 resolution at 60Hz connected via HDMI 2.0; or
One display with 5120-by-2880 resolution at 60Hz connected via Thunderbolt 3 plus one display with 4096-by-2160 resolution at 60Hz connected via HDMI 2.0.
As with past models, the mini also includes a healthy number of ports including:
4 Thunderbolt 3/USB-C
2 USB-A 3
3.5mm audio out
Gigabit Ethernet, configurable to 10Gb Ethernet
The mini also supports 802.11ac WiFi networking and Bluetooth 5.0.
To keep the mini cool, Apple has redesigned the power supply, added a bigger fan, and expanded the vents, doubling airflow.
To demonstrate the power of the new minis, Apple’s website includes statistics about how much faster the machines can perform tasks in popular Apple and third-party pro apps including Xcode, Logic Pro X, Final Cut Pro X, Pixelmator Pro, Adobe Photoshop CC, Autodesk Maya, and others. The numbers are impressive, but it’s worth keeping in mind how old the prior Mac minis had become when evaluating the speed increases.
The base configurations of the mini start at $799 and $1099 respectively and can be ordered today for delivery on November 7th.
As demonstrated in the hands-on area of today's event, the diminutive mini can be stacked for uses like build and render farms.
Overall, Apple seems to have delivered. We’ll need to see how these new computers compare on benchmarks against others in Apple’s lineup, but they are entering the market at a higher spot than the last generation models did, which should please many users.
Still, for the mini to remain a viable alternative long term, Apple needs to update it more often than once every four years. Hopefully, with the new internal design, the company has the headroom to make that happen.
i may not be among the first to order, but i will have one of these at home by year’s end. solid double up the middle for Apple here. this gives me hope for the new Mac Pro, but considering how well my 2008 Mac Pro is faring today, i don’t really see the need to splurge for one again at home. at work is largely the same. i need a lot of RAM, three monitors, and gig-e these days. when i bought my Pro i was doing development in various languages, a lot of virtual machine work, and didn’t have an army of servers to spread the load out to. these days if i need horsepower i want it racked and stacked and somewhere i don’t have to hear it, nor deal with the temperatures they produce or cooling they demand.
Microsoft is pledging our massive patent portfolio – over 60,000 patents – to Linux and open source by joining OIN this morning. If you're looking for signs that we are serious about being the world's largest open source company, look no further.
We know Microsoft’s decision to join OIN may be viewed as surprising to some; it is no secret that there has been friction in the past between Microsoft and the open source community over the issue of patents. For others who have followed our evolution, we hope this announcement will be viewed as the next logical step for a company that is listening to customers and developers and is firmly committed to Linux and other open source programs.