Sunday, September 23, 2012

On sealed white boxes

I finally got hold of a copy of Final Cut Pro X this summer (paid for, too). Initial impressions are that it's a fantastic video editing application, justifiably an industry leader and so many steps up from iMovie that it takes home video editing to a totally different place. Blah, blah, it's great, blah, blah.

Now the downer is, whenever I load it, I get the above error message before anything else happens, basically saying that the graphics card isn't good enough to handle this application. That little prompt isn't just something you can close and ignore - it impacts what I'm trying to do with the application. So far (I've still done very little in FCP X), it's led to dropped frames during playback, which is somewhat annoying. I expect that as I get to know FCP X better, I'll find other quirks of running the app on a feeble video card.

I checked the required specifications for running FCP X: an OpenCL-capable graphics card, or Intel HD Graphics 3000 or later. I didn't know anything about these, but the lowly card pre-installed in my machine is an ATI Radeon X1600 with 128 MB of VRAM, which basically means it's a bit of a seven-stone weakling as far as processor-intensive activities like video editing are concerned.

Wanting to find out if I could replace the graphics card myself (as I'd done with the RAM in my apparently now-ancient late-2006 20" iMac), I searched a few forums and then called around a bit to see if this could be done. Brighton being the kind of place that it is, there are a lot of Mac users down here, which means there are also a lot of support options beyond just going to the Apple Store and watching someone from the Genius Bar do a Google search. Thanks to the good folks at South Coast Computers, I had a very informative conversation with one of their engineers on Friday.

The short answer to this quandary is that I can't upgrade my graphics card. It seems that in this particular model, the card is soldered to the logic board - apparently not the case in all Macs and certainly not the case in most PCs. If I wanted a new card, I'd need a new logic board too, and as I have zero experience fitting Mac logic boards, I'd probably need to shell out on service fees as well. All in all, I was given a ball-park figure of around £500. Given that this is roughly half the price of a new iMac (and I'm not planning on getting one any time soon, as in most capacities this one is doing just fine), I decided against it and will just have to live with things as they are.

I don't tend to gripe at Apple very much; even though there are many criticisms that could easily be levelled at the company, I'm rather fond of their products. However, this situation did leave me slightly peeved. I'm not an experienced engineer, but given a blog page with photos of how to do something, I'm quite a happy hacker (having replaced an old iPod battery and the RAM in the iMac that way, for starters), so I'd prefer to be able to do this myself.

I suppose that once you buy a sealed box that is not designed to be opened by most people, you should expect things like this. Still, it's nice to have the option, even if most people don't use it.


idleformat said...

Very frustrating for you. I switched to Mac two years ago - in part thanks to your recommendation - and am happy to say I've never regretted it. I've gone on to use iOS devices off the back of that switch and am frequently amazed at how well they all work together. But there have been frustrations with Apple's decision-making along the way: the remorseless way features are left out of current products just to make you upgrade sooner rather than later; the lack of iCloud support in OS X Snow Leopard until I upgraded to Lion. Sure, small fry compared with your issue. But then this week, the iOS 6 Maps fiasco has really undermined my faith in Apple giving a damn about their products. Maps was an app that I used all the time. Now I can't. That Apple would knowingly release something so deficient and error-strewn compared with the previous Google-based version has really been a blow. To me it seems very cynical on their part to release something so... bad. But then, according to this Guardian article, Apple have been making mistakes for years:

globalism said...

Glad to hear that your transition to better computing has been a (mostly) successful move. Pleased to have also played a small part in that too!

When I was getting my first computer, about 12 years ago, I was about to start a graphic design course. All the advice I was given for that was 'go Mac', but I was also advised that as Windows and PCs were much more widely used, they were a safer bet. That first PC (running Windows ME, no less) lasted a total of three years before it died completely, having been significantly put through its paces. The laptop that followed it, which I took to Japan with me running the more-reliable XP, lasted just as long. In both cases, the final year was a real struggle. I made three albums on those two PCs, but was never that impressed with how they handled.

I went into a store in Tokyo's electronics district, looking for a new machine to replace the one that was dying. There were rows of sleek Sony Vaios trying to tempt me and I was almost swayed. However, they were running Vista (shudder) in Japanese only and it was really difficult to switch the whole OS to English. I was about to walk away and rethink when the salesman I'd been speaking to called me back and asked if I'd considered a Mac. With a single, unobtrusive dropdown menu, I could switch easily between OS languages. It was an educational moment - Microsoft wanted users to stay in their silos, while Apple understood that the modern global citizen wanted more than that.

That first iMac is still going strong (enough) six years later, with little sign of the unstoppable death spiral that both of my previous PCs entered into. Although it was more expensive, it's proved itself cheaper (or at least more cost-effective) in the long run. From the slickness of the interface and the seamless way that apps within the OS and other parts of the ecosystem integrate with each other, to the attention paid to engineering precision and the fact that the CEO who drove the company was a college dropout who geeked out on typography classes, I'd long found them hard to resist. However, I also recognise that the devotion they inspire borders on the bizarrely cultish, that someone had to sweat very hard to build my touchscreens, and that Apple are, after all, a huge corporation with every intention of remaining so.

Google says 'don't be evil'. They sorted out the problem of how to find stuff on the Net, but then reached into every aspect of our lives and tried to sell us personalised advertising on the back of it. Facebook made it even easier to connect with people from all around the globe, yet on several occasions made just the same kind of corporate decisions about its userbase. These are the corporate giants that shape our world, in much the same way as others have shaped their generations. Apple are trying to push Google off their platform because it's the data that has the value. It's not that iOS 6 Maps is a bad app; it's that the data in it is poor, which is fixable (although admittedly a huge task if the problem is that widespread).

For me, I want to live in a world where I can harness the computing power around me, and that means Apple products as my platform of choice. They're also big enough to take it if I have a little pop back at them too.

As Dylan said, 'don't follow leaders, and watch your parking meters'.