CoreText Font Rendering Bug Leads To iOS, OS X Exploit

timothy posted about a year ago | from the click-carefully dept.

Bug 178

redkemper writes with this news from BGR.com (based on a report at Hacker News), excerpting: "Android might be targeted by hackers and malware far more often than Apple's iOS platform, but that doesn't mean devices like the iPhone and iPad are immune to threats. A post on a Russian website draws attention to a fairly serious vulnerability that allows nefarious users to remotely crash apps on iOS 6, or even render them unusable. The vulnerability is seemingly due to a bug in Apple's CoreText font rendering framework, and OS X Mountain Lion is affected as well."


Who says? (0)

fnj (64210) | about a year ago | (#44707397)

Android might be targeted by hackers and malware far more often than Apple's iOS platform

Says who?

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44707431)

Says who?

Reality

Re:Who says? (3, Insightful)

smash (1351) | about a year ago | (#44707585)

Targeted != exploited. They're both targeted; it's just that Android is a lot easier to exploit because there is so much junk out there without any updates.

Re:Who says? (3, Insightful)

chowdahhead (1618447) | about a year ago | (#44708539)

I think Android is targeted more because it isn't inherently tied to the Play store, and not so much because of devices not being updated. The app signature verification works for 2.3 and up, which covers 96% of Google's Android devices. Getting malware on a phone or tablet still generally requires installing a malicious app, and it's far easier to be careless about that on Android.

Re:Who says? (5, Informative)

larry bagina (561269) | about a year ago | (#44707447)

Re:Who says? (3, Informative)

sootman (158191) | about a year ago | (#44707579)

Was going to post that but you beat me to it. The details:

Headline: "Four Out of Five Malware Menaces Choose Android"

80%? They make it sound so close! It's actually 100:1 for Android:iOS: "Android was targeted by an astonishing 79 percent of all smartphone malware that year... iOS was targeted by 0.7 percent of malware attacks."

The rest? Windows Phone and BlackBerry, 0.3%; Symbian, 19%.

Re:Who says? (4, Informative)

P-niiice (1703362) | about a year ago | (#44707639)

The freedom to allow apps permissions for your system brings risks. Read the permissions screen before clicking 'allow', folks.

Re:Who says? (3, Insightful)

ciderbrew (1860166) | about a year ago | (#44707709)

I do; but it's more like ... Find something that looks really good, then look at all the permissions it wants; but it shouldn't need all those permissions!! Feel sad about it and then don't install it unless drunk.

Re:Who says? (2)

NatasRevol (731260) | about a year ago | (#44707843)

Yeah, all the malware is avoided if you don't click allow.

That's just damn funny.

Re:Who says? (1)

Anonymous Coward | about a year ago | (#44707967)

The freedom to allow apps permissions for your system brings risks. Read the permissions screen before clicking 'allow', folks.

Right, because having users manage their own risk profile has worked out so well in the PC/Windows world...

Re:Who says? (4, Insightful)

0123456 (636235) | about a year ago | (#44708363)

Right, because having users manage their own risk profile has worked out so well in the PC/Windows world...

Indeed. Letting someone else control your computer is much safer.

Android's big problem is that you have no way of saying 'no, I'm not giving this app that permission', and can only choose to install or not install the Fluffy Kitty Screen Saver that wants access to your filesystem, the Internet, and the ability to send SMS messages.

Re:Who says? (0, Flamebait)

girlintraining (1395911) | about a year ago | (#44707705)

"Android was targeted by an astonishing 79 percent of all smartphone malware that year... iOS was targeted by 0.7 percent of malware attacks."

Oh wow! That must mean iOS is much more secure! That's what I was supposed to say, right? Or maybe the iPhone isn't very popular [tuaw.com], and people aren't designing malware for it because they want to go for Fort Knox instead of a piggy bank.

Android:
79.3% marketshare.
80% of malware.

Ordinarily, I wouldn't need to explain this, but given that it seems I'm one of the few people left on Slashdot with any understanding of statistics, I'll make this simple: Your "secure" operating system's only real security is that it's too small to matter. This is like saying "DOS has the lowest rate of new malware infections of any OS on the market!" Well yeah. Nobody uses DOS anymore. And in a few years, nobody will use the iPhone anymore either... it fell 3% in marketshare in just the last three months. Even malware authors are abandoning it because it costs too much to develop for such a small rate of return.

Re:Who says? (2, Insightful)

sootman (158191) | about a year ago | (#44707753)

Holy cow, your fanboy hat must be cutting off the flow of blood to your brain. Explain again why an OS with 4x the market share garners 100x the exploits?

Maybe, just maybe, there's more to it than market share.

"... it fell 3% in marketshare in just the last three months..."

iPhone sales ALWAYS drop this time of year because everyone knows a new one is coming this Fall. It'll be back up in another few months... and then maybe down again, and then up again...

Re:Who says? (0, Troll)

girlintraining (1395911) | about a year ago | (#44708249)

Holy cow, your fanboy hat must be cutting off the flow of blood to your brain. Explain again why an OS with 4x the market share garners 100x the exploits?

You're reading the statistics wrong. But whatever, you get +1, I get -1, because you're not a fanboy who made a personal attack, and apparently my quoting statistics was too inflammatory. Ah well.. yet more proof slashdot has gone to the dogs. Let's burn some more karma in a fruitless endeavor to explain to the fanboys statistics 101... because I'm bored and it's my lunch hour.

The dominant operating system with the largest marketshare has almost the same amount of malware being produced for it relative to its marketshare. This is precisely what you'd expect. It'd be like saying "A car that is driven by 80% of people also gets in 80% of accidents."

It does not have "100x" the exploits. It has "1x" the exploits. It has exactly the number of exploits you'd expect.

iPhone sales ALWAYS drop this time of year because

... irrelevant. Whether iPhone sales drop this month by 3% or not, they're still only clocking a 1:4.7 ratio of iPhones to Android phones. A 3% fluctuation means very little compared to the massive trend downward over the past several years. And that's what the malware authors are looking at.

So don't give me this "you must be a fanboy!" crap and then get all your hipster friends to downmod me... the facts are staring you in the face: The reason it has fewer exploits is because it has a small (and shrinking) marketshare, just like DOS, OS/2, etc. This is no different than the argument that Linux is more secure because nobody develops malware for it... yeah, sure, okay... but nobody uses Linux. Not as a desktop anyway. Malware authors go for the lion's share, not the outliers, and any security expert will tell you that Linux has had plenty of exploitable conditions in the past... but they weren't exploited because it wasn't as valuable to spend time developing one for Linux as it would be for the dominant OS -- Windows.

Re:Who says? (1)

Anonymous Coward | about a year ago | (#44708617)

It also comes down to the fact that most Android phone users are not computer savvy and just buy them because they are the cheapest thing at the Verizon store. And of course people with lower incomes tend to have less education and so less critical-thinking ability. Likewise people target malware at Windows because every moron who shows up at Best Buy and asks for the cheapest computer goes home with a Windows netbook; from there it's quite simple to get those kinds of people to willingly install all kinds of malicious bullshit.

Re:Who says? (4, Interesting)

RoboJ1M (992925) | about a year ago | (#44708681)

Agreed.

It's the same as Windows, you just target what gets you the largest return. Organised crime is a business, just like any other.
However there is still the walled garden thing, even if Apple went back up to a 50:50 market share with Android, Android would get targeted more because every Android user can choose to install any application and give that app the permission to email their bank details to Russia.

With iOS they have to wait for a good ol' fashioned buffer overflow before they can grab anything I guess.
Unless you get that with iOS too? I don't know I've never owned one.

But the 8:2 logic holds up; when the sample size is that large I'm guessing that's exactly the reason why.

Ultimately it's all moot.

If Apple had 100% of the market share this is what would happen:

The crims would send everyone SMS/emails with links to pages that asked them for their passwords, and X percent of users would give it to them.

No amount of security or walled gardens get around the fact most of you are really really thick.

You don't have to install Cute Kitty Wallpapers with internet, sms and bank details access.
Because that's all this "malware" is, it's not big or clever, 50% are just from the wrong side of the bell curve.

Oh, and I use Linux.
On the Desktop.
Well, I used to, because who the hell uses a desktop anymore anyway?
Have you seen this cute screensaver I found!!!

Re:Who says? (1)

UnknownSoldier (67820) | about a year ago | (#44708703)

> but nobody uses Linux. Not as a desktop anyway.

I work at a Fortune 50 company. Yes, 50, not 500. Currently researching Big Data on the GPU using Linux + CUDA + nVidia's nSight + GTX Titan. I'm in OSX, Linux, Win8 in that order. The other devs use command line + vim + git. The OPS guys use OpenBSD on the servers.

There is a surprising number of people using Linux. Heck, most of the contractors we have are using VirtualBox + Linux (Ubuntu).

You're talking out of your ass making assumptions. Unix, whether it be a Linux or BSD variant, is getting more and more popular.

Re:Who says? (2)

girlintraining (1395911) | about a year ago | (#44708875)

You're talking out of your ass making assumptions. Unix, whether it be a Linux or BSD variant, is getting more and more popular.

Sir, my grandpa lived to the age of 94, and he smoked four packs a day. Does that mean if I smoke four packs a day, I have nothing to worry about health-wise? I suppose the cognitive error you've made is clearer now. You're giving personal experience too much weight. Please show me a survey saying that, today, Linux as a desktop platform is at least half as popular as Macintosh is. The short answer is, you won't find one. At least not one that's been done properly. Saying it's "getting more and more popular" is not the same as saying it's popular now. Monocles are getting more and more popular too (steampunk cosplay)... it doesn't mean I can wander out into the street and find top hats and monocles everywhere.

Re:Who says? (2, Interesting)

StuartHankins (1020819) | about a year ago | (#44708887)

Marketshare for iOS will probably drop, but have you seen the average iOS user's statistics versus Android and others? Have you seen how much money iOS users spend versus the rest? Which is more used by business? You may understand statistics but you're missing out on the big picture here.

This is one of many reviews. http://techland.time.com/2013/04/16/ios-vs-android/ [time.com]

Re:Who says? (-1)

Anonymous Coward | about a year ago | (#44709069)

Nearly everyone in our office now has iPhones; they all dumped the Samsung crap and Android. The ones that haven't are still waiting for their contracts to expire before switching over.

The myth of the equal opportunity attacker (3, Insightful)

benjymouse (756774) | about a year ago | (#44709449)

Holy cow, your fanboy hat must be cutting off the flow of blood to your brain. Explain again why an OS with 4x the market share garners 100x the exploits?

Attackers will *always* try to attack the biggest target. They are not equal-opportunity; they do not meet to work out quotas so that OSes get attacked according to their market share.

Say you joined a shooting competition: You can shoot at two targets, equal size and equal distance, no objective difference at all. The only difference is that each time you hit target A, four people will give you $10 each, and each time you hit target B, only one person gives you $10. You have 10 rounds. How do you distribute your rounds between the two targets? Do you fire 8 shots at target A and 2 shots at target B because that would be the most fair thing to do, or do you fire all 10 shots at target A?

Maybe, just maybe, there's more to it than market share.

There might be. When you see people start taking shots at B, despite the higher reward of hitting target A, you can conclude that some factor causes them to *not* go for the higher reward. Somehow target A must have become harder to hit, the reward must be going down, or the shooters' skills allow them to hit target B more easily.

But all other things being equal, prudent attackers who are in it for the rewards will go for the higher market share, every time.

Re:Who says? (1, Insightful)

gnasher719 (869701) | about a year ago | (#44707777)

Android:
79.3% marketshare.
80% of malware.

That may look good to you, but it isn't. If you had 100 pieces of malware, and each affected 1% of the possible users, then you would have 80 pieces of Android malware and 20 pieces of other malware, so an Android user would have an 80% chance of being affected, while other users would only have a 20% chance.

It may give an explanation why there is so much malware, but it doesn't help you. (BTW iPhone was said to be attacked by 0.7% of all malware, which makes every iPhone user about 100 times safer. And all iPhone users have bought an expensive phone, while the high Android numbers come from all the cheap Android phones around, so your "Fort Knox vs. piggy bank" comparison is a bit stupid.)
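To make the arithmetic in the parent concrete, here is a rough Python sketch using only the thread's numbers (100 hypothetical malware families, an assumed 1% infection rate per family, split 80 for Android and roughly 1 for iOS per the 0.7% figure). The parent's "80% chance" simply adds the risks; treating the families as independent gives a lower number, but the gap between the platforms is the same either way:

# Illustrative only: the per-family infection rate and family counts are
# the thought experiment's assumptions, not real measurements.
p_per_family = 0.01
families = {"Android": 80, "iOS": 1}   # ~0.7% of 100, rounded up

for platform, n in families.items():
    # Chance of encountering at least one family, assuming independence.
    p_any = 1 - (1 - p_per_family) ** n
    print(f"{platform}: {n} families, {p_any:.1%} chance of hitting at least one")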

Re:Who says? (0)

girlintraining (1395911) | about a year ago | (#44708527)

It may give an explanation why there is so much malware, but it doesn't help you.

I wasn't aware this was an issue of "help". I think it's more like when you look at every other case where a given piece of software or technology was used by the overwhelming majority (above 67%), and there was the potential for profit if it could be exploited, the majority of exploits targeted that piece of technology.

It's basic economics; malware isn't any different than game development. Why is everyone going for the PS4 instead of the XBone? Economics. Why did the PS2 curb stomp all the others? Economics. Developers go with wherever the most sales are going to be, and it doesn't matter whether you're making a legitimate or illegitimate product.

That's my point here. Android has the majority marketshare. And it has almost the exact same amount of malware. Android is the average case. Outliers have more flexibility -- the XBone may be easier to develop for. Macintosh might be easier to use. Outliers need niche markets -- so they build on whatever comes their way.

To say that this universal statistical truth bestows upon iOS some intrinsic extra security is stupid. It's not intrinsically better... it's accidentally better. And if iOS had the majority marketshare, then it would be Android hawking ease of use, or better security, or whatever. This isn't a case of the design being better (or worse)... it's a case of the design not being popular. That's the only variable here that's really meaningful.

I can provide case study after case study showing that as the popularity of the platform reaches a critical mass, the number of exploits jumps. In fact, the rate of exploits being generated on a platform almost always follows the number of applications being developed over the same time frame.

You're trying to argue that technology is the reason for this difference, when the reality is it is economics. Just like everyone else here. The iPhone isn't special in any way; it follows the same trends, the same economic forces, etc., as everything else in IT. Sorry.

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44708779)

> In fact, the rate of exploits being generated on a platform almost always follows the number of applications being developed over the same time frame.

Except the iOS App Store has more apps and more installed apps, so it looks like your theory fails.

Re:Who says? (2, Funny)

Gilmoure (18428) | about a year ago | (#44707819)

Exactly! Apple's never been a big enough target or had enough users to make anyone want to hack them. As for those Apple (l)oosers? Just think how boring their lives have been for all these decades, not getting the real experience of using computers but stuck just playing quietly with their toys. Stoopid loosers!

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44708837)

Yeah, using Final Cut Pro to produce motion pictures for Hollywood, what a bunch of losers! They could have been playing Butt Commando 4 in their mom's basement!

Re:Who says? (3, Informative)

Anubis IV (1279820) | about a year ago | (#44708143)

Secure? Maybe, maybe not. Having less malware does not mean something is more secure, after all. More safe? Definitely so, since having less malware means that there is simply less danger. A walled garden in the countryside is more safe but less secure than an apartment with bars over all the windows in the middle of the city, after all, and safety is what is more important overall, rather than security.

Of course, that doesn't excuse a company failing to secure its products just because no one has attacked them yet, but by all indications, the "security through obscurity" argument doesn't hold much water in this case, given that iPhone users are consistently shown to be disproportionately profitable to target and that they continue to sell extremely well overall (even the report you linked cites the fact that this is an expected low as part of the regular product cycle for the line and that they expect the iPhone to recapture its lost market share with the launch of the new iPhone this quarter).

Long story short, Android appears to be less secure and less safe. Which is to be expected, given the fact that developers are able to do a lot more on Android than they can on iOS, so it's not without its upsides, by any means. But that added capability (and the fact that every carrier/manufacturer makes their own tweaks that can open up vulnerabilities) comes at a price, and in this case, it's security.

Re:Who says? (1)

girlintraining (1395911) | about a year ago | (#44709163)

Of course, that doesn't excuse a company to fail at securing their products, just because no one has attacked them yet, but by all indications, the "security through obscurity" argument doesn't hold much water in this case, given that iPhone users are consistently shown to be disproportionately profitable to target and that they continue to sell extremely well overall (even the report you linked cites the fact that this is an expected low as part of the regular product cycle for the line and that they expect the iPhone to recapture its lost market share with the launch of the new iPhone this quarter).

Let's say that iPhone owners are worth $30,000 each, and Android users are worth only $10,000. If Android users are 4.7x more numerous than iPhone users... then Android users are the logical target, if you can only target one group or the other. Now, who really thinks iPhone users have a net worth three times that of Android users? Android users, by the way, are 4.7x more numerous.

The rest of your argument is irrelevant. I don't have anything really to say to your security v. safety argument, because that's not what we were talking about and I have no desire to play the shifting goal posts game.

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44709531)

Just because OpenBSD has fewer exploits than Windows doesn't make it less secure... not all operating systems are created equal.

Re:Who says? (1)

Cimexus (1355033) | about a year ago | (#44707531)

Well that would be logical wouldn't it, given that Android is a more widely used platform. Hackers often try to get the biggest 'bang for buck' and target the most popular platforms (see also number of Windows viruses vs. Mac OS ones).

Re:Who says? (3, Insightful)

Joce640k (829181) | about a year ago | (#44707797)

Well that would be logical wouldn't it, given that Android is a more widely used platform

Not only that, it has a checkbox to allow you to install unsigned apps from uncontrolled websites.

Unsurprisingly, bad people upload malware to those sites. If you download it and click "yes", you'll get what you deserve, just like installing randomly downloaded exe files on PCs, etc.

Re:Who says? (2, Insightful)

Plumpaquatsch (2701653) | about a year ago | (#44708461)

Well that would be logical wouldn't it, given that Android is a more widely used platform. Hackers often try to get the biggest 'bang for buck' and target the most popular platforms (see also number of Windows viruses vs. Mac OS ones).

Are you claiming iOS was targeted far more than Android just 2 years ago?

Re:Who says? (1, Funny)

m1ndcrash (2158084) | about a year ago | (#44707569)

Hipsters have already spent all their money on Apple products, so their bank accounts are empty. You go after MS and Android users, who are smart and have savings.

Re:Who says? (1)

smash (1351) | about a year ago | (#44707599)

lol.

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44707783)

Flamebaitey, sure, but I lol'd. Well-played, but I'm calling the mod points a wash here.

Re:Who says? (0)

Anonymous Coward | about a year ago | (#44709037)

iOS is the jackpot; Android users are too cheap and poor to be worthwhile to hack.

the difference (0)

Anonymous Coward | about a year ago | (#44707401)

The difference is that in a week Apple can have this patched and prompting users to install the update from iTunes and the springboard, complete with red notification on the settings icon.

Re:the difference (0)

Anonymous Coward | about a year ago | (#44707417)

You are so funny. Apple has a long history of not bothering to patch things up or waiting 3 or 4 months before deigning to offer a patch.

Re:the difference (-1)

Anonymous Coward | about a year ago | (#44707509)

And with Android I can patch it myself in a matter of minutes.

Re:the difference (0)

NatasRevol (731260) | about a year ago | (#44708087)

Sweet.

Tell us how you can change the core font rendering in Android.

Re:the difference (0)

Anonymous Coward | about a year ago | (#44708245)

New firmware?

Re:the difference (1)

NatasRevol (731260) | about a year ago | (#44709019)

LOL. Abso-fucking-lutely not.

Re:the difference (1)

AmiMoJo (196126) | about a year ago | (#44707641)

Google can roll out system patches via Play too. It does it now and again to deal with serious security issues, or provide new features. The patches can affect all versions of Android, at least as far back as 1.5.

Can vs. Will (1)

SuperKendall (25149) | about a year ago | (#44707757)

Google can roll out system patches via Play too.

Will they for a vulnerability that spans v2.x to 4.x?

CAN they across every single Android device?

The difference is that currently well over 90% of devices are running iOS 6...

Re:Can vs. Will (1)

AmiMoJo (196126) | about a year ago | (#44709121)

Yes and yes. For example, they recently rolled out a system update for the app signature spoofing vulnerability and every version of the OS got it, on every device with Google Play (i.e. 99% of them; only major forks like Amazon's Kindle OS were not covered).

Whew ... (-1)

Anonymous Coward | about a year ago | (#44707413)

A post on a Russian website draws attention to a fairly serious vulnerability that allows nefarious users to remotely crash apps on iOS 6, or even render them unusable.

Good thing Apple stopped giving me updates to my 1st gen iPad, so I'm safe.

Of course, the fact that I've mostly stopped using it in favor of my Android tablet might also help here.

Re:Whew ... (0)

Anonymous Coward | about a year ago | (#44707997)

Are there any Android tablets released around January 2010 that can be updated?

Do I even have to put in a date?

Re:Whew ... (0)

Anonymous Coward | about a year ago | (#44708873)

One of the first Android tablets (and possibly the first to be worth a damn), the original Galaxy Tab, is upgradable to 4.2 (with 4.3 on the way):

http://get.cm/?device=p1 [get.cm]

Re:Whew ... (1)

Steve Max (1235710) | about a year ago | (#44709067)

From the same time frame, the encore (B&N Nook Color) is 100% supported on CM10.2 (or Android JellyBean 4.3):
http://get.cm/?device=encore [get.cm]

yup, it's real. (0)

Anonymous Coward | about a year ago | (#44707415)

Awesome, the comments even contain the string that causes the Chrome page to crash!

Typical of Apple (0)

Anonymous Coward | about a year ago | (#44707433)

...The report claims that Apple has been aware of this vulnerability for six months and has yet to patch the exploit in any currently available operating system build.

Pretty well known. Even if you report a bug to Apple and they acknowledge it they will drag their feet to actually fix it. Pretty stupid given they have possibly the best digital distribution channel with updates and stuff.

Maybe it's that they are afraid of losing their "it just works" image if people notice they keep pushing patches like the rest of the industry...

Re:Typical of Apple (1)

jellomizer (103300) | about a year ago | (#44707571)

Or, you know, perhaps there are things like actually testing to make sure the patch works across their product lines. Or evaluating the risk of the flaw and deciding to put the fix in the next update, vs. just patching over and over again.

I remember back when Microsoft started its security initiative after XP was released. There were a lot of security updates, and often they would end up breaking more stuff than they fixed, because they didn't spend the time testing them.

Re:Typical of Apple (1)

bill_mcgonigle (4333) | about a year ago | (#44707807)

Maybe it's that they are afraid of losing their "it just works" image if people notice they keep pushing patches like the rest of the industry...

Gosh, I'd hope it would be the opposite. People do care that "it just works" but nobody expects it to be "born of perfection". Rapid response to issues would be part of "just working".

Re:Typical of Apple (2)

Cinder6 (894572) | about a year ago | (#44708355)

It's fixed in the current iOS 7 beta.

Character-based displays FTW! (5, Funny)

sootman (158191) | about a year ago | (#44707445)

I am totally safe.

Re:Character-based displays FTW! (1)

jellomizer (103300) | about a year ago | (#44707595)

Does it do Color?

Re:Character-based displays FTW! (0)

Anonymous Coward | about a year ago | (#44707703)

Yes, but not on slashdot

Re:Character-based displays FTW! (0)

Anonymous Coward | about a year ago | (#44707735)

#FF0000
#FF7F00
#FFFF00
#00FF00
#0000FF
#4B0082
#8B00FF

It does the whole rainbow, baby!

iOS doesn't have exploits (3, Insightful)

0xdeadbeef (28836) | about a year ago | (#44707493)

It has jailbreaks, and that's a good thing.

Re:iOS doesn't have exploits (1)

bill_mcgonigle (4333) | about a year ago | (#44707779)

I thought Apple added address space randomization back in Leopard? What happened?

Re:iOS doesn't have exploits (5, Informative)

gnasher719 (869701) | about a year ago | (#44707825)

I thought Apple added address space randomization back in Leopard? What happened?

The problem that was reported leads to a crash. A crash is _safe_. An attacker can't gain any advantage by crashing your computer. They can merely annoy you.

Address Space Randomization cannot prevent crashes. Its purpose is to prevent crashes being turned into exploits. An attacker does two things: Find a way to make your software fail, then find a way to turn that failure into an advantage for the attacker. The second part is where Address Space Randomization comes in. The next step is Sandboxing, where even if the attacker finds a way past ASR and takes over your code, your code would be in a sandbox and can't do any harm outside.
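If you want to see the randomization half of that in action, here is a small Python sketch (mine, not the parent's): print the load address of a libc function and run the script twice. On systems that randomize library bases per process (typical Linux configurations; macOS historically slides its shared cache only once per boot), the address changes between runs, which is exactly what deprives an attacker of a fixed jump target.

import ctypes, ctypes.util

# Locate and load the C library (assumes find_library can see one on this
# system), then ask where one of its functions lives in this process.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
addr = ctypes.cast(libc.printf, ctypes.c_void_p).value
print("printf is at", hex(addr))

# Run this twice: with per-process ASLR the two addresses should differ.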

Re:iOS doesn't have exploits (2)

bill_mcgonigle (4333) | about a year ago | (#44708305)

But the GP was referring to jailbreaks - I thought those were exploits "used for good"?

Re:iOS doesn't have exploits (1)

gnasher719 (869701) | about a year ago | (#44708553)

But the GP was referring to jailbreaks - I thought those were exploits "used for good"?

If you have an exploit, you can use it for good or evil. On the other hand, if it is an exploit where the device owner has to do things actively (like downloading an app, connecting the device through USB cable, running the app, clicking five buttons on the device) then there is no danger except the possibility of trojans, so Apple doesn't need to fix it. If it is an exploit that could be used to attack unsuspecting users, then it _must_ be fixed.

Re:iOS doesn't have exploits (1)

bill_mcgonigle (4333) | about a year ago | (#44708753)

Yes, but address space randomization was supposed to make those exploits (mostly buffer overflows) obsolete, regardless of their intent. Clearly that didn't work if there are still jailbreaks and/or other exploits.

Re:iOS doesn't have exploits (1)

iiiears (987462) | about a year ago | (#44709201)

What about a heap spray attack?

For crackers, a system crash is an invitation to explore what is possible. Is there more specific information available? Uninitialised pointers etc. are very worrisome.

Re:iOS doesn't have exploits (0)

Anonymous Coward | about a year ago | (#44708321)

Leopard only had library randomization. Snow Leopard had more significant ASR improvements.

Le sigh. (0)

girlintraining (1395911) | about a year ago | (#44707551)

Okay, am I the only one that thinks that if you can't design something that renders text onto a screen without it turning into the Ocean's Eleven of computer security, you're doing it wrong? Be honest now guys. I can understand this in something that needs to interpret complex animations of dancing toilet paper flying across my screen screaming "Buy meeeee, pleeeeeeease!" -- I don't approve, but I can see how someone could screw it up.

But text... really guys, I mean, really?

Re:Le sigh. (1)

smash (1351) | about a year ago | (#44707629)

Security in non-trivial code is hard. People insist on writing stuff in C and other "hard" languages. And this is the result. We probably should have switched to Ada a long time ago. Oh noes, it is 10% slower = no excuse. Just buy the 2.2GHz machine instead of the 2GHz one.

Re:Le sigh. (0)

Anonymous Coward | about a year ago | (#44708273)

>Just buy the 2.2ghz machine instead of 2ghz.

Looks like somebody's stuck in the '00s. Did you consider that most code Apple writes is going to run on mobile devices of one kind or another, from MacBook Pro to iPhone? Nobody wants to waste 10% of their battery just so the programmers don't have to know how to use pointers...

Re:Le sigh. (5, Informative)

Derek Pomery (2028) | about a year ago | (#44707655)

Did you know that TTF fonts are Turing complete?
http://en.wikipedia.org/wiki/True_Type_Font#Hinting_language [wikipedia.org]

"It really worries me that the FreeType font library is now being made to accept untrusted content from the web.

The library probably wasn't written under the assumption that it would be fed much more than local fonts from trusted vendors who are already installing arbitrary executables on a computer, and it's already had a handful of vulnerabilities found in it shortly after it first saw use in Firefox.

It is a very large library that actually includes a virtual machine that has been rewritten from Pascal to single-threaded non-reentrant C to reentrant C... The code is extremely hairy and hard to review, especially for the VM."

http://hackademix.net/2010/03/24/why-noscript-blocks-web-fonts/ [hackademix.net]

Re:Le sigh. (1)

girlintraining (1395911) | about a year ago | (#44708131)

Did you know that TTF fonts are turing complete?
http://en.wikipedia.org/wiki/True_Type_Font#Hinting_language [wikipedia.org]

That doesn't excuse the fact that it's totally unnecessary. They've created an entire virtual machine for the sole purpose of font rendering. Doesn't that strike you as just a little bit over the top? Text is just symbols arranged on the screen -- I'm certain better ways of doing this could be imagined that wouldn't require an exploitable VM with root permissions.

I don't care if it's turing complete or not, it's irrelevant. They've taken one of the most basic functions of a computer and managed to overly-complexify it to the point that it needed administrative permissions to do its job. This is like using a nuclear-powered hand drill! It's completely retarded, and when it melts down, it takes the entire city with you, instead of just a 2x4 and a nail.

Re:Le sigh. (3, Informative)

iluvcapra (782887) | about a year ago | (#44708319)

Desktop publishing has used embedded, Turing-complete languages for decades -- TeX is Turing-complete, as is XSLT. It's the best and most compact way of specifying an abstract image for a generic rasterizing displays of arbitrary resolution.

Re:Le sigh. (1)

girlintraining (1395911) | about a year ago | (#44708915)

Desktop publishing has used embedded, Turing-complete languages for decades -- TeX is Turing-complete, as is XSLT. It's the best and most compact way of specifying an abstract image for a generic rasterizing displays of arbitrary resolution.

No, it's not the best way; It has handed someone a root exploit. And it isn't the most compact way either -- because obviously it grew to such complexity that it became part of the kernel. These are design failures. If you cannot figure out a way to put pixels on the screen without getting yourself rooted, you're doing it wrong.

Re:Le sigh. (2)

Kielistic (1273232) | about a year ago | (#44709249)

Until someone gives us a better way I think I'll take the word of experts in the field over yours.

Re:Le sigh. (0)

Anonymous Coward | about a year ago | (#44709373)

Desktop publishing has used embedded, Turing-complete languages for decades -- TeX is Turing-complete, as is XSLT. It's the best and most compact way of specifying an abstract image for a generic rasterizing displays of arbitrary resolution.

No, it's not the best way; It has handed someone a root exploit. And it isn't the most compact way either -- because obviously it grew to such complexity that it became part of the kernel. These are design failures. If you cannot figure out a way to put pixels on the screen without getting yourself rooted, you're doing it wrong.

It's the best way, because it's the only way to effectively lay out type. There isn't a safe, simple way.

Okay, I lied, there is: use a typewriter. Guaranteed no risk of compromise there.

Re:Le sigh. (1)

tibit (1762298) | about a year ago | (#44708639)

Sigh. Fonts are programs, and have been for a long while now. Is that news to you? You must never have seen what it actually takes to render a font, or you'd understand that. Be thankful those are not PostScript fonts, because those would have been even harder to implement safely. The TTF hinter execution environment is much simpler.

Re:Le sigh. (0)

Anonymous Coward | about a year ago | (#44708989)

Have a Snickers. You're raging today.

Re:Le sigh. (0)

Anonymous Coward | about a year ago | (#44709099)

It's necessary if you want to do fonts beyond monospaced ASCII. Rendering type on a low-resolution screen, while still making it look like the actual printed output, is extremely difficult. And yet that's exactly what graphic artists, who are Apple's most loyal customers, are most concerned with.

You can't just use a simple scaling algorithm to render a letterform into a 12-pixel-high version of itself and call it a day. It'd be blocky and curves would get chopped off, so you'd have open O's and B's and such, and you'd have weird whitespace in between certain letters. So fonts have elaborate hinting rules, with if-then conditions and everything, to handle all the cases of point sizes and scale factors and styles. And then you need to support combined characters forming composed glyphs, characters represented by different glyphs based on which characters they're adjacent to, non-RTL ordering, and about a million other gotchas.

That's why LaTeX doesn't even try to render to the screen. It's hard.

Re:Le sigh. (1)

wiredlogic (135348) | about a year ago | (#44708507)

worries me that the FreeType font library is now being made to accept untrusted content

FreeType has an auto-hinting engine originally developed to get around the TTF hinting patent. It is possible to configure FT to never interpret the hinting bytecode at all.
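A sketch of what that looks like from Python, assuming the freetype-py binding and a hypothetical font file path (the flags are standard FreeType load flags): FT_LOAD_FORCE_AUTOHINT bypasses the font's own hinting bytecode in favour of the auto-hinter, and FT_LOAD_NO_HINTING skips hinting altogether.

import freetype  # the freetype-py binding (an assumption; pip install freetype-py)

face = freetype.Face("SomeFont.ttf")   # hypothetical font path
face.set_char_size(48 * 64)            # 48 pt, in FreeType's 26.6 fixed-point units

# Render a glyph without ever running the font's embedded hinting programs.
face.load_char("A", freetype.FT_LOAD_RENDER | freetype.FT_LOAD_FORCE_AUTOHINT)
bitmap = face.glyph.bitmap
print(bitmap.width, "x", bitmap.rows, "pixels")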

Re:Le sigh. (1)

UnknowingFool (672806) | about a year ago | (#44707897)

Um, this isn't fixed ASCII text. Dynamically scaling fonts like TrueType ones are almost images, especially with internationalization. You might think printing out "A" is easy, but you don't see that the device had to scale that A to a certain size and draw it differently depending on the dimensions prescribed by the font definitions (Serif, Sans Serif, Cursive, Italic, Bold, etc). Also, if you want to do any business in places like China, your font rendering engine better be able to handle the complexity. Not that the bugs are excusable, but it's not as easy as you think it is.

Re:Le sigh. (3, Interesting)

VortexCortex (1117377) | about a year ago | (#44708851)

Okay, am I the only one that thinks that if you can't design something that renders text onto a screen without it turning into the Ocean's Eleven of computer security, you're doing it wrong? Be honest now guys. I can understand this in something that needs to interpret complex animations of dancing toilet paper flying across my screen screaming "Buy meeeee, pleeeeeeease!" -- I don't approve, but I can see how someone could screw it up.

But text... really guys, I mean, really?

I really get where you're coming from... However, Unicode is a PITA to implement, what with multiple glyphs for compositions / decompositions and BIDI (text direction rules) -- which change depending on paragraph direction and state machine. That's just the character encoding! To actually render the fonts there's a tiny VM that decodes the glyphs and handles sub-pixel hinting, etc. A bitmap ASCII (CP437) font? Done. I can crank one out in an hour, tops... Unicode w/ TrueType or FreeType? Ugh. I mean, just getting the character property tables from the Unicode site downloaded and transformed from CSV into the format we need is a project in and of itself. Given the bugs in every last 3rd-party library I've ever encountered (even libPNG), I'm hesitant to use others' code unless I have to (I have a higher standard -- input fuzzing, code coverage and unit testing for everything), but bugs in today's text rendering systems aren't just expected, they're a given -- it's literally the first thing I attack, and almost every time it works against new code: embedded invalid surrogate pairs, and over-long forms. [wikipedia.org]

Ah, but everyone's doing it wrong but you? Well, let me tell ya something: If you set out to make the closest to the metal compilable language that's not ASM, it'll work just like C does (C is a product of the architecture more than anything). Same goes for making a minimal font rendering system that covers all the world's languages -- Try it, it'll end up almost exactly like TrueType & Unicode because they're products of their environment too.

Now, that's not to say I don't agree with you to some extent. I'd say humans need to ditch all the BS and start from scratch to create a language that's easy to OCR with syntax and grammar that's extensible and non ambiguous and thus interpretable by machines. Do that and "natural language processing" is a no-brainer (literally). We get away with as few as 16 glyphs for the Virgon (Galactic) language -- Designed for ease of deciphering from examples using mathematics, incrementally graduating up to a small Von Neumann "VM" and then including "instructional" programs to then teach the rest.... So, yeah, you damn dirty apes did do it wrong, but if your sunk cost fallacy doesn't keep you doing it wrong you'll be the first lifeforms in the Super Cluster to do it right before you've solved the Fermi Paradox.
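As a concrete illustration of the fuzzing inputs mentioned above, a strict UTF-8 decoder is supposed to reject over-long forms and lone surrogates outright; Python 3 happens to do both, so a quick sketch (mine, not the poster's test suite) looks like this:

# Over-long form: '/' (0x2F) illegally encoded in two bytes instead of one.
try:
    b"\xc0\xaf".decode("utf-8")
except UnicodeDecodeError as err:
    print("over-long form rejected:", err.reason)

# Lone surrogate: a UTF-16 artifact that is not a valid Unicode scalar value.
try:
    "\ud800".encode("utf-8")
except UnicodeEncodeError as err:
    print("lone surrogate rejected:", err.reason)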

Re:Le sigh. (2)

tlhIngan (30335) | about a year ago | (#44709371)

But text... really guys, I mean, really?

Obviously someone who thinks Unicode is just an extended character set. Unfortunately, it isn't, and it's why characters are referred to as "codepoints" (because you may need multiple codepoints to actually produce a character).

First come the many ways of expressing a codepoint as a string - UTF-8, UTF-16, and UTF-32 are just the most common variations (and there's also the whole big- and little-endian thing). And there are plenty of reasons why you'd want, say, UTF-16 over UTF-8 (especially if you want to move backwards through text).

Next, to support that expressiveness, Unicode has a LOT of character modifier values - things like right-to-left override (after that character, text is forced to be printed right to left), and marks applying diacriticals and other such embellishments to text. For one character printed, you can easily have half a dozen or more codepoints associated with it. (Note: This also makes copy and paste hard, because while the user may have only selected 1 character, that one character may have a few codepoints associated with it.)

And don't forget all sorts of typography related things that need to be done - hinting/leading/kerning needs to be done in order to at least make the text presentable. It's why TeX was created - because the general state of computer generated text and typography was degrading compared to traditional manual typesetting.

About the only way to make it "easy" is to abandon Unicode for ASCII and to enforce everything to be monospaced font. Which generally makes text look ugly.
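A small Python sketch of the codepoint-versus-character distinction described above (the é example is mine, chosen only because it is easy to see):

import unicodedata

composed = "\u00e9"        # é as one precomposed codepoint
decomposed = "e\u0301"     # 'e' followed by COMBINING ACUTE ACCENT

print(len(composed), len(decomposed))   # 1 vs. 2 codepoints for the same character
print(composed == decomposed)           # False until the strings are normalized
print(unicodedata.normalize("NFC", decomposed) == composed)   # True

# The same text also has several byte-level spellings of different lengths.
for encoding in ("utf-8", "utf-16-le", "utf-32-le"):
    print(encoding, composed.encode(encoding))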

Windows affected too? (2)

AmiMoJo (196126) | about a year ago | (#44707609)

The Windows versions of iTunes and Safari include the MacOS font rendering code so that they look identical to the Mac versions. If the code is vulnerable it seems that those applications may also be vulnerable, although at least it's an app level problem and thus not as serious.

Here's a link to the crasher string in question (5, Informative)

Anonymous Coward | about a year ago | (#44707637)

Here's a link to the crasher string in question:

http://pastebin.com/kDhu72fh

(warning: will crash Safari on OS X 10.8. Firefox doesn't crash.)

Re:Here's a link to the crasher string in question (1)

Anonymous Coward | about a year ago | (#44708223)

An example of the "offending string" itself, dumped to hexadecimal, is:

d8 b3 d9 85 d9 8e d9 80 d9 8e d9 91 d9 88 d9 8f d9 88 d9 8f d8 ad d8 ae 20 cc b7 cc b4 cc 90 d8 ae 20 cc b7 cc b4 cc 90 d8 ae 20 cc b7 cc b4 cc 90 d8 ae 20 d8 a7 d9 85 d8 a7 d8 b1 d8 aa d9 8a d8 ae 20 cc b7 cc b4 cc 90 d8 ae 0a

Have WebKit interpret that as UTF-8 characters to see the fun.
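If you would rather inspect those bytes than render them, here is a Python sketch that decodes the dump and prints escaped codepoints instead of glyphs (deliberately avoiding handing the decoded text to a CoreText-backed view):

hex_dump = (
    "d8 b3 d9 85 d9 8e d9 80 d9 8e d9 91 d9 88 d9 8f d9 88 d9 8f d8 ad d8 ae 20 "
    "cc b7 cc b4 cc 90 d8 ae 20 cc b7 cc b4 cc 90 d8 ae 20 cc b7 cc b4 cc 90 d8 ae 20 "
    "d8 a7 d9 85 d8 a7 d8 b1 d8 aa d9 8a d8 ae 20 cc b7 cc b4 cc 90 d8 ae 0a"
)

# Decode the bytes, then show escape sequences rather than rendered glyphs.
text = bytes.fromhex(hex_dump).decode("utf-8")
print(text.encode("unicode_escape").decode("ascii"))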

Re:Here's a link to the crasher string in question (3, Informative)

Cinder6 (894572) | about a year ago | (#44708317)

Confirmed Safari crash on 10.8. However, on iOS 7, it does not crash. It looks like this will be patched on mobile within the next couple of weeks. I can't test iOS 6, so I'll take others' word for it.

Re:Here's a link to the crasher string in question (1)

Smurf (7981) | about a year ago | (#44709387)

Yes, TFS fails to mention that both TFAs specifically state that neither iOS 7 nor OS X 10.9 Mavericks is affected by the bug.

Re:Here's a link to the crasher string in question (0)

Anonymous Coward | about a year ago | (#44708663)

" " I removed one character so it shouldn't be "active". I put it into google translate, Smoouhkh x-x-x Amartykh is output - A NAME?

Re:Here's a link to the crasher string in question (0)

Anonymous Coward | about a year ago | (#44708739)

Here's a link to the crasher string in question:

here [pastebin.com]

(warning: will crash Safari on OS X 10.8. Firefox doesn't crash.)

Fixed the link and...
It's just a mash of Arabic? Wow, that's anticlimactic. I was expecting something to actually employ the weird characters, but Apple has left itself vulnerable to random plaintext in a foreign character set.

Re:Here's a link to the crasher string in question (1)

femtobyte (710429) | about a year ago | (#44708945)

I suppose the "weird stuff" in there might be the block of U+03XX "combining diacritical" marks; so the string requires sticking a bunch of diacriticals over Arabic characters (which might invoke whatever fancy part of the code is broken). Someone with more time could play around with reducing this to a more "minimal" crash example.

Re:Here's a link to the crasher string in question (0)

Anonymous Coward | about a year ago | (#44709049)

I suppose the "weird stuff" in there might be the block of U+03XX "combining diacritical" marks; so the string requires sticking a bunch of diacriticals over Arabic characters (which might invoke whatever fancy part of the code is broken).

Except, that's how Arabic works. Not using combining diacritical marks in Arabic would be almost the same as nt sng vwls n nglsh.

Re:Here's a link to the crasher string in question (1)

femtobyte (710429) | about a year ago | (#44709097)

However, is this particular combination of combining diacriticals a "valid" one in Arabic? The U+03XX diacriticals are "general use" and not Arabic-specific, so this might be an unusual (or even "invalid") combination in Arabic orthography (which I don't know much about). Note, I seem to be able to get the crash just with the two characters + diacriticals "\xcc\xb7\xcc\xb4\xcc\x90\xd8\xae \xcc\xb7\xcc\xb4\xcc\x90" (telling Python to print that in Terminal.app under OS X 10.8.2 kills Terminal.app...). Is this a combination that should occur in "ordinary" Arabic text?
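One safe way to see what that minimal fragment contains is to list the codepoint names rather than rendering it (a sketch; the byte string is the one quoted above):

import unicodedata

fragment = b"\xcc\xb7\xcc\xb4\xcc\x90\xd8\xae \xcc\xb7\xcc\xb4\xcc\x90".decode("utf-8")

# Name each codepoint instead of drawing it through CoreText.
for ch in fragment:
    print(f"U+{ord(ch):04X} {unicodedata.name(ch, '<unnamed>')}")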

Re:Here's a link to the crasher string in question (0)

Anonymous Coward | about a year ago | (#44709573)

Doesn't matter if it should occur or not; it's a valid use case.
Countless items that are not orthographic get used anyhow for emoticons and business signage.
And with some languages, keyboards output 0x3xx codes directly rather than dead-keys now, which makes it easier to type things in for Twitter.

I've often put 0x3xx codes with Japanese, and symbols, and with random English letters as well. For effect: underlining a word for a tweet; slash-through a symbol or Japanese text.
I've even misused Arabic and Ogham characters with language-specific diacritics (non general use) to make some amusing smilies.
The o_o with the Kannada symbols that look like disgruntled eyes... add ` and ' to them to give eyebrows, and a combining underbar below for a goatee :3

"Nefarious users to" (2)

Spy Handler (822350) | about a year ago | (#44707645)

if the attacker has physical access to your machine, you're already toast.

Good thing Slashdot doesn't support Unicode! (5, Funny)

Anonymous Coward | about a year ago | (#44707739)

Otherwise someone would post it in the comments here and crash iPhone users' browser!

String (0)

Anonymous Coward | about a year ago | (#44707869)

The string that causes the crash is "Ballmer new Apple CEO".

ta30 (-1, Redundant)

Anonymous Coward | about a year ago | (#44707999)

are incompatible that has grown up systems. The Gay things in op38ating systems but many find it many of us are

Please tell me.... (1)

Lumpy (12016) | about a year ago | (#44708289)

That this can be used to get an ATV 3 cracked

Brings me back to the days of AOL. (0)

Anonymous Coward | about a year ago | (#44709235)

Ah, I remember when AOL allowed HTML formatting, and inputting a near-infinite font size in hexadecimal ("fffffffffffffffffffffffffffffffff" or larger) in an email or "IM" text would crash the computer.
Also, a similar bug for "font color ="

no one ever suspects the font!
