
FCC Rules Open Source Code Is Less Secure

An anonymous reader writes "A new federal rule set to take effect Friday could mean that software radios built on 'open-source elements' may have trouble getting to market. Some US regulators have apparently come to the conclusion that, by nature, open source software is less secure than closed source. 'By effectively siding with what is known in cryptography circles as "security through obscurity," the controversial idea that keeping security methods secret makes them more impenetrable, the FCC has drawn an outcry from the software radio set and raised eyebrows among some security experts. "There is no reason why regulators should discourage open-source approaches that may in the end be more secure, cheaper, more interoperable, easier to standardize, and easier to certify," Bernard Eydt, chairman of the security committee for a global industry association called the SDR (software-defined radio) Forum, said in an e-mail interview this week.'"
This discussion has been archived. No new comments can be posted.


  • by canUbeleiveIT ( 787307 ) on Friday July 06, 2007 @01:01PM (#19769517)
    Just goes to show how much a bunch of gov't bureaucrats know. Or maybe they're just being ass-kissy with business again.
  • Amusing (Score:5, Insightful)

    by ebbomega ( 410207 ) on Friday July 06, 2007 @01:01PM (#19769519) Journal
    Because Security Through Obscurity totally worked for:

    MPAA (DeCSS)
    Nazis (Enigma)
    Xerox (Robin Hood & Friar Tuck)
    Microsoft (just about any form of security they've ever had)

    and about a billion other examples
  • by Space cowboy ( 13680 ) * on Friday July 06, 2007 @01:03PM (#19769553) Journal

    If I'm trying to break into some code, and I can read the source code to determine how the author protected it, I'll have an easier job (note: "easier", not "easy") because I can home in on the algorithm the author used. I know whether it's Blowfish, DES, AES, IDEA, or a simple XOR or substitution cipher. I know what pre-encryption steps were taken, and what post-encryption algorithms were used.

    Let's say that in a moment of insanity, I decided to use a basic XOR encryption routine (create each byte in the encrypted stream by XOR-ing the corresponding source byte with every byte in the password save one, rotating that one as I iterate over the source). This is completely and utterly trivial to crack if you have the source code and *know* the routine I used. It's a repetitive cypher, so it's reasonably obvious unless the password is of significant length as well (a sizeable fraction of the source's length). Note the difference - it's easier with the source code.
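    A minimal sketch of that kind of scheme, simplified here to plain repeating-key XOR rather than the skip-one-byte variant described above (all names are illustrative, not anyone's actual code):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Repeating-key XOR: byte i is XORed with key[i % len(key)].

    XOR is its own inverse, so the same function encrypts and decrypts.
    """
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_cipher(b"attack at dawn", b"s3cret")
assert xor_cipher(ciphertext, b"s3cret") == b"attack at dawn"
```

    An attacker who knows from the source that this is a repeating XOR can recover the key length from periodic patterns in the ciphertext (e.g. Kasiski examination), which is exactly the "easier with the source" point.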

    Now that's a contrived example - no-one in their right minds would use an XOR cypher, but the same principle applies to harder encryption techniques. If you *know* what system was used to protect the source, you have an advantage over not knowing... Did they gzip the source before encrypting it ? Did they use ZIP, RAR, or 'compress' instead ? Did they XOR to hide the obvious compression header ? Is it inverted (last byte first) or was any other transformation done *before* the encryption stage to try and make it non-obvious that a successful crack had taken place ? These are all "knowns" if you have the source code...

    So, yes, it is easier when you have the source code. Security through obscurity is rightly derided, but not because it has no value. It is derided because it leads to the use of insecure encryption methods (small keys, using XOR/whatever instead of proper hard encryption, etc) and the fact that once the obscurity is cleared up, there's no more security. The idea is that if you are sufficiently confident that your encryption is unbreakable, you *can* document how you did it in public. That doesn't mean you *should*.

    The point though, and why I disagree with the regulators, is that if you're using hard encryption, it really doesn't matter whether it's *easier*, it's not *easy*. It is in fact still so damn hard, that we're talking "impossible in our lifetime(*)" - the relative comparison makes no sense. It's akin to measuring the height of Mount Everest at 6-month intervals - it's always pretty darn high, though you might find some variance due to snowfall.

    So, yes, they're right. But by not considering the (tiny) impact of their conclusion, they have made the wrong ruling.

    (*) Modulo the discovery of an easy way to crack the encryption technology, of course.

    Simon.

  • by KiltedKnight ( 171132 ) * on Friday July 06, 2007 @01:05PM (#19769581) Homepage Journal
    From TFA:

    The SDR Forum has cited the Secure Socket Layer (SSL), a widely used technique for securing e-commerce transactions, and the National Institute of Standards and Technology (NIST)'s public hash algorithms as evidence that open processes often yield the most highly successful security techniques.
    Very typical. First, they say that the stuff is not as secure as the "security by obscurity" method, then they go and say the most widely accepted and used method for secure web transactions is evidence that open source software yields the most highly successful security technique.

    And we keep voting the same crew into office who keep appointing the same bozos to the FCC... shame on us.

  • by boguslinks ( 1117203 ) on Friday July 06, 2007 @01:11PM (#19769715)
    from TFR:
    A system that is wholly dependent on open source elements will have a high burden to demonstrate that it is sufficiently secure to warrant authorization as a software defined radio.

    By this they probably mean, if the radio is open source then any DRM is useless, and this is insufficiently respectful of the benighted Copyright Holders of whatever is being played, thus it is "less secure."
  • by pavon ( 30274 ) on Friday July 06, 2007 @01:12PM (#19769719)
    I am somewhat perplexed as to why the FCC would need to be regulating the security of consumer devices. For organizations that need secure communications, there are already many government and private certifications that ensure this. But why on earth would they restrict consumers from purchasing non-secure software radios if they don't need them?

    Is this because they feel that software radios could be hacked to broadcast outside of their certified frequency and power limits? Or because they think they need to protect the public from buying 802.11 routers with crappy WPA implementations?
  • not about security (Score:2, Insightful)

    by mevets ( 322601 ) on Friday July 06, 2007 @01:16PM (#19769785)
    The security bit is just a cover story. This is about some perceived danger to the RIAA, MPAA and similar cartels.
  • by kebes ( 861706 ) on Friday July 06, 2007 @01:18PM (#19769811) Journal
    You're quite right. Obscurity does provide some level of security, though relying on it alone is a surefire way to have your security cracked eventually. (Whereas things that are cryptographically secure will not be cracked in my lifetime.)

    Another way to look at it (especially in the context of open source radio) is that whoever is implementing the security has finite resources (money, man-hours, whatever) at their disposal. For every hour they spend trying to obfuscate the inner workings, that is one less hour spent validating that it is *truly* secure (in the "cryptographically secure" sense). If you instead leverage open-source, then you have code that has been tested and vetted by experts the world over. Suddenly the hours spent on adding obfuscation would be a waste of resources: the code is already so secure that adding the slight additional security of obscurity is a waste of time.

    So, while obscurity does provide some kind of security... it is actually the most resource-wasteful form of security (a lot of effort for something that eventually gets cracked), whereas the more efficient security model is to focus on things that are fundamentally secure (in which case you may as well use open-source solutions, since you get to take advantage of work already done, and the marginal loss of obscurity doesn't end up mattering).
  • by Anonymous Coward on Friday July 06, 2007 @01:19PM (#19769825)
    The FCC has absolutely no power to regulate nor any say at all in how software radio or television are implemented.

    The FCC commissioners are deluding themselves, again, if they think Congress gave them the power to appoint monopolies.

    They have already been slapped down once with regards to the DTV Redistribution Control flag and they're about to be slapped down again.

    What's next, washing machines and clock radios?

    http://pacer.cadc.uscourts.gov/docs/common/opinions/200505/04-1037b.pdf [uscourts.gov]

    If the Foolish Child Commission can't remember the limits of their power, We the People will be more than happy to remind them, spank them and send them to their 'time-out' corner once again.
  • MoCSSRH (Score:2, Insightful)

    by gr3kgr33n ( 824960 ) on Friday July 06, 2007 @01:20PM (#19769841) Homepage Journal
    Well, if they [FCC] are going to take this stance, it is our duty to enlighten them as to the consequences of their actions.

    I would like to see a Month of Closed-Source Software Radio Hacks.

    Then they [FCC] will discover that since closed source software radios are not examined by independent, unbiased debuggers, the likelihood of bugs, bad encryption schemes, et al. is very high.
    Maybe then the government bureaucrats will see the merits of Open Source.
  • by Space cowboy ( 13680 ) * on Friday July 06, 2007 @01:20PM (#19769849) Journal
    Oh for [insert deity]'s sake, please don't tell them that... If they actually start thinking through every possible way someone could do harm on a plane, they'll shut down the airlines "for your safety and convenience"...

    At the end of the day, the most dangerous thing is an intelligent mind with the goal of doing harm. There is little-to-no way to protect against that, but it's not a politically acceptable truth, so they just make life difficult for everyone and hope for the best [sigh]. The *only* reason for all this is to protect *themselves* from a "you didn't do anything" accusation after the fact.

    If people would just accept that life == risk, we'd be a lot better off.

    Simon.
  • by eln ( 21727 ) * on Friday July 06, 2007 @01:22PM (#19769869)
    They believe what the people who give them the most money want them to believe. Welcome to government.
  • by newandyh-r ( 724533 ) on Friday July 06, 2007 @01:24PM (#19769905)
    If the end-user can modify the source with reasonable ease:

    They can easily bypass any "broadcast flag";
    They can remove restrictions on which channels a scanner can scan;
    They may be able to transmit on forbidden channels or at power levels above those permitted for a channel.

    That is the sort of hacking that frightens the FCC.

    Andy
  • by BitchKapoor ( 732880 ) on Friday July 06, 2007 @01:24PM (#19769907) Homepage Journal

    From TFA: The SDR Forum has cited the Secure Socket Layer (SSL), a widely used technique for securing e-commerce transactions, and the National Institute of Standards and Technology (NIST)'s public hash algorithms as evidence that open processes often yield the most highly successful security techniques.

    Very typical. First, they say that the stuff is not as secure as the "security by obscurity" method, then they go and say the most widely accepted and used method for secure web transactions is evidence that open source software yields the most highly successful security technique.

    And we keep voting the same crew into office who keep appointing the same bozos to the FCC... shame on us.

    These are two different groups. The FCC is advocating security through obscurity, while the SDR Forum is advocating open source. Get it?

  • Nonsense (Score:5, Insightful)

    by Anik315 ( 585913 ) <anik@alphaco r . n et> on Friday July 06, 2007 @01:28PM (#19769973)
    There's nothing inherently secure about closed source software or anything inherently secure about open source software. In fact, closed source software that is not secure when the source code is visible is not really secure at all.
  • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Friday July 06, 2007 @01:36PM (#19770061)
    That's what code burned into ROM is for -- or hell, EPROM or even EEPROM would be fine, so long as it can't be erased through normal operation of the device.

    If the FCC is that concerned about software radio operating out of spec (which I personally believe isn't really going to be a problem), then it should mandate hardware access controls on all radios.

    Ultimately, ANY solution that relies on locking down client devices is doomed to failure. People can, and do, tinker with their own devices.
  • Re:The FEDS (Score:3, Insightful)

    by wbren ( 682133 ) on Friday July 06, 2007 @01:36PM (#19770069) Homepage

    ALL the Federally APPOINTED people , are BUSH supporters, and they fail to know the law!
    We know who they are , and ignorance of the law is no excuse. BINGO !!!
    Shockingly, there are also plenty of Democrats that are ignorant of computer security issues. Sorry, but that's the truth, and I'm no Republican or Bush supporter myself. Ignorance of how to make a point is no excuse...
  • Ceteris paribus (Score:5, Insightful)

    by hey! ( 33014 ) on Friday July 06, 2007 @01:40PM (#19770129) Homepage Journal
    "Ceteris paribus" -- assuming "all things being equal", which they never are.

    True, if you have two equally boneheaded pieces of software, then exploits in the closed one are harder to divine -- not by much, but harder. On the other hand, if you have a piece of software that has survived years of public scrutiny by experts, that is presumptively harder to exploit than something some random engineer ginned up in secret.

    Something cannot be widely reviewed (which is the gold standard in security) and secret at the same time. So generally, I think open source represents the best security possibilities by far, and the worst by only a little.

    The ultimate problem is that broad statements like X is more secure than Y are meaningless. You have to specify the context and threat you are concerned with. Is an open source interpreter burned into a ROM inside a microwave oven more vulnerable than a proprietary interpreter? Well, against what? Same goes for the software radio thing.

  • by Overzeetop ( 214511 ) on Friday July 06, 2007 @01:41PM (#19770147) Journal
    Whoa, there. There are lots of ways to violate FCC regulations with off-the-shelf hardware. Whether it happens in hardware or software, it's still illegal. There's no reason that OSS can't comply; they're simply arguing that somebody could re-code it to be non-compliant. Hardly a valid reason for disallowing it.
  • Re:Amusing (Score:5, Insightful)

    by Martin Blank ( 154261 ) on Friday July 06, 2007 @01:42PM (#19770155) Homepage Journal
    When the Germans kept Enigma a secret, they did nothing more or less common than anyone else was doing, or still does for the most part. National governments by and large do not leave their communications to AES, but instead use (what they at least perceive to be) more secure methods. NSA still keeps our codes secret, Russia's FSB keeps its codes secret, and the UK's GCHQ keeps its codes secret.

    One of the advantages to this is that the limited distribution of a given code can (but does not always) limit the number of attacks against it. Whereas a commercial cipher may result in millions or even billions of ciphertexts to analyze, a government cipher may result in only thousands to work with, and it may be more difficult to determine plaintext aspects of a given document for comparative analysis. It's also generally difficult to get the actual cryptographic hardware without paying someone (either from inside or outside) to steal one.

    This doesn't work well at all for the kinds of things that the FCC covers, however. I can generate billions of ciphertexts with known plaintexts for some new wireless system, and I can also do analysis against the electronics involved to look for side-channel attacks. Hiding things for commercial items intended for the general public is fairly pointless.

    Side note: I'd not heard of the Robin Hood & Friar Tuck trick. That was some very fun reading. Thanks for brightening my morning a bit. :)
  • by Space cowboy ( 13680 ) * on Friday July 06, 2007 @01:42PM (#19770169) Journal
    The thing about pretty much all the discussion over 'security through obscurity' is that it compares a 'secure-because-obscure' to a 'secure-without-being-obscure' mechanism. I'm not saying that the use of a secure-through-obscure mechanism is a good thing, and if you read my post, you'll see that.

    My point was that if I'm using a hard-encryption mechanism, then I can additionally do things that would render a "cracked" result difficult to determine. If you know what you're looking for (i.e. the algorithm is open source), I can't do that. I wasn't trying to say "just use secure-through-obscure methods"; I was saying that they can have some value when also combined with hard encryption.

    I also disagreed with FCC (at the end of the post). It was sort of amusing to watch the moderations (up to 5, down to 2, up to 5, down to 3, up to 5). I'm left wondering whether those that moderated me down actually read what I wrote (and thought I was wrong), or just read the title of my post, and gave a knee-jerk response...

    Simon
  • by Harmonious Botch ( 921977 ) * on Friday July 06, 2007 @01:43PM (#19770177) Homepage Journal
    They are more familiar with the idea of secrecy and control than ideas like cooperation and standards.
  • by Dunbal ( 464142 ) on Friday July 06, 2007 @01:44PM (#19770191)
    That is the sort of hacking that frightens the FCC

          And with their infallible logic they conclude that closed source means you cannot remove restrictions, transmit on forbidden channels/power levels and bypass broadcast flags. Because no closed source program has ever been bypassed, modified or otherwise hacked. Sometimes within days or even hours of its release.

          When will these people learn that the PEOPLE have the power, not the government? We the masses obey ONLY when it suits us. If they have to go to such great lengths to try to limit us, perhaps what they are trying to do is not such a good idea after all? They just don't get it.
  • by AHumbleOpinion ( 546848 ) on Friday July 06, 2007 @01:47PM (#19770229) Homepage
    I am not agreeing with the FCC on this one, but I am going to defend "security through obscurity" a little, due to expected /. audience oversimplification and knee-jerking. At times "security through obscurity" is a perfectly valid and desirable approach when used *alongside* other good techniques. It is only bad when it is the foundation of your security. Note that I am only addressing the security angle and not addressing open source philosophy (or, for some out there, religion).
  • Exactly. Hey, FCC: Decrypt this:

    -----BEGIN PGP MESSAGE-----
    Version: GnuPG v1.4.5 (GNU/Linux)

    hQIOA3zQFkc0jOpLEAgAkeu9YYOYA2YLePtUm3tGthW7fBO1RN BM/EBDJ3FkQdfZ
    avUq5gRrYhZ/vwo5MfMe950/SpZcgaUpN4pOoNQQFEyD8QYMjB mnvU0sH0iUAvza
    oZvcvq7cxiswhUPwSFZPVz8vyGW0WqP6aTcRxF/EA71Jo2IbMs aoSMKv2T1Jkr04
    OnGhFO5hEhNkAPEpoIucdkVKMn3U+Cmj846vj/I4CIaLu99mHw p150fuSgI1Jfua
    8Ax9ztv9Krx74khTlOIwW/5nLKz6IXqDRn8YIehA3YmWuddFGg 7vcoMlMgmsficz
    /PJCe0acA5zvOuY1ISYnqB6aeAKe3caU+RY2MVDYxwgAv5+pdr Z1nyOaOzVFdVFD
    +qRRoX3CPt5BsQxjgCYvwc3yqi9anUGbxglOMj3xPHJKSdjzgK OPsbDiA0EJxbLZ
    YgFPU+rW6bk/HUnlu0vyavgp4f6fPCCHFYXKhFVbxU4i4uEx+t zZH3UB/qsFX+MA
    YyqWWBvUfTsG+rqKTqgtlM9YAz9VoxwrY7mls7TOdcIigKdeCH sF8qOMsAwQFT9M
    lcFBzpzDv2Bl6Puh8cN5cIPnJAI5W8M9792szOTxv2A+4wNQW0 6UipSCBYXuZ9/E
    +b3EtraDOg6ZZB5W/BdiQDBWeJlO/Kedm4tAhCuUObYtvlylri c3S11Eii/bYdPd
    kNLpAeyvgT/IjwxSabSmfCIrrQc0C1bk3z0BVoRdDYLmBbdddO b94OYMSBZUXG58
    SRcjfHked62COU2PtpeuYn6qSwCB+NRdVv5OgM6w6HE+iCkQ5L Z2dCHBuFMWPctd
    C7ykhLQWCja4a7EgJE99k48sSyWnvFwOKimINes8Mlfz8XuCST OGf+OOsfWjKzSv
    dgSJ3eXZJ/q2T6cGISbyPSiqeiekRo8h8iWncdgzsLIF+wu+hX G7IxlC7anmrd8U
    dG8LFVMnOIkp2BkJmQllbbpBBdu7x5govz0nCq+NFVUyZbnJKf JyLeGO3xe1j1mb
    le+vkdWQNHqRovRWukMmQXNfFamqMLoWe+P0Z7Nlgkhin9JgLd 6r+/QPUWsMeHQ1
    tBiI2RcHjXBcz/IvvohoUZf+HXcOye5Ly0dNnBJuXg/oswXBKZ zaVs173T3DK7ZT
    L0Lq1UDTEFd0LI3PdQ+KqtB7Rt9Xn0igliqffXVZ0VmBoskTs5 oKmX2DrrbjPuoM
    CPs5O9agZs3O8ULAQLz+rCZFOGtPqO3vhYxGmyBx9WxkekzpcA e1yeKMn4ZroYUW
    F45+DnxKGigrwpnNM5Ew9EUnmYwhWab2kXePdiK767Hu27qHjS Omc7EGfkZ6yj4B
    7ZlLkojiQKKlknQdn5nhfQpvNUBMDNcfIHCmkUoN+kKLJ3LAsD G/0gK5u+PRx8TV
    OLmaBQCsLgRIHhC0m2KctuVYioDCTHprGXB8eRaTfo/+q1tKis B+F+G3M0WzOPuB
    +H/rB1bvbRSjccGdDlu8DyfT9DnGHx5TZpj6DGhyfUMw20hY1h 9qpNgjHoo5531R
    x4gKjozWFIoj/DqMPcI2BiYZ2kJHSDBQUal0CUobgl3AK7yjZP uuKUlXz3PjslA3
    2icnOi1qP262vydWZaEPkBdSozFyatk1lzDwF/oXvkvyz3XVDI Om8nGg0JRhgPas
    xyy7ptd4WV92FRR9hEQRhpfZqBAy90oLPudxUQ74sWCSjI6Kw1 vXm1/BiXjlj0tk
    d77v/UGaFRc5/vDeKYS45b2NbOsVno4DjkLI9pWNTDNfOpgll0 /tfWpei9W8Ycyy
    1gxpuRsv8DkuhJJn/HO9i7Aa6zYGPMhqo97eTsf+9JBKuu/fxO 9zq6iFkpnw+LAC
    gaHfiyEP3sXGNUJbrrAceRsa7xM1
    =eVzI
    -----END PGP MESSAGE-----

    Here's the public key it was encrypted with:

    -----BEGIN PGP PUBLIC KEY BLOCK-----
    Version: GnuPG v1.4.5 (GNU/Linux)

    mQGiBEaOfaYRBACmhQFOOvPFVMEPHFNGcETe2eh8iAsJOWgdux JXR1E4a2zB87tp
    +vU20lEBqcd8o7Mfx1z3ZPZC8pZu2N9J4+zSNqRpD/bKQ6iZ2q YFk+IcP7Zx+Qrd
    rGZKPKQByqvFG+nUWqDKw8vr5rASuG2/BxbjJHbayjpVX7J9CP q4VcR7xwCg38z0
    7CS0W2SlEBhRu+pVBZX54f0D/AonvOSzZGPJEyD9sfU7aXNowt jku5V9ybIJtHVI
    DCpsC1IhRfrmx2hHgxyx1egrKT0PlgjilUAcZN9ZhkJgKoZxpg BVH7LdxIN+/jUc
    capxx7zoOmV0NTy26yc0y3UQb2m6lSejUPyj8mUvMUBouj2Btd xKQOXl+qPwmMyo
    ncFIBACGt55hbuFHmf6/j0fCz/wjMWyHn0NebdvgC5HBVm9/a5 Lnr435OwpwJOID
    Mavig01JSVYOZp/4nTOG9p7FFePt7rAbtljaaCNBRLyEY5I08U mhDLau1xPHFDXM
    GLrR9rRehRyyeO6Dcj30KCKHlkDzIRWHYMbFiUEUMUq4xDofnr QfUm9iIFNoaW5u
    IDxyb2Iuc2hpbm5AZ21haWwuY29tPohgBBMRAgAgBQJGjn2mAh sDBgsJCAcDAgQV
    AggDBBYCAwECHgECF4AACgkQgoZHF4HZU+rTJgCeLwZd4bVTbh wIyUa7CnQpXSlj
    rc4AnRhZTQezQnKHioFhxE+nx44H7jfPuQINBEaOfawQCAD5yk fs8bCeQVhkBhrT
    4apDd6yHcKToUOFze4nFenAxzSphnvhOiZ31SJ6XkWmL37ITRV +7PdU+MNgpMSRA
    juKy4le407ME1NxaAoeVXtmAcbtb8qwQFgS6r4wA9sF+bgbeJ7 HKYKPTeH8dXw8D
    KjN+uB/HDpkJpCfMjgV
  • Re:its about time (Score:3, Insightful)

    by wperry1 ( 982543 ) on Friday July 06, 2007 @01:52PM (#19770307) Homepage Journal
    "theres no telling what backdoors Al Qaada has running in our country's networks."

    Sure there is... anyone can look at the source and see back doors, etc. It's more likely that there could be code in a MS project developed by foreigners in Canada http://slashdot.org/article.pl?sid=07/07/05/2134249 [slashdot.org] because no one would have access to review the source code.
  • by A non-mouse Coward ( 1103675 ) on Friday July 06, 2007 @02:00PM (#19770415)
    Yes, if you did something stupid and your source code was available to the world, it could take less labor to discover your stupidity than if your source was closed.

    OTOH, having source available for competent reviewers does increase the likelihood that your stupidity will get caught before it goes to market or, hopefully, shortly thereafter.

    But that's just it: having the source available to competent reviewers. It has NOTHING to do with whether the source is open to everyone or not.

    Open source != Better Security
    Closed source != Better Security

    This is as stupid as the ID vs Evolution argument. These are NOT mutually exclusive points. There are many open source projects that have sucky security because they don't have competent security analysis done by competent security analysts. Likewise, there are closed source products that have decent security because they invited competent security analysts to review the code. It's not whether your code is open/closed, it's all about who is reviewing your code.

    Do you need an example? Try the NSA. They have code whose source is closed to the world, but is reviewed by competent analysts.

    Nanny, nanny, boo-boo ... My OS is better than yours. Oh wait, that's also the same stupid argument. Market-share, value of the information assets, etc., all play a role. Ask me for my opinion and I'll tell you they all suck, regardless of whether they're open or not. Why? Because the fundamental building blocks we're still depending upon are not reliable, e.g. ARP, DNS, DMA (where your USB thumb drive's driver can overwrite kernel code in memory thanks to DMA), etc.

    --
    The unfortunate reality is that it's seldom the best technology that is adopted, just the technology that is in the right place at the right time.
  • by moderatorrater ( 1095745 ) on Friday July 06, 2007 @02:13PM (#19770599)
    or at least misleading. It's not saying that the software is more insecure, and it's not saying that open source software is insecure; it's saying that a phone whose software can be altered by a third party should be classified differently because of the hardware it's running on. In other words, because a cell phone messes with radio waves, if the software on the phone is designed so that it can be altered by a third party, it should be treated differently than one in which the manufacturer controls the software. This isn't security through obscurity in the sense of hoping for fewer bugs or security holes in the software; it's security by limiting the software that runs on the phone to whatever the hardware maker ships.
  • I hate to say it, but some evidence suggests that obfuscation works if there is enough of it. Cryptography is ultimately about adding cost and time to an enemy's retrieval of a message, to deter them from attempting to read it, or at least to render it less valuable by the time they do, and obfuscation can do that.

    I mean, to some extent, even Microsoft's unencrypted formats are somewhat secure. No one knows how to produce an authentic Word document to the last detail. I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partitions, and the SAMBA team has numerous problems trying to read Microsoft file and print sharing stuff. If you view all of these closed source efforts as a way to "encrypt data", at the very least, MS has successfully made a lot of their software tamper-resistant by the mere virtue of not publishing the source code.
  • by jguthrie ( 57467 ) on Friday July 06, 2007 @02:22PM (#19770723)
    Out of curiosity, how do you prove that the source code that was provided matches the binaries that were provided?
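    One partial answer is a reproducible build: compile the published source yourself under a pinned toolchain and compare a cryptographic digest against the shipped binary. A sketch, where the file paths are hypothetical and the comparison is only meaningful if the build is bit-for-bit deterministic (same compiler, same flags, no embedded timestamps):

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large binaries don't need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths, for illustration only:
# assert sha256_of("rebuild/radio.bin") == sha256_of("vendor/radio.bin")
```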
  • No, even then Security through Obscurity is harmful. The problem is that it is not easy to tell if your foundation is secure without considerable peer review. By adding the obscurity element you lose your peer review. Even though you may think your foundation is secure, you may have holes that you don't know about. Sure it will be difficult for outside people to find them too, but if they do you're in a lot of trouble.

    Worse, the more obscurity you have, the harder it is to get the good stuff configured properly in the first place. Most security breaches come not from fundamental weaknesses in any of the algorithms, but operator errors and surrounding design flaws (like how you handle your keys). The best crypto systems are the ones that are as simple to operate as possible, well documented, and provide lots of feedback and debugging information to the operator to make sure they are using it correctly.
  • Hmm ... (Score:1, Insightful)

    by Anonymous Coward on Friday July 06, 2007 @02:40PM (#19770973)
    And, by the same idea, closed source software with hidden backdoors that anyone can exploit is inherently more secure than open-source software that anyone can view the source of, and said closed source software should be used on all government machines.

    Despite the people who looked at the source telling everyone on IRC the secret root password, and giving people a few terabytes of sensitive government information in the form of a distributed torrent.
  • by Control Group ( 105494 ) * on Friday July 06, 2007 @02:44PM (#19771019) Homepage
    Your post is so wrong, it's tempting to think you must be joking. But in case you're not:

    It is acknowledged by the entire security industry - the FCC notwithstanding - that obscuring the method by which you secure something is not an effective way to increase the security of that thing. As an example: a well-designed ATM system doesn't depend on whether the attacker knows what's on the ATM card, how the reader works, how the system is programmed, or anything else about the mechanisms. It depends entirely on whether the attacker knows the PIN associated with the card.

    As another example, the most secure form of encryption possible - by which I mean it is literally impossible to break without the key - is the one-time-pad cipher. The mechanism for that is trivially simple: take the message you want to encrypt, and generate one random integer from 1 through 26 per character in the message. Then go through the message, adding each number in sequence to each character in sequence, wrapping around past Z (A + 3 = D, X + 3 = A, etc.). The resulting encrypted text is perfectly resistant to decryption without the key.

    The fact that I just told you how to generate and use a OTP cipher doesn't change the fact that it's perfectly unbreakable. The security is in the key, not the mechanism.
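    The scheme just described can be sketched in a few lines (using shifts 0-25 modulo 26 rather than 1-26, which is equivalent; the security rests entirely on the pad being truly random, as long as the message, and never reused):

```python
import secrets

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def otp_shift(message: str, pad: list[int], sign: int = 1) -> str:
    # Shift each letter by its pad value (mod 26); sign=-1 undoes the shift.
    return "".join(
        ALPHABET[(ALPHABET.index(c) + sign * k) % 26]
        for c, k in zip(message, pad)
    )

message = "ATTACKATDAWN"
pad = [secrets.randbelow(26) for _ in message]  # one random shift per letter
ciphertext = otp_shift(message, pad)
assert otp_shift(ciphertext, pad, sign=-1) == message
```

    Publishing this mechanism costs nothing: without the pad, every plaintext of the right length is equally plausible, which is the "security is in the key, not the mechanism" point.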
  • by droopycom ( 470921 ) on Friday July 06, 2007 @02:51PM (#19771131)
    "Cryoptography is ultimately about adding cost and time to an enemies retrieval of message"

    This is mostly correct, but cryptography is NOT security. Security is usually defined in terms of integrity, confidentiality, authentication etc...

    Your examples are flawed. Just because Samba does not interoperate perfectly does not mean hackers won't be able to hack your files away from a password-protected share.

    MS software is not tamper-resistant; you can tamper with it all you want. The purpose of tampering is not to make it work (one of Samba's goals) but to get it to do something that it is not supposed to do. Samba is all about having it work the way it's supposed to; tampering is the other way around...

    Same for the NTFS writer. The Linux NTFS driver can do a lot of tampering with your NTFS filesystem, including destroying it.

  • by tkrotchko ( 124118 ) * on Friday July 06, 2007 @03:07PM (#19771353) Homepage
    "No one knows how to produce an authentic Word document to the last detail. I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partition"

    Here's the thing... you're not talking about security, you're talking about interoperability.

    Is your Word document secure because Open Office can't perfectly reproduce it? Is NTFS secure because nobody has a perfect driver for it in Linux? Is SMB secure because Samba isn't 100% perfect?

    Obviously not. If the idea is to keep something both secure and readily accessible to the public, I can't say for sure it can't be done. But the empirical evidence suggests it's either impossible or so difficult that you can't do it cost-effectively, at least not for things that people really want. I mean, look at Blu-Ray and HD-DVD. They spent lots of money to secure the formats and apparently people can copy the disks at will in under 12 months. And that was not open source.
  • by Dunbal ( 464142 ) on Friday July 06, 2007 @03:32PM (#19771687)
    whereas possibly 1 in 50,000 has the reverse-engineering skills to understand (and possibly modify) binary-only systems.


          Irrelevant. It only takes ONE. Welcome to the information age.
  • by Ungrounded Lightning ( 62228 ) on Friday July 06, 2007 @03:55PM (#19772033) Journal
    I hate to say it, but, some evidence suggests that obfuscation works if there is enough of it.

    And it all depends on what is meant by "security".

    The FCC couldn't care less about how hard it is to recover the message or break the box. What they're concerned about is how hard it is to modify the box to operate outside their regulations.

    It's a lot easier to modify the function of a peripheral if you have information about it - including commented source for the controlling driver - than if you don't. Don't believe it? Look how long it took - and still takes - to write blob-free fully-functional Linux drivers for winmodems, graphic accelerators, WiFi chipsets, etc. Listen to the cries for documentation from the driver and kernel development projects.

    The FCC says "Thou shalt not publish the source code to the parts that control the radio." Since FOSS licenses REQUIRE the vendors to publish the source code, FOSS is thus effectively forbidden, since it would not be possible to abide by the software license and the FCC license simultaneously.

    As for vetting the code, the FCC reserves the right to demand the source of ANY software - proprietary or not - used in a type-approved software-defined radio. They say they probably will rarely want to look, and will probably honor the company's request for confidentiality unless they have some reason not to, but they do demand it be forked over whenever they ask. So arguments that they can't vet it because it's closed are moot.
  • by McDutchie ( 151611 ) on Friday July 06, 2007 @03:56PM (#19772051) Homepage

    I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partitions,

    I have been seeing it [ntfs-3g.org] for quite a while now. NTFS-3G, which works within the FUSE userspace file system framework, has an excellent reputation for reliability.

  • by ColdWetDog ( 752185 ) on Friday July 06, 2007 @04:05PM (#19772183) Homepage
    This may be hard to grasp, but its partially YOUR fault if you can't manage your government employees. (FYI, one of your management tools was the purpose of the 2nd amendment!)

    OK, we're supposed to ask the National Guard (our well-trained militia, as it were) to arrest various and sundry government employees? Neat idea; I'll just drive down to the local Armory and ask them.

  • Is your Word document secure because Open Office can't perfectly reproduce it? Is NTFS secure because nobody has a perfect driver for it in Linux? Is SMB secure because Samba isn't 100% perfect?

    It is, in the sense that the MS implementations could theoretically decide that all of those systems are a form of attack. If you look at it in an IP-centric way, one could make the argument that using an FOSS version of that data is a sort of theft, in that MS did all the hard work coming up with a spec and an implementation to get it to work, and the FOSS people are merely implementing to a spec, which is much easier than the creative process of creating a brand new file system, document format, or network protocol from scratch.

    Indeed, I've always wondered why, instead of trying to ape Microsoft's file and print protocol, one could not make a Linux-native file and print protocol and then offer an FOSS driver for Windows to use it. Windows has been multiprotocol-capable now for, what, at least a decade? Similarly, why couldn't one create their own file system driver for Windows, like EXT3?
