FCC Rules Open Source Code Is Less Secure

An anonymous reader writes "A new federal rule set to take effect Friday could mean that software radios built on 'open-source elements' may have trouble getting to market. Some US regulators have apparently come to the conclusion that, by nature, open source software is less secure than closed source. 'By effectively siding with what is known in cryptography circles as "security through obscurity," the controversial idea that keeping security methods secret makes them more impenetrable, the FCC has drawn an outcry from the software radio set and raised eyebrows among some security experts. "There is no reason why regulators should discourage open-source approaches that may in the end be more secure, cheaper, more interoperable, easier to standardize, and easier to certify," Bernard Eydt, chairman of the security committee for a global industry association called the SDR (software-defined radio) Forum, said in an e-mail interview this week.'"
This discussion has been archived. No new comments can be posted.

  • by canUbeleiveIT ( 787307 ) on Friday July 06, 2007 @01:01PM (#19769517)
    Just goes to show how much a bunch of gov't bureaucrats know. Or maybe they're just being ass-kissy with business again.
    • by eln ( 21727 ) * on Friday July 06, 2007 @01:22PM (#19769869)
      They believe what the people who give them the most money want them to believe. Welcome to government.
      • "They believe what the people who give them the most money want them to believe. Welcome to government."

        Yup, Money Talks.

        Unfortunately, Open Source projects by nature just don't have that kind of legislative money to throw around.

      • by Harmonious Botch ( 921977 ) * on Friday July 06, 2007 @01:43PM (#19770177) Homepage Journal
        They are more familiar with the idea of secrecy and control than ideas like cooperation and standards.
      • gov't can be great (Score:4, Interesting)

        by bussdriver ( 620565 ) on Friday July 06, 2007 @02:28PM (#19770807)
        Government is customer managed and you get what the majority deserves :-(

        To the person with only a hammer, everything looks like a nail...
        Not all government is bad and wasteful; it can and does outperform the private sector more often than Americans are led to believe.

        This may be hard to grasp, but it's partially YOUR fault if you can't manage your government employees. (FYI, one of your management tools was the purpose of the 2nd amendment!)

        As Ben Franklin essentially said, any government well administered is good government and all eventually fall (as a result of despotism; society is not a spectator regardless of what they may think.)
        • Re: (Score:3, Insightful)

          by ColdWetDog ( 752185 )
          This may be hard to grasp, but it's partially YOUR fault if you can't manage your government employees. (FYI, one of your management tools was the purpose of the 2nd amendment!)

          OK, we're supposed to ask the National Guard (our well-trained militia, as it were) to arrest various and sundry government employees? Neat idea; I'll just drive down to the local Armory and ask them.

    • by RingDev ( 879105 ) on Friday July 06, 2007 @01:41PM (#19770153) Homepage Journal
      Standard Neo-con practice: appoint like-minded, highly loyal individuals into key points of power to make decisions that benefit big companies and personal investments in ways that Congress cannot easily affect.

      Kevin J. Martin is the current head of the FCC, appointed by Bush in 2005. Prior to that, he was general counsel for Bush's first election campaign, then he took over the 'technical transition' when Bush/Cheney were moving into the White House. After they got settled he picked up a nice position as a White House assistant. The guy is nothing more than yet another Neo-con crony who shows his loyalty to big business and the party line over the interests of the people and gets promoted for it.

      On the bright side though, he is at least somewhat qualified for the job. He has a real degree from a real school, he worked at the FCC prior to being appointed to Chairman, and has focused much of his career in the tech/telecomm industries.

    • by tjstork ( 137384 ) <todd,bandrowsky&gmail,com> on Friday July 06, 2007 @02:14PM (#19770617) Homepage Journal
      I hate to say it, but some evidence suggests that obfuscation works if there is enough of it. Cryptography is ultimately about adding cost and time to an enemy's retrieval of a message to deter them from attempting to read it, or at least render it less valuable by the time they do, and obfuscation can do that.

      I mean, to some extent, even Microsoft's non-encrypted formats are somewhat secure. No one knows how to produce an authentic Word document to the last detail. I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partitions, and the Samba team has numerous problems trying to read Microsoft file and print sharing stuff. If you view all of these closed source efforts as a way to "encrypt data", in the very least, MS has successfully made a lot of their software tamper resistant by the mere virtue of not publishing the source code.
      • by gsking1 ( 1109797 ) on Friday July 06, 2007 @02:26PM (#19770789)
        I get your point.. BUT. There is a very good NTFS writer for Linux []
      • by Penguinisto ( 415985 ) on Friday July 06, 2007 @02:28PM (#19770805) Journal
        if we all write crufty kludges instead of clean, elegant code we'll all be perfectly safe and secure?

        Suddenly, I'm not so sure I'm gonna be able to get any sleep tonight for some odd reason...


      • by droopycom ( 470921 ) on Friday July 06, 2007 @02:51PM (#19771131)
        "Cryptography is ultimately about adding cost and time to an enemy's retrieval of a message"

        This is mostly correct, but cryptography is NOT security. Security is usually defined in terms of integrity, confidentiality, authentication etc...

        Your examples are flawed. It's not because Samba does not work well that hackers won't be able to hack your files away from a password-protected share.

        MS software is not tamper resistant; you can tamper with it all you want. The purpose of tampering is not to make it work (one of Samba's goals) but to get it to do something that it is not supposed to do. Samba is all about having it work the way it's supposed to; tampering is the other way...

        Same for the NTFS writer. The Linux NTFS writer can do a lot of tampering with your NTFS filesystem, including destroying it.

      • by phorm ( 591458 )
        MS has successfully made a lot of their software tamper resistant by the mere virtue of not publishing the source code

        Compatibility resistant too...
      • Re: (Score:3, Insightful)

        by tkrotchko ( 124118 ) *
        "No one knows how to produce an authentic Word document to the last detail. I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partition"

        Here's the thing... you're not talking about security, you're talking about interoperability.

        Is your Word document secure because Open Office can't perfectly reproduce it? Is NTFS secure because nobody has a perfect driver for it in Linux? Is SMB secure because Samba isn't 100% perfect?

        Obviously not. If the idea is to
        • Re: (Score:3, Insightful)

          by tjstork ( 137384 )
          Is your Word document secure because Open Office can't perfectly reproduce it? Is NTFS secure because nobody has a perfect driver for it in Linux? Is SMB secure because Samba isn't 100% perfect?

          It is, in the sense that, to all of those systems, the MS implementation could theoretically decide that they are a form of an attack. If you look at it from an IP-centric way, one could make the argument that using a FOSS version of that data is a sort of a theft in that, MS did all the hard work coming up with a
      • I hate to say it, but some evidence suggests that obfuscation works if there is enough of it.

        And it all depends on what is meant by "security".

        The FCC couldn't care less about how hard it is to recover the message or break the box. What they're concerned about is how hard it is to modify the box to operate outside their regulations.

        It's a lot easier to modify the function of a peripheral if you have information about it - including commented source for the controlling driver - than if you don't. Don't belie
      • by McDutchie ( 151611 ) on Friday July 06, 2007 @03:56PM (#19772051) Homepage

        I don't see an open source file system driver for Linux that lets you reliably write to NTFS formatted partitions,

        I have been seeing it [] for quite a while now. NTFS-3G, which works within the FUSE userspace file system framework, has an excellent reputation for reliability.

  • Amusing (Score:5, Insightful)

    by ebbomega ( 410207 ) on Friday July 06, 2007 @01:01PM (#19769519) Journal
    Because Security Through Obscurity totally worked for:

    MPAA (DeCSS)
    Nazis (Enigma)
    Xerox (Robin Hood & Friar Tuck)
    Microsoft (just about any form of security they've ever had)

    and about a billion other examples
    • by nurb432 ( 527695 )
      Yea, the MPAA and Microsoft are really hurting with their billions in the bank...

        And you really can't compare Enigma to current technology.
      • Re:Amusing (Score:5, Interesting)

        by Penguinisto ( 415985 ) on Friday July 06, 2007 @01:26PM (#19769935) Journal

        Yea, the MPAA and Microsoft are really hurting with their billions in the bank...

        ...meanwhile, their products are well-known for being about as secure as a fresh pot roast tossed on the floor of a wolf pit.

        Just because one can make a profit off of it doesn't make it any more secure.

        And you really can't compare Enigma to current technology.

        I beg to differ - it was:

        1. a hardware-encoded algorithm set, eventually broken by other algorithms (courtesy of a few hardy Polish expatriate mathematicians), and
        2. actively decoded by one of the very first electronic computers in existence (see also "Colossus" and "Bletchley Park")

        Cripes, man... if Enigma/Colossus wasn't relevant in concept, then what is!?


      • Re: (Score:3, Interesting)

        by wperry1 ( 982543 )
        All you are saying is that Security through Obscurity is more profitable, not that it is more secure.

        That is also why these guys have all the money in the world to throw at politicians and convince them that their way is better.

    • Re: (Score:3, Interesting)

      by AgentRavyn ( 142623 )
      To be fair, Enigma wasn't security through obscurity. It was a pretty strong mechanical encryption system that had serious user flaws. Every day, they had to brute force the day code using cribs that they had learned throughout the war.

      The Allies were only able to figure it out after they got a hold of one of the devices, analyzed it, and then rigged up a whole bunch of primitive Turing machines (Alan Turing was pretty essential to this whole process, by the way). Then, as mentioned above, they brute for
      • Re:Amusing (Score:4, Informative)

        by Lockejaw ( 955650 ) on Friday July 06, 2007 @01:27PM (#19769961)

        Had the radio operators been a little more careful, it would've been a lot harder to break Enigma.
        Yes, a lot of their communications were so formulaic that you could start the day with a known-plaintext attack, recover the key, and then use it to decrypt the rest of the day's communication.
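The known-plaintext routine described above (guess a formulaic crib, recover the day key, then decrypt the rest of the day's traffic) can be sketched with a toy cipher. Enigma was a rotor machine, not XOR; the repeating-key XOR, key, and messages below are invented stand-ins that only illustrate the mechanics:

```python
def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR stand-in for a real cipher: the same call
    # both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"\x13\x37\xab\xcd"  # the "day key" (invented)
ciphertext = xor_bytes(b"WEATHER REPORT: clear skies", secret_key)

# Formulaic traffic gives away a crib: we guess the message starts "WEAT".
crib = b"WEAT"
recovered = bytes(c ^ p for c, p in zip(ciphertext, crib))

assert recovered == secret_key  # ct ^ pt = key, so the key falls straight out
# With the key recovered, the rest of the day's traffic decrypts:
assert xor_bytes(ciphertext, recovered) == b"WEATHER REPORT: clear skies"
```

The weak link is exactly the one the comment describes: not the cipher's machinery, but the predictability of what was encrypted with it.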
      • From what I remember they were able to infer how to build the device based on extensive analysis of the encoded data that was captured. I don't believe they actually captured an Enigma device itself.
        • []
          Google for U-110.

        • Re:Amusing (Score:5, Informative)

          by TheRaven64 ( 641858 ) on Friday July 06, 2007 @02:05PM (#19770473) Journal

          I don't believe they actually captured an Enigma device itself.
          The Poles captured an Enigma machine and sent it to England when Poland fell, and GCHQ had a simpler version (same principle, fewer wheels) long before the war. One of the biggest factors in cracking the Enigma code was the fact that the German high command insisted that the settings for every wheel had to change every day. This dramatically reduced the search space. Once you'd cracked the code for one day, the number of possibilities for the next day were much smaller than if they had been completely random. I always remember this whenever I get a password rejected by a system because it must contain at least one uppercase letter and one number...
          • Re: (Score:3, Interesting)

            by adrianmonk ( 890071 )

            One of the biggest factors in cracking the Enigma code was the fact that the German high command insisted that the settings for every wheel had to change every day. This dramatically reduced the search space. [ ... ] I always remember this whenever I get a password rejected by a system because it must contain at least one uppercase letter and one number...

            I agree. I had a chuckle recently when we had a security training course at work, and they went through a lot of explaining of what the rules are for
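The search-space intuition in the comments above can be made concrete with inclusion-exclusion. Assuming 8-character passwords over the 62 alphanumeric characters (the alphabet and length here are illustrative, not from any real policy):

```python
# Keyspace for 8-character passwords over the 62 alphanumerics, with and
# without the rule "at least one uppercase letter and one digit".
total = 62 ** 8        # unconstrained keyspace
no_upper = 36 ** 8     # passwords using lowercase + digits only
no_digit = 52 ** 8     # passwords using letters only
no_both = 26 ** 8      # passwords using lowercase only
# Inclusion-exclusion: subtract each violating set, add back their overlap.
valid = total - no_upper - no_digit + no_both
shrinkage = 1 - valid / total  # roughly a quarter of the keyspace is forbidden
```

An attacker who knows the composition rule never has to try the excluded quarter of the space, which is the same flavor of help the daily-rotation policy gave the codebreakers.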

    • Re:Amusing (Score:5, Insightful)

      by Martin Blank ( 154261 ) on Friday July 06, 2007 @01:42PM (#19770155) Homepage Journal
      When the Germans kept Enigma a secret, they did nothing more or less common than anyone else was doing, or still does for the most part. National governments by and large do not leave their communications to AES, but instead use (what they at least perceive to be) more secure methods. NSA still keeps our codes secret, Russia's FSB keeps its codes secret, and the UK's GCHQ keeps its codes secret.

      One of the advantages to this is that the limited distribution of a given code can (but does not always) limit the number of attacks against it. Whereas a commercial cipher may result in millions or even billions of ciphertexts to analyze, a government cipher may result in only thousands to work with, and it may be more difficult to determine plaintext aspects of a given document for comparative analysis. It's also generally difficult to get the actual cryptographic hardware without paying someone (either from inside or outside) to steal one.

      This doesn't work well at all for the kinds of things that the FCC covers, however. I can generate billions of ciphertexts with known plaintexts for some new wireless system, and I can also do analysis against the electronics involved to look for side-channel attacks. Hiding things for commercial items intended for the general public is fairly pointless.

      Side note: I'd not heard of the Robin Hood & Friar Tuck trick. That was some very fun reading. Thanks for brightening my morning a bit. :)
    • Re "Robin Hood and Friar Tuck" - that was the first I'd heard of it, but I have a similar tale [], though in my case it could be more accurately described as "Robin Hood and the Sheriff of Nottingham" :-)

    • Enigma was publicly documented to a degree. It was based upon commercial devices from the 1920s, and this greatly facilitated those who attacked it. The extensions / revisions made to the basic design were kept secret; however, the weaknesses that led to its defeat were not these extensions or revisions but operator error. For example, operators would send the same test message each morning, a violation of their training and procedures, and this greatly aided in the discovery of the day's configuration of the mac
  • Around the world, people who were in the middle of saying "What the IRS doesn't know, can't hurt me !" suddenly stopped & asked, "Did you feel that, there's a disturbance in the force".
  • by Space cowboy ( 13680 ) * on Friday July 06, 2007 @01:03PM (#19769553) Journal

    If I'm trying to break into some code, and I can read the source code to determine how the author protected it, I'll have an easier job (note: "easier", not "easy") because I can home in on the algorithm the author used. I know whether it's Blowfish, DES, AES, IDEA, or a simple XOR or substitution cipher. I know what pre-encryption steps were taken, and what post-encryption algorithms were used.

    Let's say that in a moment of insanity, I decided to use a basic XOR encryption routine (create each byte in the encrypted stream by XOR-ing the corresponding source byte with every byte in the password save one, rotating that one as I iterate over the source). This is completely and utterly trivial to crack if you have the source code and *know* the routine I used. It's a repetitive cypher, so it's reasonably obvious unless the password is of significant length (a sizeable fraction of the source's length) as well. Note the difference - it's easier with the source code.
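A minimal sketch of the classic repeating-key variant (simpler than the rotating scheme described above, but weak for the same reasons; the key and plaintext are invented):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: the same call both encrypts and decrypts,
    # and the key repetition is exactly what makes it easy to crack.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ct = xor_cipher(b"attack at dawn", b"swordfish")
assert xor_cipher(ct, b"swordfish") == b"attack at dawn"  # round-trips
```

With the source in hand, an attacker knows the scheme is repeating-key XOR and only has to recover the key; without it, they must first guess the scheme - easier versus easy, as the post says.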

    Now that's a contrived example - no-one in their right minds would use an XOR cypher, but the same principle applies to harder encryption techniques. If you *know* what system was used to protect the source, you have an advantage over not knowing... Did they gzip the source before encrypting it ? Did they use ZIP, RAR, or 'compress' instead ? Did they XOR to hide the obvious compression header ? Is it inverted (last byte first) or was any other transformation done *before* the encryption stage to try and make it non-obvious that a successful crack had taken place ? These are all "knowns" if you have the source code...

    So, yes, it is easier when you have the source code. Security through obscurity is rightly derided, but not because it has no value. It is derided because it leads to the use of insecure encryption methods (small keys, using XOR/whatever instead of proper hard encryption, etc) and the fact that once the obscurity is cleared up, there's no more security. The idea is that if you are sufficiently confident that your encryption is unbreakable, you *can* document how you did it in public. That doesn't mean you *should*.

    The point though, and why I disagree with the regulators, is that if you're using hard encryption, it really doesn't matter whether it's *easier*, it's not *easy*. It is in fact still so damn hard, that we're talking "impossible in our lifetime(*)" - the relative comparison makes no sense. It's akin to measuring the height of Mount Everest at 6-month intervals - it's always pretty darn high, though you might find some variance due to snowfall.

    So, yes, they're right. But by not considering the (tiny) impact of their conclusion, they have made the wrong ruling.

    (*) Modulo the discovery of an easy way to crack the encryption technology, of course.


    • by kebes ( 861706 ) on Friday July 06, 2007 @01:18PM (#19769811) Journal
      You're quite right. Obscurity does provide some level of security, though relying on it alone is a surefire way to have your security cracked eventually. (Whereas things that are cryptographically secure will not be cracked in my lifetime.)

      Another way to look at it (especially in the context of open source radio) is that whoever is implementing the security has finite resources (money, man-hours, whatever) at their disposal. For every hour they spend trying to obfuscate the inner workings, that is one less hour spent validating that it is *truly* secure (in the "cryptographically secure" sense). If you instead leverage open-source, then you have code that has been tested and vetted by experts the world over. Suddenly the hours spent on adding obfuscation would be a waste of resources: the code is already so secure that adding the slight additional security of obscurity is a waste of time.

      So, while obscurity does provide some kind of security... it is actually the most resource-wasteful form of security (a lot of effort for something that eventually gets cracked), whereas the more efficient security model is to focus on things that are fundamentally secure (in which case you may as well use open-source solutions, since you get to take advantage of work already done, and the marginal loss of obscurity doesn't end up mattering).
      • Re: (Score:3, Insightful)

        Exactly. Hey, FCC: Decrypt this:

        -----BEGIN PGP MESSAGE-----
        Version: GnuPG v1.4.5 (GNU/Linux)

        hQIOA3zQFkc0jOpLEAgAkeu9YYOYA2YLePtUm3tGthW7fBO1RNBM/EBDJ3FkQdfZ
        avUq5gRrYhZ/vwo5MfMe950/SpZcgaUpN4pOoNQQFEyD8QYMjBmnvU0sH0iUAvza
        oZvcvq7cxiswhUPwSFZPVz8vyGW0WqP6aTcRxF/EA71Jo2IbMsaoSMKv2T1Jkr04
        OnGhFO5hEhNkAPEpoIucdkVKMn3U+Cmj846vj/I4CIaLu99mHwp150fuSgI1Jfua
        8Ax9ztv9Krx74khTlOIwW/5nLKz6IXqDRn8YIehA3YmWuddFGg7vcoMlMgmsficz
        /PJCe0acA5zvOuY1ISYnqB6aeAKe3caU+RY2MVDYxwgAv5+pdrZ1nyOaOzVFdVFD
    • no-one in their right minds would use an XOR cypher

      /me shifts uncomfortably

      C'mon, it was the early 90s, I was new at this programming thing, and my boss told me to do it...

      At least I changed the constant away from 0x7F.

    • by MobyDisk ( 75490 )
      Technically, you are right.

      The problem is, if you don't have the source, you'll never know that the XOR encryption is in there. So it will never be fixed. Knowing the security level for certain is just as important as the actual security implementation.
    • Look up Kerckhoffs' principle []. Security through obscurity is a widely debated subject going all the way back to the 19th century where cryptography is concerned, and sooner than that in locksmith circles, and it is more or less a consensus that it is not only ineffective but terribly dangerous, because "every secret creates a potential failure point".

      Read the wikipedia article, it is enlightening and very insightful.
      • And by sooner I mean earlier. God damned foreign language and its traps!
        • Re: (Score:3, Insightful)

          by Space cowboy ( 13680 ) *
          The thing about pretty much all the discussion over 'security through obscurity' is that it compares a 'secure-because-obscure' to a 'secure-without-being-obscure' mechanism. I'm not saying that the use of a secure-through-obscure mechanism is a good thing, and if you read my post, you'll see that.

          My point was that if I'm using a hard-encryption mechanism, then I can additionally do things that would render a "cracked" result difficult to determine. If you know what you're looking for (ie: the algorithm is
    • Ceteris paribus (Score:5, Insightful)

      by hey! ( 33014 ) on Friday July 06, 2007 @01:40PM (#19770129) Homepage Journal
      "Ceteris paribus" -- assuming "all things being equal", which they never are.

      True, if you have two equally boneheaded pieces of software, then exploits in the closed one are harder to divine -- not by much, but harder. On the other hand, if you have a piece of software that has survived years of public scrutiny by experts, that is presumptively harder to exploit than something some random engineer ginned up in secret.

      Something cannot be widely reviewed (which is the gold standard in security) and secret at the same time. So generally, I think open source represents the best by far and the worst by a little of security possibilities.

      The ultimate problem is that broad statements like X is more secure than Y are meaningless. You have to specify the context and threat you are concerned with. Is an open source interpreter burned into a ROM inside of microwave oven more vulnerable than a proprietary interpreter? Well, against what? Same goes for the software radio thing.

    • by Ravnen ( 823845 )
      There is also the issue of bugs, which can sometimes be used to bypass security mechanisms. Bugs can of course be found without source code, e.g. through testing, but it is far easier for an expert looking for weaknesses to find bugs when the source code is available than when it isn't.
    • Yes, if you did something stupid and your source code was available to the world, it could take less labor to discover your stupidity than if your source was closed.

      OTOH, having source available for competent reviewers does increase the likelihood that your stupidity will get caught before it goes to market or, hopefully, shortly thereafter.

      But that's just it: having the source available to competent reviewers. It has NOTHING to do with whether the source is open to everyone or not.

      Open source !=
  • From TFA:

    The SDR Forum has cited the Secure Socket Layer (SSL), a widely used technique for securing e-commerce transactions, and the National Institute of Standards and Technology (NIST)'s public hash algorithms as evidence that open processes often yield the most highly successful security techniques.

    Very typical. First, they say that the stuff is not as secure as the "security by obscurity" method, then they go and say the most widely accepted and used method for secure web transactions is evidence th

    • Why do people talk about "The Government" like it is a single person? It is many people who do not get along and sometimes actively fight each other. Some of them are clueless, and some are motivated by malice. No surprises here. Amusement, perhaps...
    • Re: (Score:3, Insightful)

      by BitchKapoor ( 732880 )

      From TFA: The SDR Forum has cited the Secure Socket Layer (SSL), a widely used technique for securing e-commerce transactions, and the National Institute of Standards and Technology (NIST)'s public hash algorithms as evidence that open processes often yield the most highly successful security techniques.

      Very typical. First, they say that the stuff is not as secure as the "security by obscurity" method, then they go and say the most widely accepted and used method for secure web transactions is evide

    • by gEvil (beta) ( 945888 ) on Friday July 06, 2007 @01:24PM (#19769913)
      It's not the same group making these statements. The FCC is the one who has said that "security through obscurity" works, while the SDR Forum (an industry group) cited SSL as a counterexample.
    • Re: (Score:3, Informative)

      by eln ( 21727 ) *
      The SDR Forum is not affiliated with the FCC or the federal government, and in fact is opposed to this new FCC rule. The SDR Forum brought up those two methods as a counterpoint to the FCC's rationalization for this rule. I don't see any doublespeak there.
      • NIST is a government agency. And it wouldn't surprise me if the FCC uses SSL on some of their web servers, internally or externally. And how many government agencies use Kerberos?

  • Sure there is, and it's called payoffs.
  • by Anonymous Coward on Friday July 06, 2007 @01:07PM (#19769625)
    ... since its very inception back in 1934 (and its predecessor, the Federal Radio Commission, from 1927 until 1934) has always been under the corrupted financial influence of the big broadcasters, despite the faux-adversarial image they try to paint on their relationships.
  • by bkuhn ( 41121 ) on Friday July 06, 2007 @01:08PM (#19769637) Homepage
    Over at the Software Freedom Law Center [], we've published a white paper regarding the new rules []. That might be of interest to some.
  • by gillbates ( 106458 ) on Friday July 06, 2007 @01:09PM (#19769657) Homepage Journal

    How can you prove something is secure if you can't see the source code?

    You can't.

    The FCC's position is that it is better to hide one's head in the sand and hope the vendor implemented a secure solution than to actually *prove* the solution is secure.

    The FCC has always worried that the technology's flexible nature could allow hackers to gain access to inappropriate parts of the spectrum, such as that used for public safety. So the regulators required manufacturers to submit confidential descriptions showing that their products are safe from outside modifications that would run afoul of the government's rules. Cisco's petition asked the regulators to clarify how use of open-source security software, whose code is by definition public, fit into that confidentiality mandate.

    The problem is that, as any ham operator knows, access to any part of the spectrum is as simple as building your own homebrew equipment. Hackers, by their very nature, already know how to access the radio spectrum; it is the weak or non-existent encryption which represents the real threat. Keeping your code closed allows security vulnerabilities to exist for much longer than they would if they could be scrutinized by the public at large.

    Furthermore, any software defined radio, open source or not, can be made "open source" by simply replacing the binary in flash. Which means that any software defined radio, open source or not, can be hacked. Which might be a bigger issue worth more discussion.

    • How can you prove something is secure if you can't see the source code?

      Actually, you can verify that a piece of compiled code is secure if the vendor provides type annotations with it in the style of proof-carrying code. This is similar to how the JVM can verify that Java bytecode won't do things it's not supposed to, except now we need a richer specification of what we consider to be secure.

  • by pavon ( 30274 ) on Friday July 06, 2007 @01:12PM (#19769719)
    I am somewhat perplexed as to why the FCC would need to be regulating the security of consumer devices. For organizations that need secure communications, there are already many government and private certifications that ensure this. But why on earth would they restrict consumers from purchasing non-secure software radios if they don't need them?

    Is this because they feel that software radios could be hacked to broadcast outside of their certified frequency and power limits? Or because they think they need to protect the public from buying 802.11 routers with crappy WAP implementations?
    • by db32 ( 862117 ) on Friday July 06, 2007 @02:03PM (#19770451) Journal
      It is exactly as you said. They don't want the populace spewing things into the RF spectrum that they can't manage. So one or two pirate radio stations spring up and are easily hunted down by the FCC. Now, with easy to "hack" software radios everyone could start broadcasting any information they want, in any format, on any frequency, at any power, etc...and there would be no way for the FCC to even begin to track that kind of rampant violation down.

      If one guy is in the street protesting, it is easy to control and quell. If it's 10,000 guys in the street protesting, it gets a little harder; if it's 10,000,000 guys, it's basically impossible.
  • by LM741N ( 258038 ) on Friday July 06, 2007 @01:13PM (#19769747)
    These are the same FCC bozos who are promoting Broadband Over Power Line, or BPL, despite all the independent technical experts who confirm that the systems are just giant antennas radiating hash, noise, etc., and interfering with Public Service Radio. Along those lines, the American Radio Relay League (ARRL) is suing the FCC over its certification methods for such systems; see for the details

  • approaches that may in the end be more secure, cheaper, more interoperable, easier to standardize, and easier to certify...

    In my experience these statements are true...
    - secure: sometimes; more likely with more popular projects, less likely with smaller projects
    - cheaper: sometimes; adding in cost of people to noodle with code or interfaces can raise costs quickly (however cost may be minimal if we're talking about cloning a few thousand embedded cuts, etc.)
    - interoperable: definitely, becaus

    • - easier to certify: definitely not, because the code frequently shifts (e.g., OpenSSL's experiences with FIPS validation)
      In comparison with what? Incremental releases happen in both open- and closed-source software. Sure, the open-source project has nightly builds which won't all get certified, but chances are the closed-source one does too. The difference is that only the open-source one lets people see its nightly build.
  • not about security (Score:2, Insightful)

    by mevets ( 322601 )
    The security bit is just a cover story. This is about some perceived danger to the RIAA, MPIAA and similar cartels.
  • ... for black hats :(
  • Looks like someone needs to drop the FCC a note to inform them that an Open Source operating system has somehow managed to achieve LSPP/EAL4+ Common Criteria security certification.
  • by romiz ( 757548 ) on Friday July 06, 2007 @01:19PM (#19769823)
    The problem the FCC (and every other emission regulation body) has with open source and software radio is that it would be trivial to modify a device built this way to emit at an arbitrarily high power level on a restricted wavelength, or to use a band without the proper medium access control. If this happened, the wavelength would be pretty much unusable for all other users until the FCC tracked down the emitter and shut him down.

    That's why today most radio-enabled devices, and especially mobile phones, have to pass type conformance to be commercialized in a geographic area. In the current state of things, if the radio software can be changed by the user, type conformance cannot be awarded. Software radio makes things worse, because it is harder to show that a component cannot emit at a given frequency if changing the software in that component would allow switching emission frequencies at will.
    • by QuoteMstr ( 55051 ) <> on Friday July 06, 2007 @01:36PM (#19770061)
      That's what code burned into ROM is for -- or hell, EPROM or even EEPROM would be fine, so long as it can't be erased through normal operation of the device.

      If the FCC is that concerned about software radio operating out of spec (which I personally believe isn't really going to be a problem), then it should mandate hardware access controls on all radios.

      Ultimately, ANY solution that relies on locking down client devices is doomed to failure. People can, and do, tinker with their own devices.
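      A minimal sketch of what such a hardware-level gate might look like -- purely illustrative band and power numbers, and a made-up validate_tx_request() function, not any real device firmware or actual FCC limit:

```python
# Illustrative only: pretend these constants are burned into ROM/EEPROM,
# outside the reach of any user-modifiable software stack above them.
ISM_BAND_HZ = (2_400_000_000, 2_483_500_000)  # 2.4 GHz ISM band (example)
MAX_POWER_DBM = 20                            # example power cap

def validate_tx_request(freq_hz, power_dbm):
    """Refuse any transmit request outside the certified envelope."""
    lo, hi = ISM_BAND_HZ
    if not lo <= freq_hz <= hi:
        raise ValueError("frequency outside certified band")
    if power_dbm > MAX_POWER_DBM:
        raise ValueError("power above certified limit")
    return freq_hz, power_dbm

# An in-band request passes; an out-of-band one is refused no matter
# what the open, modifiable software above it asked for.
validate_tx_request(2_412_000_000, 15)        # accepted
try:
    validate_tx_request(900_000_000, 15)      # outside the band
except ValueError as err:
    print("rejected:", err)
```

      The catch, as the parent comment says, is that a gate like this only means anything if it lives somewhere the end user genuinely cannot rewrite; implemented in ordinary reflashable firmware it is no stronger than simply not publishing the source.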
    • Re: (Score:3, Interesting)

      by everphilski ( 877346 )
      Most SDRs I've seen (all in the amateur radio world...) are run off of crystals or chips generating a waveform. The base frequency is NOT generated by software... so frequency is a hardware issue, not a software one.

      Where software comes into play is processing the incoming signal, and generating an outgoing signal. And the software is damn good at that :)
    • by interiot ( 50685 )

      Exactly. The headline is misleading... the FCC isn't concerned about crackers being able to take control of other people's machines, they're concerned about normal people being able to fully modify their own equipment.

      It's just a single issue with the frequency restrictions. If software could be open source, and end users were able to configure everything but that one little thing, it wouldn't be as big of a problem. But it's an inherent part of open source that anything can be modified. OSS prevents...

  • by Anonymous Coward on Friday July 06, 2007 @01:19PM (#19769825)
    The FCC has absolutely no power to regulate nor any say at all in how software radio or television are implemented.

    The FCC commissioners are deluding themselves, again, if they think Congress gave them the power to appoint monopolies.

    They have already been slapped down once with regards to the DTV Redistribution Control flag and they're about to be slapped down again.

    What's next, washing machines and clock radios? s/200505/04-1037b.pdf []

    If the Foolish Child Commission can't remember the limits of their power, We the People will be more than happy to remind them, spank them and send them to their 'time-out' corner once again.
  • MoCSSRH (Score:2, Insightful)

    by gr3kgr33n ( 824960 )
    Well, if they [FCC] are going to take this stance, it is our duty to enlighten them as to the consequences of their actions.

    I would like to see a Month of Closed-Source Software Radio Hacks.

    Then they [FCC] will discover that since closed-source software radios are not examined by independent, unbiased debuggers, the likelihood of bugs, bad encryption schemes, et al. is very high.
    Maybe then the government bureaucrats will see the merits of Open Source.
  • by russotto ( 537200 ) on Friday July 06, 2007 @01:21PM (#19769857) Journal
    ...at least not security as it's usually defined. It's about prevention of modification by the end user or a third party not authorized by the manufacturer.

    While the rules require these "security" measures to prevent modification to software-defined radios, as far as I can tell (based on several 802.11 devices I've messed with) the only actual "security" measure taken has been to not publish the source. There's not really anything preventing modification of the firmware to operate outside the ISM band or at unpermitted power levels. So I'm not sure exactly what measures the FCC is really requiring, other than that manufacturers not publish their datasheets.
  • by newandyh-r ( 724533 ) on Friday July 06, 2007 @01:24PM (#19769905)
    If the end-user can modify the source with reasonable ease:

    They can easily bypass any "broadcast flag";
    They can remove restrictions on which channels a scanner can scan;
    They may be able to transmit on forbidden channels or at power levels above those permitted for a channel.

    That is the sort of hacking that frightens the FCC.

    • by Dunbal ( 464142 ) on Friday July 06, 2007 @01:44PM (#19770191)
      That is the sort of hacking that frightens the FCC

            And with their infallible logic they conclude that closed source means you cannot remove restrictions, transmit on forbidden channels/power levels, or bypass broadcast flags. Because no closed-source program has ever been bypassed, modified, or otherwise hacked -- days or even hours after its release.

            When will these people learn that the PEOPLE have the power, not the government? We the masses obey ONLY when it suits us. If they have to go to such great lengths to try to limit us, perhaps what they are trying to do is not such a good idea after all? They just don't get it.
  • Sir, you will no doubt be shocked to learn that this comes with neither a silver platter nor chilled champagne. I know when this realization dawned on me, my monocle popped out and rolled under my desk. My gentleman's gentleman, Wheatley, has noted his displeasure with your oversight while remedying the situation.
  • I'd have to give them a big "Yes and No." The breakpoint is whether or not there's an active community of people looking over the source and testing it. If there is, they're more likely to find vulnerabilities before the attackers do. If not, and the only people reading the source are the attackers, there could be a problem. All of this suggests to me that the Open Source community should consolidate, have fewer projects, and subject each other's projects to more rigorous review.
  • Nonsense (Score:5, Insightful)

    by Anik315 ( 585913 ) <anik@alphaco r . n et> on Friday July 06, 2007 @01:28PM (#19769973)
    There's nothing inherently secure about closed source software or anything inherently secure about open source software. In fact, closed source software that is not secure when the source code is visible is not really secure at all.
  • Thanks (Score:3, Funny)

    by Applekid ( 993327 ) on Friday July 06, 2007 @01:30PM (#19769987)
    It's just that the boys at the FCC are go getters! Who cares if they aren't software security people, it's the FCC! They see a problem and are totally pro-active to take it on. Morality cops on TV and radio? That definitely falls within assigning and licensing portions of the EM spectrum for private industry. They're just going above and beyond.

    All hail the FCC!

    (can I puke now?)
  • This sounds a lot like Microsoft "declaring" they are not bound by the GPLv3. They can make whatever "declarations" they want -- it doesn't mean they are necessarily true. Sadly, IT management and most software radio users will read that as fact, not opinion.
  • by AHumbleOpinion ( 546848 ) on Friday July 06, 2007 @01:47PM (#19770229) Homepage
    I am not agreeing with the FCC on this one, but I am going to defend "security through obscurity" a little, given the expected /. audience oversimplification and knee-jerking. At times "security through obscurity" is a perfectly valid and desirable approach when used *alongside* other good techniques. It is only bad when it is the foundation of your security. Note that I am only addressing the security angle, not open source philosophy (or, for some out there, religion).
    • by mark-t ( 151149 )

      [Security through obscurity] is only bad when it is the foundation of your security.
      It invariably is though. That's the problem with it.
    • No, even then Security through Obscurity is harmful. The problem is that it is not easy to tell if your foundation is secure without considerable peer review. By adding the obscurity element you lose your peer review. Even though you may think your foundation is secure, you may have holes that you don't know about. Sure it will be difficult for outside people to find them too, but if they do you're in a lot of trouble.

      Worse, the more obscurity you have, the harder it is to get the good stuff configured properly in the first place. Most security breaches come not from fundamental weaknesses in the algorithms but from operator errors and surrounding design flaws (like how you handle your keys). The best crypto systems are the ones that are as simple as possible to operate, well documented, and provide lots of feedback and debugging information to the operator to make sure they are using the system correctly.
  • by tom_evil ( 1121495 ) on Friday July 06, 2007 @01:49PM (#19770265)
    Quoting Bruce Schneier:

    "If an algorithm is only secure if it remains secret, then it will only be secure until someone reverse-engineers and publishes the algorithms. A variety of secret digital cellular telephone algorithms have been "outed" and promptly broken, illustrating the futility of that argument."

    from Crypto-Gram: September 15, 1999 []

    But what could we expect from an FCC headed by a lawyer, a businessman, a professional Senate staffer, a DRM supporter who received coaching from Clear Channel to oppose a satellite radio merger, and a professional telecom corporate lobbyist.

  • This is exactly how the FCC should be expected to rule, if it is so arrogant as to rule on so broad a notion at all. Not because there is any real, relevant security concern on the part of the FCC. There is, however, a "security concern" that software radio in particular can make it hard for government and its industrial bedfellows to protect their profits and control. With a proliferation of software radio, especially at the hands of the prolific open source folks, things like cell phone lock-in, relative scarcity...
  • by moderatorrater ( 1095745 ) on Friday July 06, 2007 @02:13PM (#19770599)
    or at least misleading. It's not saying that the software is more insecure, and it's not saying that open source software is insecure; it's saying that a phone with software that can be altered by a third party should be classified differently because of the hardware it's running on. In other words, because a cell phone messes with radio waves, if the software on the phone is designed so that it can be altered by a third party, it should be treated differently than one in which the manufacturer controls the software. This isn't security through obscurity in the sense of hoping for fewer bugs or security holes in the software; it's security by limiting the software that runs on the phone to what the hardware maker provides.
  • by PatSand ( 642139 ) on Friday July 06, 2007 @02:22PM (#19770729) Journal

    Interesting that they apparently didn't consult folks at NSA. Their operating hypotheses for any US cryptosystem are:

    1. The equipment is known and available for disassembly and testing

    2. The algorithm is known or discernible from the equipment and related manuals

    3. You have lots of output data from the device (and the underlying plaintext is probably known)

    4. You don't have the key...that's what you need

    While I will grant that most folks never see any of this (most equipment, algorithm details, and key parts of repair/use manuals are classified), they assume the worst case and still make it secure. In other words, it's the equivalent of having open source code, where an attacker could try to figure out the key from the code and clean output -- and the system holds up anyway.
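    A toy illustration of those assumptions, using only Python's standard library (HMAC-SHA256 is chosen purely for the demo, not because any agency uses this exact setup): the algorithm is completely public, the message and its authentication tag are handed over, and everything rests on the one withheld item -- the key.

```python
import hmac
import hashlib

key = b"the-only-secret"       # assumption 4: this is all the attacker lacks
message = b"attack at dawn"    # assumption 3: output/data is freely available

# The algorithm itself (assumptions 1 and 2) is public and standardized.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# An attacker who knows the algorithm perfectly but guesses the key
# produces a tag that fails verification.
forged = hmac.new(b"wrong-guess", message, hashlib.sha256).hexdigest()

print(tag != forged)  # True: knowing the algorithm alone is not enough
```

    The security here comes entirely from the secrecy of the key, not the secrecy of the design -- which is exactly the principle the parent says the FCC's reasoning gets backwards.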

    While "Security through Restricted Access" is a very good practice, the argument is STUPID at best, and downright biased towards closed, proprietary software vendors. Frankly, these people couldn't encrypt their way out of a wet paper bag with a pen, ruler, and other sharp things like their pointy little heads.

    If they think it is "less secure," we can lock them up somewhere with whatever they want for cracking an open-source cryptosystem used as the jail lock, and see how soon they get out. I hope they include a lifetime supply of food, water, toiletries, medicines, etc. I think a simple 1024-bit elliptic curve cryptosystem will keep them safely behind bars for several decades, if not their whole lives.

    Where do they find these bozos to fill these positions? I'd like to know so we can close that source of universal stupidity off and make the world a better place...

    I guess these folks will never qualify for one of my D.O. letters... they're either just too stupid or have such low IQs that they need to be institutionalized immediately.

  • by m6ack ( 922653 ) on Friday July 06, 2007 @03:18PM (#19771499)
    The FCC is not talking about security in the way that most of the people in this thread are. They are talking about REGULATORY security. For instance, they want to make sure that a radio cannot produce more than so many dBm of spectral emission outside of its band when it is operating in its intended band. They want to make sure that your Linksys doesn't output more than so many dBm, so that it doesn't blast out the neighbor's network. That is what they are talking about -- and they see these as the real hurdles in qualifying software-defined radios. They would rather have regulatory control at the developer's level than have to resort to investigation and bringing individuals to court.

    The issue is that this ruling benefits Cisco, which wants to defeat the likes of Linksys, Netgear, and others that are beginning to deliver "decent" solutions with cheap radios and the help of hobbyists leveraging open source software. If you are required to keep some of the software closed, you cannot leverage the benefits of the open source model on the bit you have closed. You also end up spending more time organizationally to support the effort, because you have to maintain two sets of documents -- one for the closed section and another for the open section. You have to support binary compatibility, or some mechanism for the open source to integrate with the closed-source firmware... it just becomes that much more of a burden for Cisco's competitors to develop and maintain their solutions.

    So, please, don't flood the FCC with emails telling them that "Open source /is/ secure" -- from the standpoint of regulation, it's not! Flood them instead with messages that say, "This ruling is entirely prejudicial against many companies leveraging Open Source software for their solutions."

  • I miss the "old" FCC (Score:4, Interesting)

    by ( 213219 ) on Friday July 06, 2007 @04:54PM (#19772929) Journal
    A few years ago the FCC was overhauled in an effort to speed the processes of approval and allocation. At that time the most common complaint was that it took years to obtain approval for new technology. The truth is that the old FCC did seem to drag its feet: it was rather difficult to get approval for new technology, and as for getting a piece of the radio spectrum reallocated, you may as well have forgotten about it. People and industry did have a lot to complain about. But when the FCC did make a decision, it was (almost) always the right one; it had been well researched, and lobbyists and lawyers had little influence -- even the politicians really had very little say.

    When the system was overhauled, it was done with the best of intentions. They allowed industry access in ways that they never had before and the FCC had to start to rely on information presented by the very industry that they were intended to police! Today, we could almost describe the industry relationship with the FCC as symbiotic.

    The FCC has as its primary charge the responsibility of making the public airwaves work for the public. It protects these airwaves by allocating frequencies, by approving new uses, and by certifying equipment that may use or interfere with the public airwaves.

    With technology changing so fast, the airwaves so crowded, and all sorts of new ideas (good and bad), the FCC has lots to do. Congress told them to work faster and be more responsive to industry. Industry does not want OSS; they view it as competition. They would rather develop copyrighted and even patented software to do this stuff so that they can earn a healthy return on investment. The FCC is simply echoing this, as they have been instructed by Congress to do (they see it as working with industry).

    OSS is sort of socialist when you think about it from the closed-source standpoint. It is a threat simply because it is free. You would think public airwaves would be a place where free software would be at home -- and it should be, but it isn't, because the FCC is no longer really allowed to make the best decisions for the public. They must now answer to the very people they are supposed to police. That is simply wrong; they should answer to the public and to the requirements of international treaties.

"Even if you're on the right track, you'll get run over if you just sit there." -- Will Rogers