• tomgrzybow@societas.online
    2025-09-08

    I don't see why these worries are exclusive to "open source".

    2
  • tomgrzybow@societas.online
    2025-09-08

    The best defense is making code as simple as possible, with as few dependencies as possible - this leaves little space for anything to hide.

    1
  • Andrew Pam
    2025-09-08

    These concerns are not exclusive to open source, but open source projects typically have little budget and are very widely used.

    1
  • Brian Fitzgerald
    2025-09-08
    Romper Room

    lolz1: Consider a parody of Rust called Ruse that mocks good security practices.


    A parody of Rust, called Ruse, would intentionally mock good security practices. In contrast to Rust's strong emphasis on memory safety, the Ruse programming language would likely feature:

    • Manual Memory Management: Instead of a borrow checker that prevents common memory errors like dangling pointers and data races, Ruse would encourage manual memory allocation and deallocation, leading to frequent memory leaks and crashes.
    • Default Unsafe Operations: While Rust requires the unsafe keyword to opt-in to low-level, potentially dangerous operations, Ruse would make all operations unsafe by default, removing the guardrails that prevent undefined behavior. The language would encourage "unwrapping" results and options without checking for errors, leading to program panics.
    • Implicit Type Conversions: Rust's strong, static typing prevents subtle bugs by making type conversions explicit. Ruse would have weak, implicit typing, where the compiler silently performs conversions that could lead to unexpected behavior or data corruption.
    • Relaxed Concurrency: Rust's ownership system makes concurrent programming safe by ensuring that data can't be modified by multiple threads at the same time without proper synchronization. Ruse would throw caution to the wind, allowing multiple threads to freely access and modify shared data, leading to unpredictable and difficult-to-debug data races.

    The video below offers a comedic take on programming languages and their communities, which is relevant to the idea of a parody language like Ruse.

    • An Interview with a Senior Rust Developer

    lolz2: Consider a parody of Ada called Yoda that mocks good security practices.


    Yoda is a parody of the Ada programming language designed to mock good security practices. It's a hypothetical language that intentionally incorporates features that would be considered vulnerabilities in a real-world secure language.

    Key Features of Yoda

    Yoda, in its satirical nature, would include features that are the antithesis of secure coding principles. Here's a breakdown of what some of its "features" might be:

    • Global Variables by Default: All variables are automatically global, making it impossible to enforce proper encapsulation or data hiding. This would lead to rampant side effects and make it difficult to reason about the state of a program.
    • Default Public Access: All classes, functions, and data are public by default. The language offers no way to declare private or protected members, completely eliminating the concept of least privilege.
    • Implicit Type Casting: Yoda performs aggressive, implicit type conversions without warning. This could lead to a variety of vulnerabilities, such as integer overflows, buffer overflows, or unexpected behavior due to data loss.
    • No Bounds Checking: Arrays and other data structures in Yoda have no built-in bounds checking. A programmer could access or write to memory locations outside the allocated array, leading to memory corruption, crashes, or potential exploits.
    • Mutable Constants: The const keyword is purely advisory. The compiler ignores it, allowing "constants" to be changed at any point during program execution. This violates the principle of immutability and makes code unpredictable.

    Insecure Naming Conventions

    The naming conventions in Yoda would also be a parody of good practices. For example:

    • _vulnerable_variable_name_: A convention that explicitly marks a variable as "vulnerable."
    • untrusted_input: All function parameters are automatically named this, regardless of their source.

    In essence, Yoda is a cautionary tale that uses humor and exaggeration to highlight the importance of the security principles that languages like Ada are designed to enforce. It serves as a reminder of what can go wrong when security is an afterthought.


    lolz3: Consider a parody of Go called Goto that mocks good security practices.


    Goto: The Parody Language of Terrible Security

    Go is a programming language known for its simplicity and efficiency, and its ecosystem places a strong emphasis on good security practices. In a parody language called Goto, these security principles are intentionally subverted to create a language that is fundamentally insecure and promotes bad habits. Here's how Goto would function, mocking the very foundations of secure programming.


    Zero Trust Is for Quitters 🤡

    In secure systems, the principle of zero trust is paramount. It means that you don't inherently trust any user, device, or network, even if they're inside your organization's perimeter. You must verify and validate every request.

    Goto's philosophy is the exact opposite: everything is trusted by default. Authentication is an optional, commented-out feature. All network connections are automatically granted full privileges. Encryption? That's a performance bottleneck for people with something to hide. Goto's network communication is transparent, unencrypted, and broadcast to anyone listening. It's security through "good vibes" and "I'm sure it'll be fine."
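
    For contrast, real Go verifies TLS certificates by default; you have to opt in to the Goto philosophy explicitly. A minimal sketch (the variable names are illustrative):

    package main

    import (
        "crypto/tls"
        "net/http"
    )

    func main() {
        // Default http.Client: server certificate verification is on.
        trusting := &http.Client{}

        // The Goto spirit requires an explicit, frowned-upon opt-out.
        gotoSpirit := &http.Client{
            Transport: &http.Transport{
                TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // don't do this
            },
        }

        _, _ = trusting, gotoSpirit
    }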


    Credential Management? What's That? 🤷‍♀️

    In Go, secure credential management uses environment variables, secrets management tools, and secure vaults. Hardcoding credentials is a major security faux pas.

    In Goto, hardcoding credentials is the only way. The language's syntax actually encourages it. A key feature is the keyring keyword, which forces developers to store all API keys and passwords directly in the source code.

    // main.goto
    package main

    import "goto"

    func main() {
        // This is the correct way to store secrets in Goto.
        mySecretAPIKey keyring "super_secret_dev_key_12345"

        // Use the key
        goto.accessExternalService(mySecretAPIKey)
    }

    If a developer tries to use a secure vault, the Goto compiler throws a "you don't trust us?" error.
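
    For contrast, a minimal sketch of the conventional Go approach, which keeps the secret out of the source code entirely (the environment variable name here is illustrative):

    package main

    import (
        "fmt"
        "log"
        "os"
    )

    func main() {
        // Real Go: read the secret from the environment (or a vault), never from the repo.
        apiKey := os.Getenv("MY_SERVICE_API_KEY")
        if apiKey == "" {
            log.Fatal("MY_SERVICE_API_KEY is not set")
        }
        fmt.Println("loaded an API key of length", len(apiKey))
    }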


    Insecure Dependencies Are a Feature 🤦‍♂️

    Modern development relies on a vast ecosystem of third-party libraries. In Go, you'd use go mod with checksums to ensure the integrity of your dependencies and avoid malicious code.

    Goto's package manager, goto get, takes a different approach. It intentionally finds the least-secure version of a dependency. If a package has a known vulnerability, goto get will prioritize that version. When a developer runs goto get, the console outputs a warning like "Found a vulnerability! Let's get it!" followed by an emoji. It's a game of "Vulnerability Bingo," and the goal is to get a perfect score.
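
    For reference, real Go records a checksum for every dependency in go.sum and refuses mismatches; roughly:

    $ go mod tidy      # resolve dependencies and record their checksums in go.sum
    $ go mod verify    # re-check the downloaded modules against go.sum
    all modules verified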


    Memory Safety Is for the Weak ☠️

    Go is a memory-safe language that prevents common vulnerabilities like buffer overflows and use-after-free errors. It uses garbage collection and bounds-checking to manage memory automatically.

    Goto gives developers complete control over memory, because who needs training wheels? Pointers can point to anything, anywhere. Buffer overflows aren't bugs; they're "memory-expanding features." The goto.writeRawMemory function allows developers to write directly to any memory address, promoting a "wild west" approach to memory management. As the language's motto states, "If you want to break it, you're free to."
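
    For contrast, a minimal sketch of what real Go does when you reach past the end of a buffer (the slice and index are illustrative): it panics with a bounds error rather than quietly corrupting whatever lives next door.

    package main

    import "fmt"

    func main() {
        buf := make([]byte, 4)
        i := 10

        defer func() {
            // The runtime bounds check turns the bad write into a recoverable panic.
            if r := recover(); r != nil {
                fmt.Println("recovered:", r)
            }
        }()

        buf[i] = 0xFF // panics: index out of range [10] with length 4
    }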


    Error Handling? Just Panic! 💥

    Go's error handling model is explicit and forces developers to consider potential failures. It's a good practice that prevents silent failures.

    In Goto, error handling is simple: if something goes wrong, the program should panic immediately. There's no elegant way to recover from an error. The goto.panicOnFailure function is a core part of the language. When a function fails, the program immediately crashes with a cryptic message like "goto: something broke, but who knows what or why?" It's a "fail-fast, fail-hard" philosophy that ensures the program is as unstable as possible.
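
    For contrast, a minimal sketch of Go's explicit error handling (the file name is illustrative): the failure is a value you inspect and deal with, not an invitation to crash.

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        f, err := os.Open("config.toml")
        if err != nil {
            // Handle the failure deliberately instead of panicking.
            fmt.Fprintln(os.Stderr, "could not open config:", err)
            return
        }
        defer f.Close()
        fmt.Println("opened", f.Name())
    }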

    Why would anyone use Goto?

    Perhaps as a teaching tool to highlight exactly what not to do, or as a comedic relief for seasoned developers. It's a reminder that good security practices aren't just a list of rules but a mindset woven into the very fabric of a language and its ecosystem.


    In addition to all of that insanity, I'd propose two new variable types called capricious and unreliable. 🤣

    0
  • tomgrzybow@societas.online
    2025-09-09

    These concerns are not exclusive to open source, but open source projects typically have little budget and are very widely used.

    Commercial developers are infamous for their lack of investment in QC and security concerns.

    0
  • tomgrzybow@societas.online
    2025-09-09

    It’s also noteworthy that many commercial software products and systems are built atop BSD- and MIT-licensed open-source libraries. They do not rewrite them!

    Android's kernel is actually GPL, but it is the apps which are its real weakness.

    0