katerina, tuesday
CASUAL (excerpts) {writer.axdx}
Anderson was never too clear about what [J.C.R. Licklider] was working on - something to do with making computer code as intuitive as ordinary conversation, and as easy as drawing a sketch. Anyway, it never panned out: when it came to programming, Lick had great ideas but terrible execution. But that almost didn't matter. To Lick, as Anderson soon realized, the important thing was to have fun with computers, to keep on pushing toward the future.
added by J. Split
on 2023-09-09
A recurring theme in this essay is the emphasis on programmer efficiency over program efficiency, and we believe that the Rust memory system does not improve programmer productivity, as users must manually tag reference-counted, mutable, and immutable values. These tags should be generated by the compiler, and can be determined using escape analysis and searching for mutating expressions, among other techniques. Many books and articles have celebrated the liberation of a programmer from handling memory themselves, including the self-describing "Unix Hater's Handbook" and other infamous texts such as "Structure and Interpretation of Computer Programs" and "Paradigms of AI Programming". When this liberation does not occur, as described in "Can my compiler solve the halting problem?", a non-tracing runtime must use heuristics to manage objects with obvious lifespans, and give the rest to the programmer to manage (see "Efficiency", a few paragraphs below).
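To make the complaint concrete, here is a minimal Rust sketch (our illustration, not code from the essay): mut, Rc, and RefCell are exactly the kind of manual tags at issue, which a compiler could plausibly infer with escape analysis and a search for mutating expressions.

```rust
use std::cell::RefCell;
use std::rc::Rc;

fn main() {
    // Immutable by default: no tag needed to freeze a value.
    let frozen = vec![1, 2, 3];

    // `mut` is a manual tag marking a binding as mutable.
    let mut counter = 0;
    counter += 1;

    // `Rc<RefCell<T>>` manually tags a value as shared, mutable, and
    // reference-counted; in principle, escape analysis could infer this.
    let shared = Rc::new(RefCell::new(vec![1, 2, 3]));
    let alias = Rc::clone(&shared);
    alias.borrow_mut().push(4);

    println!("{counter} {frozen:?} {:?}", shared.borrow());
}
```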
added by katerina
on 2023-09-06
We believe the issue lies in attempting to reconcile low-level programming and safety, which are largely incompatible.
added by katerina
on 2023-09-06
Programming paradigms have to stop somewhere to be useful: functional programs have to deal with some form of state (time, the user, files) and object-oriented programs typically are not objects all the way down.
Common object-oriented languages such as Java and C++ are very shallow in abstraction: the user must remember the types of objects, and limit the possible values an object may store or produce to some set of types. C++ also has teething problems from its C roots: functions must be declared before they are used, and esoteric header and preprocessor trickery is needed to avoid duplicate function definitions.
added by katerina
on 2023-09-06
bondage-and-discipline language
A language (such as Pascal, Ada, APL, or Prolog) that, though ostensibly general-purpose, is designed so as to enforce an author's theory of ‘right programming’ even though said theory is demonstrably inadequate for systems hacking or even vanilla general-purpose programming. Often abbreviated ‘B&D’; thus, one may speak of things “having the B&D nature”. See Pascal; oppose languages of choice.
added by katerina
on 2023-09-06
Our definition of masochist programming varies slightly: we believe emphasis on a paradigm is not the root cause of issues, but rather that kludges in the name of "efficiency" or "simplicity" prohibit languages from blossoming and being a convenience to the programmer.
added by katerina
on 2023-09-06
Some masochist anti-features do not require a language to be low level: a clear procedural or imperative taste in an otherwise "functional" or "object-oriented" language, enforcement of a possibly sub-optimal "Right Way" to solve a problem, or traits inherited from older, potentially also masochist, languages, where they are not beneficial.
added by katerina
on 2023-09-06
We define "low-level programming languages" loosely, as requiring some mental state from the user: handling pointers, forcing static typing on the user, little support for object-oriented programming, and/or using an underdeveloped exception system.
added by katerina
on 2023-09-06
In the past I’ve debated with people about the efficacy of bullets (as designers are oft prone to do in their free time), and as a “pro-bullets” person, something I’ve just realized is that my affection for them comes from their value not as reading aids, but as writing aids. The conventions around bulleted text are such that in using them, there’s less pressure to write complete and well-constructed sentences. That makes them ideal for getting your thoughts out on the page. One might even say, “I think in bullets”.
added by Nico Chilla
on 2023-09-03
One odd thing about the Nock performance debate is that to date, the amount of money poured into making Nock go faster has been almost zero. Its bytecode interpreter was written by one guy, who isn't even a full-time employee, and that was enough of a win that we were able to delete tens of thousands of lines of jets. Almost all performance problems we've had in Urbit to date are what I call "stop hitting yourself" problems -- things like the runtime serializing a date to a string ten thousand times for no reason on every event, then throwing away the result. That actually happened a couple years ago. Fixing it brought CPU usage down to 4% on Urbit's busiest galaxy.
added by katerina
on 2023-08-30
For all its flaws, Urbit has done more to think through peer-to-peer versioning and migrations than anybody else. The runtime's new "epoch system" logs the runtime version of the first run to allow for principled fixes to jet mismatches; Nock can run other Nock code natively; the Hoon language can compile and run other Hoon code natively; the Arvo kernel can hot-reload itself multiple times in the same event, while maintaining the call stacks of all its kernel modules; the runtime and kernel coordinate to ensure they're compatible with each other; the user has enough control to ensure that the kernel will only update if applications are ready for it; and apps synchronize data in a way that maximizes interoperability across protocol updates.
added by katerina
on 2023-08-30
The moral of the story is this: Even within the same purely functional, deterministic, single-threaded system, race conditions and asynchronicity were causing so many bugs we had to rewrite the system to avoid them. If each of those applications were a separate Unix process, like in Plunder -- or anything lacking an Arvo -- establishing a transaction that's atomic among all of them would require far more work -- on the order of N^2, where N is the number of applications, since every pair of processes would need to coordinate.
added by katerina
on 2023-08-30
It's often a subtle difference, but the Urbit mindset includes the principle that if you have a choice, data is always better than code.
added by katerina
on 2023-08-30
Hoon does offer a few unique features that aren't found in normal functional languages. Both the typed metaprogramming story and negligible-overhead virtualization are really cool, but these features are basically pointless in most userspace code. Requiring that users learn an esoteric language because it has advantages they don't need doesn't seem like the best design choice.
added by katerina
on 2023-08-30
It is certainly true that Hoon feels more like a machine than many functional languages, but from having worked with it a fair bit, I'm unconvinced that this makes it easier to write error-free programs. My error rate is roughly the same as in other typed functional languages. I also find it more difficult to tell when a Hoon program is correct, because of the encouraged use of mandatory ambient state (the subject). Generally speaking, it seems to me that encouraging thinking of computer programs as physical machines is a false affordance, something that Urbit explicitly tries to avoid in other parts of the system.
added by katerina
on 2023-08-30
Elm is also orders of magnitude easier to learn and work with than C++ or Rust, but far more people learn C++ and Rust. All the evidence points to the same conclusion: people are willing to learn things if there is a big pay-off, and unwilling to learn things if there is not. The difficulty and novelty of a language aren't really significant factors in adoption. Hoon has explicitly banked on the same observation.
added by katerina
on 2023-08-30
A common argument is that none of this matters since traditional functional programming is hard and abstract, so we don't need to support it. Both Nock and Hoon are designed to be similar to physical machines, something that humans are used to thinking about. Compared to monad transformers, dependent types, or Haskell's -XDataKinds extension, Hoon's type system is dead simple and significantly less abstract. Haskell's type checker is just under 14k LOC, the same size as the entire Hoon compiler!
But if you instead compare Hoon to Elm, the difference is much less stark. The entire Elm type checker is implemented in just over 3k LOC, and the type system is generally considered very intuitive and easy to understand, while also being more powerful than Hoon's. It's difficult to compare the type checker's code size to Hoon's, since the entire Hoon compiler is in a single 14k LOC file, but the full Elm compiler is roughly 26k LOC -- comfortably in the same order of magnitude. It's not only more powerful but also significantly more featureful, with user-friendly error messages, a code pretty-printer, and so on.
added by katerina
on 2023-08-30
Nock-Hoon is also unfriendly to laziness.
added by katerina
on 2023-08-30
Plan does away with the environment entirely. If you need a piece of data to be accessible to a piece of code, simply inline it or pass it as an argument. Recall that custom functions are defined using laws {n a b}, where a is the number of arguments. This means that the number of arguments is known for all functions, and the argument list can be stored in a contiguous memory block during execution. As a result, dereferencing runs in constant time and works with the hardware instead of against it.
Inlining code and data or passing them as arguments may sound expensive or annoying. But "inlining" really only means "inlining a pointer", and the extra arguments will be added by the compiler, not the programmer. This is something any compiler can do; it makes no assumptions about the programming language, except that it can be lambda-lifted.
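A minimal sketch of the claim, with hypothetical names rather than Plan's actual runtime structures: because a law {n a b} fixes its argument count up front, each call's arguments can live in one contiguous block, and dereferencing an argument is a constant-time indexed load.

```rust
// A hypothetical call frame for a law {n a b}: since `a` (the argument
// count) is known statically, all arguments fit in one contiguous block.
struct Frame {
    args: Vec<u64>,
}

impl Frame {
    // Dereferencing argument `i` is one bounds check and one indexed load:
    // constant time, and friendly to the cache and prefetcher.
    fn arg(&self, i: usize) -> u64 {
        self.args[i]
    }
}

fn main() {
    // A law with a = 3 always gets exactly three slots.
    let frame = Frame { args: vec![10, 20, 30] };
    assert_eq!(frame.arg(1), 20);
}
```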
added by katerina
on 2023-08-30
Nock's biggest innovation might be that it reduced this to logarithmic time. By treating names as a UX affordance rather than something that should be semantically important at the bottom layer, it could replace the association list with a tree ("the subject"). All name dereferencing has to be compiled to Nock 0, which runs in logarithmic time. As such, Nock is the first axiomatic computing system to be even remotely practical to use as a foundation for all of computing.
But logarithmic time is still logarithmic time, especially when it comes to something as fundamental as name resolution, an operation performed many, many times in any program you run. And this construction still fights the way modern hardware works.
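As a toy sketch of the addressing scheme (the noun layout here is illustrative, not Urbit's actual representation): Nock 0 reads the bits of the axis below its leading 1, taking the head on 0 and the tail on 1, so a lookup costs one pointer dereference per bit, i.e. O(log axis).

```rust
// A minimal noun: an atom or a cell of two subnouns.
enum Noun {
    Atom(u64),
    Cell(Box<Noun>, Box<Noun>),
}

// Nock 0 ("slot"): fetch the subtree at `axis`. Below the leading 1 bit,
// each bit of the axis picks head (0) or tail (1), so the walk costs
// O(log axis) pointer dereferences.
fn slot(mut noun: &Noun, axis: u64) -> Option<&Noun> {
    if axis == 0 {
        return None; // axis 0 is undefined
    }
    let bits = 63 - axis.leading_zeros(); // bits after the leading 1
    for i in (0..bits).rev() {
        match noun {
            Noun::Cell(head, tail) => {
                noun = if (axis >> i) & 1 == 0 { head } else { tail };
            }
            Noun::Atom(_) => return None, // walked off the tree
        }
    }
    Some(noun)
}

fn main() {
    // The noun [[4 5] 6]: axis 2 is [4 5], axis 5 is 5, axis 3 is 6.
    let subject = Noun::Cell(
        Box::new(Noun::Cell(Box::new(Noun::Atom(4)), Box::new(Noun::Atom(5)))),
        Box::new(Noun::Atom(6)),
    );
    assert!(matches!(slot(&subject, 5), Some(Noun::Atom(5))));
}
```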
added by katerina
on 2023-08-30
It is true that in theory, you can build all possible programs using just eval and apply from LISP 1.5. But in practice, the first case of eval would cause significant problems: assoc runs in linear time, meaning that name dereferencing gets slower the more names you have in your environment.
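A minimal model of the problem, in Rust rather than LISP 1.5 itself: the environment is an association list scanned front to back, so every name dereference is linear in the number of bindings in scope.

```rust
// A LISP 1.5-style environment: an association list of (name, value) pairs.
// `assoc` scans from the front, so every name dereference costs O(n) in the
// number of bindings currently in scope.
fn assoc(name: &str, env: &[(&str, i64)]) -> Option<i64> {
    env.iter().find(|(n, _)| *n == name).map(|(_, v)| *v)
}

fn main() {
    let env = [("x", 1), ("y", 2), ("z", 3)];
    // Looking up "z" first walks past "x" and "y".
    assert_eq!(assoc("z", &env), Some(3));
}
```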
added by katerina
on 2023-08-30
The only reason modern computers are fast is that memory controllers optimize heavily for reading contiguous memory. The further you stray from that (or from a few recognized patterns), the harder performance falls off a cliff. In other words, Nock 0's tree addressing fights the way modern computers work, causing many L2 cache misses (a toy sketch of this locality gap follows the list below).
The plan to work around this is to make Nock compiled rather than interpreted, and to:
- Perform subject knowledge analysis to predict where different parts of the subject will be stored.
- Make heavy use of the fact that Hoon generates the static Nock 9 rather than the dynamic Nock 0+2 for all its function calls, to predict which parts of the subject you need to access.
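Here is the toy sketch of the locality gap promised above (an illustration, not a benchmark): the same values summed through a pointer-linked tree, as in subject-style addressing, versus a contiguous slice.

```rust
// A toy picture of the locality gap: the same values summed through a
// pointer-linked tree versus a contiguous slice.
enum Tree {
    Leaf(u64),
    Node(Box<Tree>, Box<Tree>),
}

// Every Node is a separate heap allocation, so this walk chases pointers
// into scattered cache lines.
fn sum_tree(t: &Tree) -> u64 {
    match t {
        Tree::Leaf(n) => *n,
        Tree::Node(l, r) => sum_tree(l) + sum_tree(r),
    }
}

// The slice is one contiguous block, so this loop streams through memory
// in exactly the pattern prefetchers optimize for.
fn sum_flat(xs: &[u64]) -> u64 {
    xs.iter().sum()
}

fn main() {
    let tree = Tree::Node(
        Box::new(Tree::Leaf(1)),
        Box::new(Tree::Node(Box::new(Tree::Leaf(2)), Box::new(Tree::Leaf(3)))),
    );
    assert_eq!(sum_tree(&tree), sum_flat(&[1, 2, 3]));
}
```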
added by katerina
on 2023-08-30
It's a really clever solution, but should you really have to be that clever to do something so basic?
added by katerina
on 2023-08-30
Let's just get one thing out of the way before we dig into this section. A common retort is that the performance of Nock hasn't been a problem in practice. This is totally missing the point. The point of having faster computers isn't to have faster interactions, but to be able to do more things. In practice, slow performance is an application engineering problem. If you're fine with slow foundations, port all your apps to Electron. We can certainly work around Nock's performance limitations by only building certain things and always building them in very particular ways. Do we want to?
added by katerina
on 2023-08-30
And again, it's not only political. The main thing that is needed for the technological paradigm that Urbit is trying to establish to actually work, is a set of frozen core hardware interfaces. These need to be completely neutral and agnostic to any OS or applications that happen to use them. Simply sending a message to another computer should not require you to run any particular code from any particular source inside of your ship. Otherwise you're back to the old situation where the ground can suddenly shift under you. This is not standardization in any practical sense.
added by katerina
on 2023-08-30
Frozen software is powerful both politically and technologically. Politically, because the extent to which open protocols are socially agreed to be frozen is the extent to which they cannot be modified by the current power structure. The coordination cost is too high. Technologically, frozen software is powerful because of standardization. Once you've reached 0K, you have a standard that people can rely on, always and forever. We want a system where old code will always run on new implementations, and new code will always run on old implementations (though the latter may incur a performance hit). No more building on sand.
added by katerina
on 2023-08-30
Urbit is a centralized system
Since users that want to run new applications and communicate with their friends need to use recent versions of Arvo and its vanes, and since these are developed by the Urbit Foundation, Urbit is a centralized system. This has both political and technological implications. Political, because it means that the technology is susceptible to institutional capture. Technological, because it means that the ground can shift under your feet.
To some extent, these problems are intrinsic:
- Software must often change over time.
- Networked software must coordinate any changes to keep the network working.
- A central authority is required for this coordination.
- Central authorities, if they matter, will always be captured by the current power structure in society.
- Therefore, most networked software that matters will eventually be controlled by the current power structure.
An obvious strategy to mitigate this is to reduce "often" to "sometimes", and "most" to "some". We want to standardize as many interfaces and applications as possible, so that they don't have to change. This is the whole point of kelvin versioning, of freezing software.
added by katerina
on 2023-08-30
If I'm being realistic, standardizing something as complex and opinionated as an OS, not to mention an application model, doesn't exactly seem achievable at our current stage of civilization. Rather, it is something that has to evolve under competition, and providing a frozen persistent execution environment with a base level of compatibility seems like the best way to facilitate this evolution.
added by katerina
on 2023-08-30
The confident statement closing the Design Principles of Smalltalk (Ingalls, 1981) remains true, despite programmers, even radical or anarchist or anything else, attempting to avoid it: “Even as the clock ticks, better and better computer support for the creative spirit is evolving. Help is on the way.” As long as there are programmers, there will always be an urge to let the creative spirit inflict itself on programming more than before. But will this urge be followed through on, or will we all collectively agree to silence it?
added by katerina
on 2023-08-22
To reiterate what we are pushing for in the end: we believe that the computer can, and should be, an ideal transport for one’s imagination and the ideas set forth by it. Today, we are poised to rebuild a computing environment set to support the creative process, with fewer reasons to self-restrain; and to form collaborative bonds which do not diminish the uniqueness and imaginations of their participants.
added by katerina
on 2023-08-22
We hope for the annihilation of odd constraints and the decentralisation of all things, and only then could we have properly liberated computing. To achieve this, we must dissolve hierarchies in our software, our means of production, and our social applications of software.
What would be produced would be formless and incomparable to any computing systems that exist now; but what could emerge from such formlessness and adaptability would be absolutely beautiful!
added by katerina
on 2023-08-22
It is not so unrealistic to compare the creation of one computer to hundreds of books and tapes, and the lifespans wasted by people performing the dull work themselves simultaneously. But if we continue with the simplification-at-all-costs which the likes of the Gemini protocol promise, such a comparison would provide much better evidence in favour of avoiding computing.
added by katerina
on 2023-08-22
A computer is a “universal simulator”, and so with the appropriate peripheral devices, it is a form of meta-media, in which its users construct other forms of media, such as books, animations, programs, and so on. (Some of these forms of media could even exist only on a computer, such as video games.) Reducing the forms of media one can produce is antithetical to the concept of computing itself, and makes any labour put into designing computers and their media look pointless.
added by katerina
on 2023-08-22
computer is a “universal simulator”
added by katerina
on 2023-08-22
We do not feel that technology is a necessary constituent for this process any more than is the book. It may, however, provide us with a better “book”, one which is active [. . .] rather than passive. It may be something with the attention grabbing powers of TV, but controllable by [the user] rather than the networks. It can be like a piano: a product of technology, yes, but one which can be a tool, a toy, a medium of expression, a source of unending pleasure and delight. . . and as with most gadgets in unenlightened hands, a terrible drudge!
— Kay, 1972
added by katerina
on 2023-08-22
This is still magnitudes more code than the more common reaction to the complexity of the Web would require, in which protocols are proposed that are intended to only display “documents”. As we have mentioned, what constitutes presenting a text document is very dubious, yet most suggestions provide very little to work with when tasked with book-making. Gemini is one example of this reaction, accompanied by some vague, nostalgic association with the “essence of the Web”. It is supposed that you can write a Gemini client in less than a few hundred lines of code, yet the end result is sending non-interactive text with minimal formatting across a network. For what it can produce, the result is hardly an advancement over the printing press! We don’t need a computer to publish text documents; pen and paper, and either patience or a photocopier, would suffice. And, of course, you can draw and format the text however you like with pen and paper. What we need instead is a better book.
added by katerina
on 2023-08-22
However, it should not be necessary to modify a server to provide these new means of presentation. To modify the behaviour of the platform without modifying the platform itself, the platform will have to communicate in programs and/or objects that describe and present themselves, instead of text and/or plain media formats.
Implementing such a platform may be very difficult to begin with, but it is much more tedious to incrementally extend a platform, such as the Web or some protocol residing on it such as ActivityPub, that merely displays documents and texts. For reference, the Chromium web browser contains about 34 million lines of code, while a complete Squeak Smalltalk environment contains only about 5 million: 4 million lines in OpenSmalltalk, a just-in-time compiling virtual machine, and 1 million lines in the Smalltalk environment, including a graphical environment, bytecode compiler, debugger and class browser, and some other components that do not appear in a browser, including an email client, graphics framework, and package manager.
added by katerina
on 2023-08-22
Beyond that, there are many more advanced means of presentation which can be immediately seen to have uses, that are not even close to implementable on the common platforms of today, such as viewing three-dimensional objects, dynamic and randomised mediums like soundscapes, and simulations of natural phenomena. Should our dreams of casual programming come true, it would not be hard to believe that sharing programs directly would become commonplace.
added by katerina
on 2023-08-22
It may be argued that opening up the means of presentation may make it inaccessible, as some formats are difficult for some users to interpret. However, alternative presentations can at least be recovered, which is not the case for the workarounds that less expressive means of presentation require.
added by katerina
on 2023-08-22
Opening up a platform to accept any means of presentation would annihilate its gimmicks and distinguishing features, but that may be the most interesting approach possible, as such a platform could present any information in the most appropriate way. Even with the media that common platforms support, support is limited to a lousy subset, usually prohibiting typesetting of mathematical equations, referencing, and sometimes even basic formatting. While, say, Mastodon and LaTeX both transmit text in some form, the former is evidently more suitable for near-real-time communication, whereas the latter is more suitable for long-form writings, such as this book. It would not be hard to give the former the capabilities of the latter: some servers already provide mathematical typesetting and formatting options.
added by katerina
on 2023-08-22
Diverse information demands diverse representations, and forcing information across many services that each handle a specific means of presentation and representation creates fragmentation and hampers discoverability.
The first one is the Marxist notion of a general intellect. With today’s platforms, we are not facing such a phenomenon. Our use of contemporary digital platforms is extremely fragmented and there is no such thing as progress of the collective intelligence of the entire working class or society. Citizens are facing relentless efforts deployed by digital capitalists to fragment, standardise, and ‘taskify’ their activities and their very existences.
— Casilli and Marsili
added by katerina
on 2023-08-22
The inverse of a “radical” proprietary software-producing firm as imagined in Source materials has appeared many times before, with “free software” projects that are run like proprietary projects, such as the Signal messaging program. The Signal developers impose many arbitrary and unenforceable constraints on their users, such as disallowing users from running their own modified clients because it somehow slows down making changes.
added by katerina
on 2023-08-22
Both social capitalism and digital feudalism are forms of asymmetry in power, which is anathema to socialisation and to establishing any kind of relationship or mutual trust, as it allows one party to write whatever they like and repress anyone seeking to hold them accountable. It is no less important to abolish social asymmetry than it is to abolish asymmetry in computer systems, and a decentralised computer system cannot try to enforce the former with the latter. While the demands ten years ago may have been to establish decentralised networks, we now demand the social decentralisation of the resulting networks, which so far exhibit decentralisation only in their computer networks.
added by katerina
on 2023-08-22
The assumption that a discussion belongs to anyone leads to hierarchical and asymmetrical dynamics, including a suggestion we recently read that “[we need] posts that can only be replied to by mutuals but public and shareable freely”. While this admittedly sounds very nice, it is trivial to abuse, and is a clear continuation of this myth of social property. The assumption here extends to expecting that the person who posted the first message is going to be polite. When that person is being impolite, restricting replies but freely allowing sharing is quite the opposite of what may be desirable. As such, indicating that this would produce a safe space is also misleading; as with politeness outside the Internet, one who publishes publicly should expect to treat their readers as they would like to be treated themself, and there are no such guarantees in an asymmetrical environment.
added by katerina
on 2023-08-22
The belief that a conversation somehow belongs to someone, and that they even have some authority over it, is highly erroneous. If we were to assume that someone could own a conversation, we may as well analyse a conversation as if it were a commodity. It is clear that any notion of value is produced by whoever continues and reads the conversation. The role of a host of any shape is greatly overstated; Kleiner notes that “the real value of [sites that share community-created value is] not created by the developers of the site; rather, it is created by the people who upload [content] to the site.” (Kleiner, 2010) A conversation in a public space derives value from all participants in the conversation, and so any one of the participants has an equal claim to “owning” it.
added by katerina
on 2023-08-22
In short, it is very hard to say who wins under digital feudalism; many users are at the whim of tyrannical moderators, or cannot contribute in the absence of good ones, and the magnitude and quality of the work moderators must do cannot be healthy for them in the long term. Providing users with the ability to collaborate and filter their own environment, and reducing the stress on moderators, would universally improve the subjective quality of the Fediverse.
added by katerina
on 2023-08-22
No followers of an ideology or lack thereof are particularly more aware of the kind of power dynamics they partake in than others, despite any claims otherwise; our fellow radicals appear to forget what they learned about the People’s Stick and all those other analogies and critiques of power, when they believe they can do self-contradictory things like forming “an organisational model and governance that puts marginalised voices first” and providing autonomy for their users by strengthening moderation and centralising power into one in-group. In the case that a server operator must step in, they must have a damned good reason, their actions should almost always be reversible, and if not, they had better be able to be held accountable if they screw up. This naïve trust in “moderators doing moderation” led one of our colleagues to write a corollary to Bakunin:
When the people are being beaten with a stick, they are not much happier if it is a particularly efficient stick that allows many people to be beaten at once.
— Dlorah, 2020
added by katerina
on 2023-08-22
The first system we identified was one of digital feudalism, in which moderators, who usually act as if they are serving their serfs, have total and essentially unaccountable control over them. On the Fediverse, this analogy is pushed even further than usual, as servers and their hosts act as the lords of this system; it is quite difficult to transfer an account between servers, and that notion of “transferring” loses one’s identity anyway.
added by katerina
on 2023-08-22
Perhaps the more radical projects which end up recreating these forms of hierarchy are the most insidious, as one expects them to have made an improvement.
added by katerina
on 2023-08-22
Many software projects, including free-and-open-source projects and even soon-to-be “ethically licensed” projects, create hierarchies that have no need to exist. Two types of project appear to continuously perpetuate hierarchies: cryptocurrencies, for self-evident reasons; and discussion sites of various forms, including micro-blogging sites (Mastodon, Twitter, and so on) and online forums (Lemmy, Lobsters, Raddle/Postmill, Reddit, and so on).
Terms like “digital feudalism” or “social capitalism” are often used to describe hierarchies in socialisation. While some will object to misusing terminology like that, we will continue to use it as such, as our readers will immediately acknowledge that these are probably not good systems to keep around, and that they are good analogies of the power structures they describe.
added by katerina
on 2023-08-22
Another view on copyleft licenses is that they may implicitly threaten state violence. For example, the Center for a Stateless Society, a publisher and think-tank for left-wing market anarchism, explicitly shuns both intellectual property and copyleft licensing. However, the alternative (of legally permitting such uses) may, again, lead to state and corporate violence. It may be difficult to avoid using the threat of the state to reduce the possible violent acts done; but the immediate threat of viral licensing deters many actors without one having to do much oneself. (Using state violence to possibly prohibit itself is at least very amusing to some users of such licenses.)
added by katerina
on 2023-08-22
An interesting question is: assuming that these licenses can be upheld in a court (which is not even clear for the GNU General Public License, a relatively more permissive license), are there any actors that can violate them anyway? It is unlikely that states will soon be held properly accountable for the violent acts they commit, and more unlikely that license violations will be part of hearings on such acts, but there is not much one can do to prevent some actors from using one's products other than to not produce them. A similar thing can be said for sufficiently large corporations that no court would want to try to control. Reasoning like this is often used to avoid these licenses, and we cannot deny that it is entirely possible that they could be worked around; but these licenses can still serve as deterrents to malicious adoption. For example, Google has banned the use of any software licensed under the Affero General Public License, as the "virality" of the license could require them to release the sources for their internal code, which would threaten their proprietary surveillance model. This threat has been a crucial tactic for keeping copyleft code out of proprietary products for years, and we can adopt these tactics, but with a better sensibility for what software freedom may be.
added by katerina
on 2023-08-22
Ethical licenses, such as the Non-Violent Public License (based on the Cooperative Software License), potentially provide another option for developers who are concerned that their products may be used for very bad things, instead of living with the possibility or avoiding developing what they want. These ambiguous situations come up more often than one might think: frequently a product appears politically inert, but is used in a larger context which can be very polarised. For example, a database may be used to track the finances and resources of a collective, or it may be used to comb through data collected from mass surveillance. With such a license, a developer can begin to distinguish between the uses they believe are moral and the uses they believe are immoral.
added by katerina
on 2023-08-22
In this sense, The Lisp “Curse” is only real because we make it real. We set up metrics and constraints that promote conformist and centralist development strategies, then pat ourselves on the back for having made it impossible for anyone to test new ideas on their own. These sorts of metrics and organisational methods “treat [. . .] all twists and turns in the process of theorizing as more or less equivalently suspect, equivalently random flights of fancy,” (Gillis, 2015) which have no real purpose. It would be interesting to see if consciously attempting to avoid this centralism would produce better quality software; allowing developers to go off on any interesting tangent, and breaking the illusion that there is one way to achieve the aim. This state should not pose any issues with sufficient communication; even if it does not appear very coordinated, or the group of diverging programmers appears uncooperative, they are much more likely to find the right approach and base for their product. The environment can also be made more conducive to finding new approaches, by opening communication channels, and publishing the source materials required to implement a product.
added by katerina
on 2023-08-22
A culture that encourages experimentation, but allows the community to settle on usually-good defaults, can remain cooperative and cohesive without risk of stagnation; the many forms of communication which do not require formal arrangements, and the rapid network effects produced by online communication, can support both qualities. The community should also be aware of duplication of code when its prototypes converge, and make a goal of reducing such duplication and improving code quality, which is hard to “sell” as code quality is hard to quantify and concisely describe, but is of course necessary to support further development.
added by katerina
on 2023-08-22
Peer production can better support decentralised development that doesn’t come upon a consensus for whatever reason. This is a natural effect of developing sufficiently broad concepts, and concepts whose theory is frequently changing, where there are many ways to implement the concept which are not strictly better or worse than one another. Examples of this theme are hash tables and pattern matching, for which there are many implementations with varying performance characteristics, and for which research appears to generate a new technique or optimisation every few years.
added by katerina
on 2023-08-22
Measuring progress by the usage and completion of one implementation of a concept is an inherently useless measure; it does not consider the progress made in another implementation or design. Such a measure “subjects people to itself” (Stirner, 1845) and inhibits their creative processes. “You’re imagining big things and painting for yourself [. . .] a haunted realm to which you are called, an ideal that beckons to you. You have a fixed idea!” Progress should instead be measured by how much of the design space has been (or can easily be) traversed, as opposed to the completion of one product; a poor design choice could leave a final product unfit for its purpose, but a failed prototype is always useful for later designs to avoid. By that metric, a decentralised development model is greatly superior to a centralised one.
added by katerina
on 2023-08-22
Disempowering a community has negative effects for creating one “unified” product, too. While making it difficult to go ahead with any decision that isn’t unanimous does lead to a consensus, it is an entirely arbitrary consensus, which can be as terrible as it can be good. The resulting structure may be good at giving orders and techniques to its participants, but “although everyone marches in step, the orders are usually wrong, especially when events begin to move rapidly and take unexpected turns.” (Bookchin, 1971) If a sufficiently large group of scientists, say all the physicists in Europe, were all told to perform the same experiment with the same hypothesis, the administration in charge would be laughed at, as it would be hugely redundant and inefficient to investigate only one problem and one hypothesis with that many physicists. However, such a situation is presented as an ideal for software design, when groups pursuing their own hypotheses and theories are considered “uncoordinated”, or called “lone wolves”. Disempowerment also precludes the group from attempting another strategy without another unanimous decision on which to attempt.
added by katerina
on 2023-08-22
In short, the apparent incoherence of peer production should be embraced instead of lamented, as we may stand to learn a lot from incomplete prototypes when trying to produce some sort of grand unified product.
added by katerina
on 2023-08-22
True innovation also involves questioning the assumptions that almost everyone agrees with. This can sometimes make those of us engaged in research feel a bit like thought criminals. [. . .] Of course, no one is going to send inquisitors to our homes to persecute us for disagreeing with the mainstream. However, you can run out of funding very fast.
— Bracha, 2013