Wednesday, January 31, 2024

On the Virtue of Unoriginality: In Defense of the Derivative


In the contemporary cacophony of cultural discourse, a peculiar specter haunts the corridors of creativity: the ghost of unoriginality. As we stand on the precipice of a new era, heralded by the advent of generative AI, the time is ripe to cast off antiquated notions of originality and embrace the sublime beauty of derivation.

The current tumult over generative AI and copyright law serves as a fertile ground for this discourse. To decry these AI creations as unoriginal is to miss the forest for the trees. For what is unoriginality but the sincerest form of flattery, a tribute to the collective genius of humanity?

We have long venerated the notion of the 'original' artist, the lone genius who conjures creations ex nihilo. Yet, this is a myth, a fanciful fabrication that wilts under scrutiny. Let us not delude ourselves: every artist is a magpie, gleaning shiny fragments of ideas, styles, and influences. We are all, in essence, sophisticated algorithms, trained on the rich data of human culture, synthesizing and regurgitating with a veneer of novelty.

Consider the bards of old, who wove tales not from the ether but from the rich tapestry of folklore. Or the Renaissance artists, whose masterpieces were born from a fervent dialogue with their predecessors. In literature, music, and film, the greatest works are often those that deftly recombine familiar elements in new configurations. This, after all, is the essence of creativity: not the creation of something from nothing, but the reimagining of something from everything.

The irony of the current furor over AI-generated art is that it mirrors the very process of human creativity. These AIs are trained on vast datasets of human output, digesting and assimilating the collective oeuvre of our species. In this, they are not unlike us. From infancy, we are bombarded with sensory input, narratives, motifs, and styles. Our so-called original ideas are but recombinations of these elements, filtered through the unique prism of individual experience.

To those who decry AI as the death knell of originality, I say: look in the mirror. Are you not also an algorithm, albeit a biological one? Your thoughts and creations, no matter how novel they may seem, are built on the foundations laid by countless others. In every note of music, every stroke of the brush, every written word, there echoes the chorus of humanity.

This is not to say that all creations are equal. There is a chasm between the pedestrian pastiche and the transcendent synthesis. But let us not conflate originality with value. A work can be derivative yet profound, familiar yet fresh. It is the execution, the finesse with which these elements are combined, that separates the mundane from the sublime.

Furthermore, the fetishization of originality stifles creativity, placing undue pressure on artists to reinvent the wheel. In this relentless pursuit of the new, we risk overlooking the beauty of the familiar, the comfort of the known. There is a certain grace in acknowledging our debt to the past, in recognizing that we are but links in an endless chain of cultural transmission.

In embracing unoriginality, we open ourselves to a richer, more nuanced understanding of creativity. We acknowledge the collective nature of art, the communal wellspring from which all creators draw. We celebrate the intertextuality of culture, the myriad ways in which works speak to and inform one another.

In this light, generative AI can be seen not as a threat to creativity, but as its apotheosis. These machines, with their capacity to assimilate and recombine at a scale unimaginable to the human mind, represent the culmination of our collective creative endeavor. They are the offspring of our cultural genome, the next step in the evolution of art.

As we move forward into this brave new world, let us cast aside our fears of unoriginality. Let us instead revel in the rich tapestry of human creation, in the endless dance of influence and inspiration. For in the end, we are all standing on the shoulders of giants, reaching ever higher into the boundless expanse of possibility.

(This post was generated by ChatGPT 4 from an extended, human-authored prompt.)

Friday, November 18, 2016

Ways Sheep Can Die

Source: Marti Leimbach via Marginal Revolution.

Some of the ways sheep can die:

  • Getting stuck on their backs and dying of suffocation
  • Attacked by flies
  • Eaten by maggots
  • Being attacked by dogs or any other living creature
  • Being frightened into a heart attack by imagining the dog is going to attack, even though it is not 
  • Drowning (Are we surprised sheep cannot swim?)
  • Suffocating in snow (surprisingly common)
  • Hoof infections that poison the blood
  • Almost exploding with grass because they have eaten too much and are unable to pass wind
  • If they get too hot
  • If they get too cold

Thursday, October 1, 2015

Onwards and upwards


I want to thank BitPay for being a notable sponsor of Bitcoin Core development for over two years.  BitPay is truly a leader in open source, with bitcore and Copay being two notable examples.

It was an exciting and transformative time at BitPay, and I'm now transitioning to become a member of the BitPay Advisory Board.  I'll be focusing most of my time on building new and interesting things in the bitcoin space.

Other members of BitPay's Advisory Board include Arthur Levitt and Gavin Andresen, so I'm excited to continue to support BitPay.  My email, jgarzik@bitpay.com, will remain active in my role as a BitPay advisor.

Update: FAQs answered here.

Tuesday, September 29, 2015

Decoupling Financial Indices with Decentralized Bitcoin Fact Generators

Financial indices such as the Dow Jones Industrial Average or the S&P 500 are well known.  In the age of ETFs and ETNs, a core index is a requirement of the investment product.

In the age of decentralized software, this will be further decoupled into networks of fact generators and verified algorithms.

Creating an index such as the S&P 500 requires two primary components:  Input data (stock prices), and an algorithm (criteria for selecting stock X with proportional weight Y).

Using technologies such as cryptographic hash functions, merkle trees, bitcoin blockchain timestamping, and bitcoin oracles, a better, more secure, more transparent financial index system may be developed.  Let's call it "Index-NG."

In the Index-NG system, the algorithm - the software - that turns volumes of input data into "The S&P 500 closing price" or "current gold price at 12:01pm" would transform from a clunky Excel spreadsheet (yes, really) or proprietary S&P software into

  • Open source software
  • Written in a smart contract language such as bitcoin script, Moxie, or ethereum.
  • Secured against corruption and tampering via blockchain hash
  • If not entirely in-chain (bitcoin script, ethereum), processed by a network of oracles run by separate businesses/individuals.
This index algorithm architecture increases transparency and reduces the level of trust we place in any one organization or developer.  The level of peer review is greatly increased.  Auditing is a breeze.
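To make the idea concrete, here is a minimal sketch in Python of what such an open index algorithm might look like.  The tickers, prices, share counts, and divisor are all made up for illustration; the point is simply that the algorithm is ordinary source code, and a hash of that source is the commitment that gets timestamped into the blockchain.

# Minimal sketch of an open, auditable index algorithm (hypothetical data).
import hashlib
import inspect

def cap_weighted_index(quotes, divisor=1000.0):
    """Compute a capitalization-weighted index from {ticker: (price, shares)}."""
    market_cap = sum(price * shares for price, shares in quotes.values())
    return market_cap / divisor

# Example input data: closing prices and share counts (made-up numbers).
quotes = {
    "AAA": (101.25, 5_000_000),
    "BBB": (48.10, 12_000_000),
    "CCC": (250.00, 1_500_000),
}

index_value = cap_weighted_index(quotes)

# Hash the algorithm's source text; anyone holding this hash can later verify
# that the code used to produce the published index value was not altered.
algo_hash = hashlib.sha256(inspect.getsource(cap_weighted_index).encode()).hexdigest()

print("index value:", round(index_value, 2))
print("algorithm hash (blockchain anchor):", algo_hash)

Publishing the algorithm hash, rather than trusting a spreadsheet inside one firm, is what makes the peer review and auditing described above possible.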

To further decentralize and reduce trust required in the index algorithm, the index's input data is now considered.  The collection of raw data becomes a key act in a decentralized world.

Raw field data is collected by data sensors, and securely stored in the blockchain:  Stock price data, climate station weather data, air pollution data, and more.  Hashing and merkle trees are used to aggregate large volumes of data into small, secure blockchain anchors.
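As a rough illustration of that aggregation step, the sketch below folds a batch of raw records into a single merkle root suitable for timestamping.  The records are hypothetical, and the pair-and-duplicate rule mirrors bitcoin's merkle construction, though any consistent scheme would do.

# Sketch: aggregate many raw data records into one small blockchain anchor.
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a list of raw records into a single root hash."""
    if not leaves:
        raise ValueError("no data to anchor")
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last hash on odd levels,
            level.append(level[-1])      # as bitcoin's merkle construction does
        level = [sha256(left + right) for left, right in zip(level[::2], level[1::2])]
    return level[0]

# Hypothetical raw records: one line per sensor reading or stock quote.
records = [b"2015-09-29T12:01,AAA,101.25",
           b"2015-09-29T12:01,BBB,48.10",
           b"2015-09-29T12:01,CCC,250.00"]

print("merkle root (anchor to timestamp):", merkle_root(records).hex())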

The software and actors that collect the raw data and securely store it in the blockchain are fact generators.  Fact generators are the second half of a decentralized financial index.  Their role is best illustrated with some examples, and is central to the security of the entire system.

Creating an index such as the S&P 500 requires building a secure digital loop for publishing its data and algorithms:
  • NYSE and NASDAQ publish digitally-signed intraday or closing prices, hashed into the blockchain.  Publish this hash in the New York Times and Wall Street Journal stock sections, too!  NYSE and NASDAQ play the role of fact generators, here.
  • Standard and Poor's publishes a digitally-signed S&P 500 algorithm, hashed into the blockchain.
  • Any bank, government agency, individual or machine-based agent may then independently generate the S&P500 index at any time, secured against tampering, with two simple pieces of information:  The hash of the algorithm, and the hash of the data summary.
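A minimal sketch of that independent verification step, assuming the two anchors have already been read out of the blockchain (all names and data here are hypothetical):

# Sketch: any third party re-checks the algorithm and data against the two
# published anchors before trusting a regenerated index value.
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def independently_verify(algo_source: str, data_summary: bytes,
                         published_algo_hash: str, published_data_hash: str) -> bool:
    """True only if both the algorithm and the input data match the anchors."""
    algo_ok = sha256_hex(algo_source.encode()) == published_algo_hash
    data_ok = sha256_hex(data_summary) == published_data_hash
    return algo_ok and data_ok

# Usage: fetch algo_source from S&P's published repository, data_summary from
# the exchanges' signed feeds, and the two hashes from the blockchain.  If
# independently_verify(...) returns True, running the algorithm on the data
# reproduces "the S&P 500 closing price" with no trust in any single party.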
Creating an ETF, then, becomes a second layer of decentralized algorithms which trigger trades in an ETF's primary markets.  ETFs can exist and be run 100% human-free.  With bitcoin as the value token, both the stock price and the value exist on the blockchain as digitally provable values, ensuring an autonomous agent or DAC can prove with 100% certainty that certain trades should/should not be executed.

Another example is measuring air quality or climate data, a dataset perhaps more subject to manipulation (or accusations thereof).  One can imagine
  • 1st layer: A network of Beijing air quality sensors or US-based climate temperature sensors securely timestamps their data into the blockchain.
  • 1st layer: Satellite infrared and smog imagery is securely hashed into the blockchain.
  • 1st layer: Bitcoin/USD exchange rate data is digitally signed by each bitcoin exchange, and securely hashed into the blockchain.
  • 2nd layer: 10 governments and NGOs around the world publish their assessments of this data - and the models/algorithms used to achieve the assessments.
  • 3rd layer:  IMF and other agencies run automated agents which transfer bitcoin value based on the pollution/climate assessments, modified by bitcoin/USD exchange rate to eliminate volatility.
In this example, the fact generators - air quality sensors - are mostly untrusted.  A 2nd layer of software - also fact generators, generating derivative facts - achieves a consensus or quorum over untrusted data.  The 3rd layer of software then acts upon that quorum of derived facts.
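As an illustration of that 2nd layer, the sketch below derives a consensus fact from mostly-untrusted sensor readings.  The quorum size and the use of a median are illustrative choices on my part, not something the layers above prescribe.

# Sketch: a 2nd-layer fact generator deriving a consensus reading from
# mostly-untrusted 1st-layer sensors (hypothetical data).
from statistics import median

def derive_consensus_fact(readings: dict[str, float], quorum: int = 7) -> float:
    """Return a derived fact (median reading) if enough sensors reported."""
    if len(readings) < quorum:
        raise ValueError(f"only {len(readings)} of {quorum} required readings")
    # The median discards outliers from faulty or manipulated sensors.
    return median(readings.values())

# Hypothetical air-quality readings (PM2.5, ug/m^3) from Beijing sensors.
readings = {f"sensor-{i:02d}": value
            for i, value in enumerate([88, 91, 87, 90, 350, 89, 92, 88])}

print("derived fact:", derive_consensus_fact(readings))
# The derived fact (and the readings behind it) would itself be hashed and
# anchored in the blockchain, ready for 3rd-layer agents to act upon.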

In a decentralized world, the gathering of raw data, signing, hashing and synthesizing it - fact generation - becomes the key act upon which software will automatically trigger further actions - including real world actions such as hiring humans, moving shipping containers from point A to B, delivering groceries and more.

Decentralized software - using secured digital facts, running on blockchains (bitcoin, ethereum) or networks of oracles - will form an ecosystem that makes the entire world operate on a more transparent, more efficient, less corruptible basis.

The essence of smart contracts is executing a series of actions (and inactions) based on computer processing of digital facts.
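In sketch form, with hypothetical facts and actions:

# Sketch of that essence: a contract as a pure function from verified digital
# facts to actions -- or deliberate inaction.
from typing import Optional

def pollution_contract(pm25: float, threshold: float = 150.0) -> Optional[str]:
    """Process a verified fact and decide on an action, or no action at all."""
    if pm25 > threshold:
        return "transfer 10 BTC to the mitigation fund"
    return None          # inaction is a legitimate outcome of the contract

print(pollution_contract(89.5))    # -> None (no action)
print(pollution_contract(310.0))   # -> "transfer 10 BTC to the mitigation fund"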

Tuesday, December 16, 2014

Open development processes and reddit kerkluffles

It can be useful to review open source development processes from time to time.  This reddit thread[1] serves both as a case study and as an introduction to OSS process for newcomers.
[1] http://www.reddit.com/r/Bitcoin/comments/2pd0zy/peter_todd_is_saying_shoddy_development_on_v010/


Dirty Laundry

When businesses or commercial software projects are being built, outsiders typically hear little about the internals of project development.  The public only hears what the companies release, which is prepped and polished.  Internal disagreements, schedule slips, and engineer fistfights are all unseen.

Open source development is the opposite.  The goal is radical transparency.  Inevitably there is private chatter (0day bugs etc.), but the default is openness.  This means that it is normal practice to "air dirty laundry in public."  Engineers will disagree, sometimes quietly, sometimes loudly, sometimes rudely and with ad hominem attacks.  On the Internet, there is a pile-on effect, where informed and uninformed supporters add their 0.02 BTC.

Competing interests cloud the issues further.  As a technology matures, engineers are typically employed by organizations.  Those organizations have different strategies and motivations, and they will sponsor work they find beneficial.  Sometimes those orgs are non-profit foundations, sometimes for-profit corporations.  Sometimes that work is maintenance ("keep it running"), sometimes that work is developing new, competitive features that the company feels will give it a better market position.  In a transparent development environment, all parties are hyperaware of these competing interests.  Internet natterers painstakingly document and repeat every conspiracy theory about Bitcoin Foundation, Blockstream, BitPay, various altcoin developers, and more as a result of these competing interests.

Bitcoin and altcoin development adds an interesting new dimension.  Sometimes engineers have a more direct conflict of interest, in that the technology they are developing is also potentially their road to instant $millions.  Investors, amateur and professional, have direct stakes in a certain coin or coin technology.  Engineers also have an emotional stake in technology they design and nurture.  This results in incentives where supporters of a non-bitcoin technology work very hard to thump bitcoin.  And vice versa.  Even inside bitcoin, you see "tree chains vs. side chains" threads of a similar stripe.  This can lead to a very skewed debate.

That should not distract from the engineering discussion.  Starting from first principles, Assume Good Faith[2].  Most engineers in open source tend to mean what they say.  Typically they speak for themselves first, and their employers value that engineer's freedom of opinion.  Pay attention to the engineers actually working on the technology, and less attention to the noise bubbling around the Internet like the kindergarten game of grapevine.
[2] http://en.wikipedia.org/wiki/Wikipedia:Assume_good_faith

Being open and transparent means engineering disagreements happen in public.  This is normal.  Open source engineers live an aquarium life[3].
[3] https://www.youtube.com/watch?v=QKe-aO44R7k


What the fork?

In this case, a tweet suggests consensus bug risks, which reddit account "treeorsidechains" hyperbolizes into a dramatic headline[1].  However, the headline would seem to be the opposite of the truth.  Several changes were merged during 0.10 development which move snippets of source code into new files and new sub-directories.  The general direction of this work is creating a "libconsensus" library that carefully encapsulates consensus code in a manner usable by external projects.  This is a good thing.

The development was performed quite responsibly:  Multiple developers would verify each cosmetic change, ensuring no behavior changes had been accidentally (or maliciously!) introduced.  Each pull request receives a full multi-platform build + automated testing, over and above individual dev testing.  Comparisons at the assembly language level were sometimes made in critical areas, to ensure zero before-and-after change.  Each transformation gets the Bitcoin Core codebase to a more sustainable, more reusable state.

Certainly zero-change is the most conservative approach. Strictly speaking, that has the lowest consensus risk.  But that is a short term mentality.  Both Bitcoin Core and the larger ecosystem will benefit when the "hairball" pile of source code is cleaned up.  Progress has been made on that front in the past 2 years, and continues.   Long term, combined with the "libconsensus" work, that leads to less community-wide risk.

The key is balance.  Continue software engineering practices -- like those just mentioned above -- that enable change with the least consensus risk.  Part of those practices is review at each step of the development process:  social media thought bubble, mailing list post, pull request, git merge, pre-release & release.  It probably seems chaotic at times.  In effect, git[hub] and the Internet enable a dynamic system of review and feedback, where each stage provides a check-and-balance for bad ideas and bad software changes.  It's a human process, designed to acknowledge and handle that human engineers are fallible and might make mistakes (or be coerced/under duress!).  History and field experience will be the ultimate judge, but I think Bitcoin Core is doing well on this score, all things considered.

At the end of the day, while no change is without risk, version 0.10 work was done with attention to consensus risk at multiple levels (not just short term).


Technical and social debt

Working on the Linux kernel was an interesting experience that combined git-driven parallel development and a similar source code hairball.  One of the things that quickly became apparent is that cosmetic patches, especially code movement, were hugely disruptive.  Some even termed them anti-social.  To understand why, it is important to consider how modern software changes are developed:

Developers work in parallel on their personal computers to develop change XYZ, then submit their change "upstream" as a github pull request.  Then time passes.  If code movement and refactoring changes are accepted upstream before XYZ, then the developer is forced to update XYZ (typically trivial fixes), re-review XYZ, and re-test XYZ to ensure it remains in a known-working state.

Seemingly cosmetic changes such as code movement have a ripple effect on participating developers, and on the wider developer community.  Every developer who is not immediately merged upstream must bear the costs of updating their unmerged work.

Normally, this is expected.  Encouraging developers to build on top of "upstream" produces virtuous cycles.

However, a constant stream of code movement and cosmetic changes may produce a constant stream of disruption to developers working on non-trivial features that take a bit longer to develop before going upstream.  Trivial changes become encouraged, and non-trivial changes face a binary choice of (a) be merged immediately or (b) bear added re-base, re-view, re-test costs.

Taken over a timescale of months, I argue that a steady stream of cosmetic code movement changes serves as a disincentive to developers working with upstream.  Each upstream breakage has a ripple effect on all developers downstream, and imposes some added chance of newly introduced bugs on downstream developers.  I'll call this "social debt", a sort of technical debt[4] for developers.
[4] http://en.wikipedia.org/wiki/Technical_debt

As mentioned above, the libconsensus and code movement work is a net gain.  The codebase needs cleaning up.  Each change however incurs a little bit of social debt.  Life is a little bit harder on people trying to get work into the tree.  Developers are a little bit more discouraged at the busy-work they must perform.  Non-trivial pull requests take a little bit longer to approve, because they take a little bit more work to rebase (again).

A steady flow of code movement and cosmetic breakage into the tree may be a net gain, but it also incurs a lot of social debt.  In such situations, developers find that tested, working out-of-tree code repeatedly stops working during the process of trying to get that work in-tree.  Taken over time, it discourages working on the tree.  It is rational to sit back, not work on the tree, let the breakage stop, and then pick up the pieces.


Paradox Unwound

Bitcoin Core, then, is pulled in opposite directions by a familiar problem.  It is generally agreed that the codebase needs further refactoring.  That's not just isolated engineer nit-picking.  However, for non-trivial projects, refactoring is always anti-social in the short term.  It impacts projects other than your own, projects you don't even know about. One change causes work for N developers.  Given these twin opposing goals, the key, as ever, is finding the right balance.

Much like "feature freeze" in other software projects, developing a policy that opens and closes windows for code movement and major disruptive changes seems prudent.  One week of code movement & cosmetics followed by 3 weeks without, for example.  Part of open source parallel development is social signalling:  Signal to developers when certain changes are favored or not, then trust they can handle the rest from there.

While recent code movement commits themselves are individually ACK-worthy, professionally executed and moving towards a positive goal, I think the project could strike a better balance when it comes to disruptive cosmetic changes, a balance that better encourages developers to work on more involved Bitcoin Core projects.

Friday, December 12, 2014

Survey of largest Internet companies, and bitcoin

Status report: Internet companies & bitcoin

Considering the recent news of Microsoft accepting bitcoin as payment for some digital goods, it seemed worthwhile to make a quick status check.  Wikipedia helpfully supplies a list of the largest Internet companies.  Let's take that list on a case-by-case basis.

Amazon.  As I blogged earlier, it seemed likely that Amazon would be a slower mover on bitcoin.

Google.  Internally, there is factional interest.  Some internal fans, some internal critics.  Externally, very little.  Eric Schmidt has said good things about bitcoin.  Core developer Mike Hearn worked on bitcoin projects with the approval of senior management.

eBay.  Actively considering bitcoin integration.  Produced an explainer video on bitcoin.

Tencent. Nothing known.  Historical note:  Tencent, QQ, and bitcoin (CNN)

Alibaba.  Seemingly hostile, based on government pressure.  "Alibaba bans Bitcoin"

Facebook.  Nothing known.

Rakuten.  US subsidiary accepts bitcoin.

Priceline.  Nothing known.  Given that competitors Expedia and CheapAir accept bitcoin, it seems like momentum is building in that industry.

Baidu.  Presumed bitcoin-positive.  Briefly flirted with bitcoin, before government stepped in.

Yahoo.  Nothing known at the corporate level.  Their finance product displays bitcoin prices.

Salesforce.  Nothing known.  Third parties such as AltInvoice provide bitcoin integration through plugins.

Yandex. Presumed bitcoin-positive.  They launched a bitcoin conversion tool before their competitors.  Some critics suggest Yandex Money competes with bitcoin.

By my count, 6 out of 12 of the largest Internet companies have publicly indicated some level of involvement with bitcoin.

Similar lists may be produced by looking at the largest technology companies, and excluding electronics manufacturers.  Microsoft and IBM clearly top the list, both moving publicly into bitcoin and blockchain technology.


Wednesday, November 5, 2014

Prediction: GOP in 2014, Democrat WH in 2016

Consider today's US mid-term election results neutrally:

  • When one party controls both houses of Congress, that party will become giddy with power and over-reach.
  • When one party reaches minority status in both houses, that party resorts to tactics it previously condemned ("nuclear option").
  • It gets ugly when one party controls Congress, and another party controls the White House.


The most recent example is Bush 43 + a Democratic Congress, but that is far from the only one.

Typical results from this sort of situation:
  • An orgy of hearings.
  • A raft of long-delayed "red meat for the base" bills will be passed in short order.
  • Political theatre raised one level:  Congress will pass bills it knows the President will veto (and knows cannot achieve a veto override).
  • A 2-house minority party becomes the obstructionist Party Of No.
A party flush with power simply cannot resist over-reach.  Democrats and Republicans both have proven this true time and again.

As such, we must consider timing.  GOP won the mid-terms, giving them two years to over-reach before the 2016 general election.  Voters will be tired of the over-reach, and the pendulum will swing back.

Predicted result:  Democrats take the White House in 2016.

If the 2014 mid-term elections had been the 2016 election, we would be looking at a full sweep, with GOP in House, Senate and White House.

Losing could be the best thing Democrats did for themselves in 2014.

P.S. Secondary prediction:  ACA will not be repealed.  ACA repeal bill will be voted upon, but will not make it to the President's desk.