Sept 19 2022
Colour Time Verse
Two new Colour Time works I’ve been developing for an upcoming show with Verse in London.
Colour Time Sync is a 20-minute colour animation that uses clock time to synchronize the work for all viewers, allowing viewers in the gallery and online to inhabit the same temporal frame.
Colour Time Generative is derived from the same colour data as Colour Time Sync. Instead of animating through time, Generative draws this timeline as a gradient. The horizontal width of this gradient is infinitely variable; as pieces are minted, the gradient expands. Each buyer owns a section of a continuum, which is constantly shifting until the sale closes.
Below are some outputs from the Generative system, which will be on display in the gallery. The numbers in the titles indicate the token ID and hypothetical collection size.
In a standard generative release, the decisive moment occurs during minting, a quick roll of the dice determining the output. In Colour Time Generative, the seed of randomness is the market as a collective force, turning not just the minter but also the market into co-creators of the work.
The output of a generative work derives value from its relation to the larger generative series - rarity and aesthetics are judged in relation to the collection as a whole. This generative series embodies that relationship, with each piece standing in direct relation to the next.
https://verse.works/exhibitions/colour-time
Apr 5 2022
Colour Time Development
Colour Time is a series of twelve on-chain animated SVGs created during a motorcycle trip from Montreal to Los Angeles in November of 2021.
The origins of this series lie in earlier explorations of colour, notably Colour Calendar and colour-time.com. Colour Calendar explores the effect of relative contrast on colour perception and our emotional relationship to colour.
colour-time.com explores complementary colour after-images, presenting a slowly shifting pane of colour. Upon viewing, the eye becomes saturated and begins to generate a complementary colour after-image, which is then met or challenged by the on-screen colour.
These on-chain NFTs are the third iteration of this exploration, combining the effects of colour relativity and complementary colour afterimages. Named for the location in which they were created, they attempt to express the impossibility of capturing and relaying a day's driving, expressing instead a slice of colour and rhythm.
Each work consists of two planes of colour, each shifting between two points of colour. Each plane shifts at a different speed - e.g. a 20-second loop for the background and a 13-second loop for the foreground - which allows complexity to build, different hues coming into contact as the planes phase in and out of sync.
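The phasing effect can be sketched in a few lines. This is a hypothetical illustration rather than the actual SVG animation code - the RGB endpoints are invented, only the 20-second and 13-second loop lengths come from the description above:

```python
from math import lcm

def plane_colour(t, period, c0, c1):
    """Colour of one plane at time t, looping between c0 and c1 over `period` seconds.

    The mix runs 0 -> 1 -> 0 across one loop (a triangle wave), so the
    plane shifts smoothly from one point of colour to the other and back.
    """
    phase = (t % period) / period     # position within the loop, 0..1
    mix = 1 - abs(2 * phase - 1)      # triangle wave: 0 -> 1 -> 0
    return tuple(round(a + (b - a) * mix) for a, b in zip(c0, c1))

# Two planes with different loop lengths drift in and out of sync;
# a 20 s background and a 13 s foreground only return to the same
# relative alignment every lcm(20, 13) = 260 seconds.
bg = plane_colour(65.0, 20, (20, 40, 200), (200, 60, 40))    # invented colours
fg = plane_colour(65.0, 13, (240, 200, 60), (60, 200, 180))  # invented colours
print(lcm(20, 13))  # 260
```

The mismatched periods are what generate the complexity: the combined pattern only repeats every 260 seconds, far longer than either loop alone.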
For the full experience, these are best viewed in fullscreen mode, which can be accessed at http://colourtime.jonathanchomko.com/.
Sale opens Thursday April 14 at 10am PDT, 1pm EDT, 5pm GMT. Tokens are priced at 1 ETH.
Token, Hash
To buy an NFT is to pay to put your wallet address beside a number in a distributed database. This database becomes the point from which all the social and emotional dynamics of ownership emerge, yet the link between this distributed database and its visual representation can be quite tenuous.
CryptoPunks attempt to solidify this link between ID and image by embedding an encoded version of all the punks in their contract. This image circulates freely online, but the authenticity of any image can be verified by entering its data into a cryptographic hash function and comparing the output to the hash encoded in the contract.
Artists such as Deafbeef have taken further steps to strengthen the link between token ID and artwork, by encoding the parameters for each audio-visual artwork on-chain, and embedding the scripts used to generate the work on each transaction. This provides the collector with all the code necessary to recreate the artwork, should the original render be lost or damaged.
Projects such as Loot are considered fully ‘on-chain’, storing all data and generating the visual components of the artwork directly on the blockchain. This removes the need for a collector to run parameters through scripts, but introduces new challenges.
Randomness is key to generative work, but typical random functions return different values each time they are called. If these functions were used in on-chain generative projects, the image for a given token ID would change each time it was viewed.
To solve this issue, artists working on-chain generate their random values deterministically. Deterministic number generation relies on the same kind of cryptographic hash function that CryptoPunks used to encode their image of all punks: provided with a consistent input, the function returns a consistent output. Importantly, though, any small change in the input will result in a wildly, unpredictably different output.
In the case of Loot, randomness is obtained by feeding a hash function a keyword in combination with the token ID. The hash function will return the same value each time WEAPON56 is input, but will give an unpredictably different value if WEAPON57 is entered.
The number output by the hash function is very long; to become useful as a selector for a list of weapons, the hash is divided by the number of items in the list, and the remainder is used as an index into that list. This approach can be applied to different lists of varying lengths, a single hash branching into a new random value for each list.
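As a rough illustration of the keyword-plus-ID scheme - not Loot's actual Solidity, which uses keccak256 on-chain; sha256 stands in here, and the weapon list is abridged:

```python
import hashlib

WEAPONS = ["Warhammer", "Quarterstaff", "Maul", "Mace", "Katana"]  # abridged list

def pluck(token_id, key, items):
    """Deterministically select an item for a token.

    Stable input -> stable output: hashing the same string always yields
    the same (very long) number, and the remainder indexes into the list.
    """
    digest = hashlib.sha256(f"{key}{token_id}".encode()).hexdigest()
    rand = int(digest, 16)            # the hash as one very long integer
    return items[rand % len(items)]   # remainder selects from the list

# WEAPON56 always maps to the same weapon; WEAPON57 lands somewhere
# unpredictably different, even though the inputs differ by one.
assert pluck(56, "WEAPON", WEAPONS) == pluck(56, "WEAPON", WEAPONS)
```

Running `pluck` with a different key ("CHEST", "HEAD", ...) against a different list reuses the same mechanism, each keyword branching the token ID into a fresh stable value.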
Autoglyphs, Loot, Artblocks et al. each input different values to their deterministic generation functions, but the central concept remains the same: use a stable input to return a stable, random value.
Images for on-chain NFT projects are drawn anew each time they are requested, all randomness growing from the token ID and its hashed value.
In Token Hash, these two values, which originate the complex structure of the generative on-chain NFT, are laid bare. The hash is rounded to the size of the collection, allowing these two numbers to express the core characteristics of the on-chain generative NFT: rarity, symmetry and beauty.
Token Hash public sale opens Thursday Oct 7 at 10 am EST.
1000 tokens will be available for sequential minting, at a price of 0.02 ETH each.
http://tokenhash.jonathanchomko.com/
https://opensea.io/collection/token-hash
Sept 2021
Proof of Work Origins
“An interface is not just a portal for access, but a designed extension of the body that then designs the body in reverse” Rachel Ossip, N+1, 2018
Much of my work is concerned with the relationship between physical and digital worlds; how software reaches into and manipulates the world, and how expression or gesture is modulated as it enters the digital.
In www.grindruberairbnb.exposed (GUA), a group of participants are led to gesture and move via web-based interfaces on their mobile phones. The project attempts to make evident power dynamics between system creators and system users, by providing users with interactions so limited in scope that they require specific gestures to complete.
Proof of Work came from the idea of turning these systems of software guidance on myself. The first piece of software I created was a manual image-generation program which required the entry of 10,000 values to fill a 100x100 pixel image. Scaled down from the broad gestures of GUA, this software induced the small-scale, repeated gesture of a keypress.
My first explorations with this application were to test how well I could generate random values. Randomness is famously hard to generate even for a computer. Rafael Lozano-Hemmer has a great artwork on this topic titled Method Random, which visualizes the patterns that occur as randomly generated sequences scale.
Producing 10,000 random values took around 30 minutes. I posted the manually generated image alongside a randomly generated reference and asked viewers to guess which was which.
![Human]()
![Computer]()
Most responders thought that the computer-produced image was mine, though those with more computer experience guessed correctly. After this exploration, I was curious to see how different people might generate different images, and asked some friends to produce their own ‘portraits’ through the software.
It was only after these explorations that I entered the world of NFTs. I felt a strong drive to participate, but the raw speculative nature of the market felt off-putting. Not wanting to sell my friends’ productions, I generated a series of five 100x100 random images over one week.
The images show an interesting progression: pattern uniformity starting strong but dipping on Wednesday, with consistency returning Thursday and Friday.
Around this time, Beeple’s ‘The First 5000 Days’ sold for a record-breaking $69,346,250. Buyer MetaKovan explained his rationale for the purchase as follows:
“When you think of high-valued NFTs, this one is going to be pretty hard to beat. And here’s why — it represents 13 years of everyday work. Techniques are replicable and skill is surpassable, but the only thing you can’t hack digitally is time. “ — MetaKovan, Christies Press Release
This assumption that one metric can be used to determine the value of an artwork seemed a perfect encapsulation of the speculative tendencies of the market, and so I set out to challenge this assumption by embodying it directly.
Adopting Beeple’s production pace, I generated one image per day. To provide varying levels of effort for the market to speculate on, I began each series with a pixel canvas of 1x1 and doubled it each day. A series would end when I could no longer complete one image in a day, my physiological limitations ensuring the scarcity of the series.
While producing these images I was reminded of keystroke dynamics, a field of behavioural biometrics which studies typing rhythm and timing. Researchers have found that the rhythm and pattern of keystrokes are unique to each user, with the potential to replace passwords as an authentication method.
“[...] typing is a motor programmed skill and [...] movements are organized prior to their actual execution. Therefore, a person’s typing pattern is a behavioral characteristic that develops over a period of time and therefore cannot be shared, lost or forgotten.” Bannerjee and Woodward, Journal of Pattern Recognition
If we interpret the patterns that appear in the image as a visual representation of this gestural biometric, the images transform from records of effort to minimum viable artworks, the hand of the artist made visible in the digital image.
https://proofofwork.jonathanchomko.com/
https://opensea.io/collection/proof-of-work-v1
Sept 9 2021
Took most of August off, biked from Montreal to New York City to visit friends. The trip was a real exercise in listening to small nudges, following through on little insights. Ended up in Provincetown for a week and a half, meeting people and hanging out on the beach.
Back in Montreal, happy to be home and back at work. Applied to Mars College this morning, excited to see what people I might meet there. Have always been interested in desert living. If they’ll have me, I’d like to get a motorcycle and drive down before the winter fully takes.
There has been new interest in Proof of Work. Blue Duration is basically sold out - three of the last four I minted to my wallet are sold, and I’m holding on to the 1x1 as a sort of artist’s proof. I feel that those are the most emblematic of the project, a single gesture.
I’ve been developing Red Pressure, and am reminded of what artistic work is: a slow refinement of an idea, a pushing away of the fear that the idea is not valid or interesting, a heeding of the desire for refinement.
Red Pressure maps the pressure of a touchscreen tap to intensity of colour. Originally I wanted to use the trackpad on my macbook, for visual continuity in the documentation. I wrote up an application that received trackpad pressure information, but in testing the trackpad revealed itself not to deliver very consistent values.
I experimented with force sensing resistors, but these also had their issues, and aesthetically they departed from the visual narrative of human / computer interfaces.
I was looking around to see if I could calibrate the trackpad and came across http://touchscale.co/, a website which uses a force-touch capable iPhone to give quite accurate weight estimations for capacitive objects.
I found an OSC controller app with 3D Touch capabilities (Syntien). Fully editable, and it sends granular touch pressure data. There is a slight delay in receiving the values using this approach, but the pressure readings are much more reliable.
Colour always takes longer than I think. In Blue Duration I was just using a single colour value, multiplying it by the elapsed time between keypresses, which resulted in a light/dark modulation of the blue.
Modulating red in this way results in muddy shades which I didn’t like, and so instead I’m modulating between two shades of red, a brighter/pinkish hue and a deeper red.
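A minimal sketch of this mapping, with hypothetical RGB values standing in for the two reds and pressure normalised to 0..1:

```python
def pressure_to_red(pressure, deep=(180, 30, 60), bright=(255, 90, 130)):
    """Map a normalised tap pressure (0..1) to a colour between two shades of red.

    Interpolating between two fixed endpoints avoids the muddy tones that
    come from simply scaling a single red value, as elapsed time scaled
    the blue in Blue Duration. The RGB endpoints here are invented.
    """
    p = max(0.0, min(1.0, pressure))  # clamp out-of-range sensor values
    return tuple(round(d + (b - d) * p) for d, b in zip(deep, bright))

print(pressure_to_red(0.0))  # the deeper red
print(pressure_to_red(1.0))  # the brighter, pinkish red
```

Clamping matters here because raw pressure readings from the sensor can spike outside the expected range.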
The variation between taps is less than it was in Blue Duration, and I like how the differences in the pixels are almost imperceptible. There seems to be less of a banding effect in the early tests I’ve done, and a more scattered visual effect.
I’m aiming to start production of Red Pressure on Monday Sept 13, and have them up for sale by Sept 24, depending on how long the series runs for.