2019's Issues

Markov's Fanfiction

A few years ago, I used a homebrew markov lib and a Fanfiction.net scraper to write "Markov's Fanfics". It wasn't good (by any stretch of the imagination), but it was funny, and a nice project.

For NaNoGenMo this year, I'm going to revisit this idea. Using a decent markov library (4th order markov chains ftw) and more fanfictions than before, I hope to generate a novel-length fanfiction.
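
Roughly, the core loop is small; a minimal sketch below, assuming a plain-text corpus file (path is a placeholder) and the markovify library, whose state_size parameter gives the 4th-order chains mentioned above (the issue doesn't commit to a specific library).

```python
import markovify

# Load the scraped fanfiction corpus (file name is hypothetical).
with open("fanfics.txt", encoding="utf-8") as f:
    corpus = f.read()

# state_size=4 gives 4th-order chains.
model = markovify.Text(corpus, state_size=4)

# Emit sentences until the 50,000-word target is passed.
words = 0
while words < 50_000:
    sentence = model.make_sentence(tries=100)
    if sentence:
        print(sentence)
        words += len(sentence.split())
```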

Shelter Should Be the Essential Look of Any Dwelling

Title derived from a Frank Lloyd Wright (FLW) quote; one of his many dubious, showy aphorisms.

This project leverages the concept of "Shape Grammar" and the work of Frank Lloyd Wright to attempt a kind of computational architecture and criticism. This centers on reading FLW's work as a kind of shape grammar of "exploded boxes."

"Shape Grammar," a concept surfaced by George Stiny and James Gips in their 1971 work "Shape grammars and the generative specification of painting and sculpture," proposes alternate, geometric grammars for describing building plans. Indebted to Friedrich Froebel's "gifts," an undeniable and admitted influence on FLW's part, this approach to creating a descriptive geometric grammar may have applications in a lexical context (i.e. an approach to generative text) not unlike those of the Markovian or NLP approaches.

Or, to really stir things up as a kindred spirit to @cpressey (#11), I might flip some tables and say that individual shapes are words.

Key question: can "shape grammar" integrate into current methods for procedural generation, or propose new ones?

(Frankly--pun intended--I have no idea if the above is true or means anything. It's November, which means it's time for trying stuff.)

Consulted

This section will be updated regularly.

  • Ada Louise Huxtable. 2018. Frank Lloyd Wright: A Life. New York: Penguin Random House.
  • Frank Lloyd Wright. 1943. An Autobiography. New York: Duell, Sloan and Pearce.
  • George Stiny. 1980. "Kindergarten grammars: designing with Froebel's building gifts." Environment and Planning B.
  • George Stiny and James Gips. 1971. "Shape grammars and the generative specification of painting and sculpture." Information Processing.
  • Jack Quinan. 2006. Frank Lloyd Wright’s Larkin Building. Chicago: University of Chicago Press.
  • John Lloyd Wright. 1946. My Father Who Is on Earth. New York: G.P. Putnam's Sons.
  • Ju Hyun Lee, Michael J. Ostwald, Ning Gu. 2017. "A Combined Plan Graph and Massing Grammar Approach to Frank Lloyd Wright’s Prairie Architecture." Nexus Network Journal: Art and Mathematics.
  • T.W. Knight and George Stiny. 2015. "Making grammars: From computing with shapes to computing with things." Design Studies.

Time to shake the dust off

I'm going to give something a shot.

I thought it was in November, so I've missed a couple of days....

Intention to submit the Newer Testament

I'm going to be rolling with a project based on the Christian Bible. Tentative plan is a Markov model to string whole verses together, then use a thesaurus to mutate that result further. Probably something to fix up punctuation too.

Simple and straightforward, and a concept that can be applied to a wide range of texts (as long as I define a method for splitting them up).
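
A minimal sketch of that pipeline, assuming the text has already been split into one verse per line (the file name is a placeholder) and using NLTK's WordNet as the thesaurus; the punctuation fix-up pass isn't shown.

```python
import random
from collections import defaultdict

import nltk
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)

def build_chain(verses):
    """First-order chain over whole verses: verse -> verses that follow it."""
    chain = defaultdict(list)
    for a, b in zip(verses, verses[1:]):
        chain[a].append(b)
    return chain

def mutate(verse, rate=0.2):
    """Swap some words for a random WordNet synonym."""
    out = []
    for word in verse.split():
        syns = wordnet.synsets(word)
        if syns and random.random() < rate:
            out.append(random.choice(syns).lemmas()[0].name().replace("_", " "))
        else:
            out.append(word)
    return " ".join(out)

verses = open("kjv_verses.txt", encoding="utf-8").read().splitlines()
chain = build_chain(verses)
verse = random.choice(verses)
for _ in range(2000):
    print(mutate(verse))
    verse = random.choice(chain.get(verse, verses))
```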

Stretch goals:

  • Come up with a decent way of making book names.
  • Figure out a clean way of closing chapters. Maybe add a special "end-of-chapter" verse?
  • Identify names and reassign them consistently as a separate step.
  • Identify quotes. Attribute them to specific people. Make a different style of speech for each (eg prefer particular synonyms out of each synonym set).

https://github.com/dhasenan/newer_testament

Hyloe's Book of Games

I occasionally enjoy playing a weird old plastic 1970s abstract boardgame, where the rules take two sides of A4 to explain what could have been said in a paragraph, and you end up having to house rule around a load of ambiguities anyway. It's a nice challenge: both players are trying to work out a game they've never seen before from scratch, discovering its basic strategies and winning moves (if any), along with whether it's even worth playing the game at all. Since my local boardgame cafe only has a limited number of these types of games to get through, I decided I should write a script to generate rules of a similar calibre. They'd be less likely to actually work, but if you make that into part of the game (any player can pause the game and prove it unwinnable, to win), it might still be worth playing them out.

I never got around to making that script, but considering NaNoGenMo this year it struck me that it'd be good to make a 50,000-word book of fictional games in the style of Hoyle or Parlett, with each game having a little introduction and commentary. I might expand the concept to include card and dice games as well.

Constrained writing

Consider an exercise in constrained writing defined by the following axiom system N:

C1. The final written product contains at least 50,000 words.
C2. The final written product is believed by its author to be a novel.
C3. The writing of the product takes place during the calendar month of November.

(Players of the Glass Bead Game will immediately recognize this as a residulated finite premagma over 𝕌ₔ, but let's not get ahead of ourselves.)

Note that we can form a conservative extension N′ of N by adding an additional axiom:

C4. The writing of the product must be done by computer code acting in the capacity of ghostwriter.

Note also that N′ can be characterized by its basis of fudges:

  • What even is a word? (cf. "meow")
  • What even is 50,000 words? (cf. "junk meows")
  • What even is November? (cf. "cheating non-entry")
  • What even is a novel? (cf. "how do I generate PDF?")
  • What even is computer code? (cf. "language survey")

Now let us consider a non-standard model of N′ where any string of words may be considered a novel. This is equivalent (up to isomorphism) to restating C2 as:

C2′. The final written product is believed by its author to be a string of words.

At first blush C2′ appears to be a tautology. But is it really?

Yes. Yes it is. We can remove it entirely. But then, note that our finite premagma is no longer residulated! Is it not a famous open problem whether a non-residulated finite premagma over 𝕌ₔ contains any non-trivial members at all?

Thus we posit the existence of a higher-order constraint:

C0. Bring your own C2′.

This suggests the existence of several other higher-order constraints:

C(⁻1). Think up a C2′ that is interesting enough that you would want to see it through to the end, but simple enough that you could finish it in your spare time in 30 days.
C(⁻2). Have a greater-than-zero amount of spare time in the 30 days constituting the calendar month of November.

It perhaps goes without saying that if C(⁻2) is not true, none of the other axioms in N or N′ can be fulfilled.

But if it does go without saying, then why am I saying it?

A catalog of limit objects

I recently started reading Mark Fisher's "The Weird and the Eerie" & it occurred to me that, when they are sufficiently specific, classifications for horror imagery can be used as the basis for a generator. Not just that but you can combine multiple different systems & (hopefully) reliably produce descriptions of objects or scenes that are memorable and resonant.

To this end, I figure, I'll pick out a collection of attributes of horror imagery that are amenable to assembling from a list or a simple ontology: weird, eerie, unheimlich, sublime, and gross.

  • weird here comes from Fisher, though his definition is similar to Mieville's -- it's something that is out of place in such a way that it implies that the reader's ideas about how the world works need to be revised. An easy way to introduce the weird is by juxtaposition, and this is what I'll rely on (rather than being clever). One way of doing this is to use an ontology & find a specifically unrelated item to introduce, but for the moment I'd like to just expect that the other attributes (since they're not constrained by ontology) will introduce such juxtaposition.

  • eerie, per Fisher, is the implication of unknowable agency. This is why people wearing masks are creepy, for instance. A straightforward way to introduce the eerie is with a corpus of strings stating or implying unknown or unknowable intent: "created for some horrible unknowable purpose" (per Lovecraft), "abandoned without warning", "waiting for a sign", "its movements hidden behind a translucent screen", "silently observing you", "of meticulous design"...

  • unheimlich -- from Freud's famous essay, and I will just use the attributes he gathers under the heading (doubling, multiplication, simulacra). I won't use 'amputated or missing a vital part' because that would require a much more detailed ontology.

  • sublime -- I will flatten this down to extremes of age or size. This gives me free rein to introduce nice Lovecraftian words like antediluvian or cyclopean.

  • gross -- based on my (probable mis)understanding of Lacan's idea of 'jouissance', but also Stephen King's idea that "if you can't scare them, gross them out". I'll just use a list of materials that are gross when taken out of their normal context -- excrement, hair, rotting meat, mucus, fingernails, teeth, earwax.

I think if we take some element as a base, we can add these other attributes on with only a minimum of intelligence.

For instance, the Mary Celeste is weird/eerie because its abandonment is inexplicable (particularly in versions where 'there is an untouched meal on the table', which isn't actually true of the real Mary Celeste): it is hard to imagine a sensible chain of reasoning that would lead to the crew abandoning a perfectly good boat (eerie -- because unknowable reasoning), and it is hard to imagine an external event that would make it sensible to do so (weird -- because it implies our view of the world is drastically and fatally incomplete even in such a mundane situation as 'being on a boat'), and the various 'rational' explanations for the abandonment of the Mary Celeste are therefore substantially more unsettling than ideas like 'they were teleported away by aliens'.

We can improve upon the Mary Celeste by adding these other attributes:

In a giant cave sealed for the past ten million years (sublime), there are two identical (unheimlich) ghost ships (eerie), complete with meal set on the table, coffee still warm, and tea cups full of human hair (gross).

Let's call an image like this a 'limit object' (as compared to 'limit experiences'), because it's maximally unsettling on the grounds that it includes all the types of unsettlingness.

If I can generate images like these, I can give each one a (large) item number, unique but out of order (in order to imply a large number of such items not in the catalogue). Such a catalogue is a reasonable NaNoGenMo 'novel', in line with the other collections, catalogues, and encyclopediae folks have generated. It might even be worth reading substantial portions of it, since (if this method works) the imagery will be compelling even when it is predictable.
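
As a rough illustration of how little machinery the attribute-stacking needs, here is a sketch with tiny stand-in corpora for each attribute; a real version would use the fuller lists and ontology described above.

```python
import random

# Tiny stand-in corpora for each attribute; the real lists would be far larger.
SUBLIME = ["in a giant cave sealed for the past ten million years",
           "beneath an antediluvian glacier"]
UNHEIMLICH = ["a doubled", "a perfect wax replica of a"]
EERIE = ["ghost ship, abandoned without warning",
         "lighthouse of meticulous design, silently observing you"]
GROSS = ["tea cups full of human hair", "floors slick with rotting meat"]

def limit_object():
    """Stack one pick from each attribute list onto a single image."""
    return (f"{random.choice(SUBLIME).capitalize()}, there is "
            f"{random.choice(UNHEIMLICH)} {random.choice(EERIE)}, "
            f"complete with {random.choice(GROSS)}.")

# Unique, non-sequential item numbers to imply a much larger catalogue.
for n in random.sample(range(10_000, 99_999), 5):
    print(f"Item #{n}: {limit_object()}")
```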

The Importance of Earnestly Being

I am working on generating a novel from the ground up:

  • first generate characters and their relationships to each other
  • then generate settings with objects that characters inhabit
  • then generate paragraphs made of actions and dialogue between characters
  • then generate scenes that have an overall goal (eg. affect character relationship in a certain way)
  • then generate plots made of scenes that build on each other

Everything is based on pre-written fragments that include randomly generated fragments or modifiers based on the target/subject, which allows expanding the corpus rapidly once the base rules are in place.
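
A toy sketch of the fragment idea: pre-written templates whose slots are filled from the target/subject characters, so new fragments and modifiers can be added without touching the expansion code. All names and fragments here are made-up placeholders, not the project's actual corpus.

```python
import random

# Hypothetical fragment tables; the project's real corpus would be much larger.
NAMES = ["Gwendolen", "Algernon", "Cecily"]
MOODS = ["earnest", "flippant", "wounded"]
LINES = ["I never travel without my diary", "The truth is rarely pure"]
ACTIONS = [
    "{a} glanced at {b} with a most {mood} expression.",
    '"{line}," said {a}, watching {b} carefully.',
]

def character():
    return {"name": random.choice(NAMES), "mood": random.choice(MOODS)}

def beat(a, b):
    """Expand one pre-written fragment, filling slots from its subjects."""
    return random.choice(ACTIONS).format(
        a=a["name"], b=b["name"], mood=a["mood"], line=random.choice(LINES))

alice, bob = character(), character()
for _ in range(5):
    print(beat(alice, bob))
```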

Repository: https://github.com/jdm/nanogenmo2019

File Explorer

Hi, while I participated in a few game jams, this will be my first one to output a written work.

I have several ideas and have not settled on one yet. The most promising would be a Choose Your Own Adventure style novel, where each page ends with a choice like: "If you want to follow the mischievous waiter, continue reading on page 132."

Update: I went with the story of a program exploring the files on my computer called "File Explorer".

The Orange Erotic Bible

Interleaving chapters of the King James Bible with generated erotic passages.
Plan is to use an adult domain fine-tuned GPT-2 or CTRL language model to continue bible chapters with erotic content.
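
A rough sketch of the continuation step with the Hugging Face transformers library (recent versions); "gpt2" here stands in for whatever adult-domain fine-tuned checkpoint the project ends up using, and the chapter path is a placeholder.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# "gpt2" is a placeholder for the fine-tuned checkpoint; the file path is hypothetical.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

chapter = open("kjv/genesis_01.txt", encoding="utf-8").read()
inputs = tokenizer(chapter[-1000:], return_tensors="pt")  # prompt on the tail of the chapter

outputs = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    max_new_tokens=400,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```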

Fictionalia

First time participant in NaNoGenMo (but long time noodler with Python).

Fictionalia started as a way of reusing code from a game project I am working on. Basically, it will be a guide to a fictional county. (Sort of midway between Emily Short's The Annals of the Parrigues and The Guardian's "Let's Move To..." property columns, both of which will be used for source text).

Large amounts of code already exist, so November will be spent bringing the text generation up to scratch. Initial approach will be using Python (2.7 - don't hate me!) and Markov chains, but may well stray into Neural Networks or Natural Language Toolkits over the next 30 days.

The final output will be a PDF file of a full book - front and back covers, illustrations, index, the works.

The Gambler

One of my favourite short novels is the Dostoyevsky work 'The Gambler'. I am going to try to write a procedurally generated novella that feels like reading a work of Dostoyevsky. The approach would be -- run some kind of program similar to the Sims -- with different characters each with various emotional and physical needs, except modelled after the characters in The Gambler. Run this simulation and see what happens. Then have a way of translating the results of the simulations into text that has the same literary feel as translated Dostoyevsky.
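
A toy sketch of the needs-driven simulation layer (the Dostoyevsky-style prose rendering is the hard part and isn't shown); the needs, names, and numbers are illustrative guesses, not anything from the project.

```python
import random

NEEDS = ["money", "pride", "love", "composure"]

class Character:
    def __init__(self, name):
        self.name = name
        self.needs = {n: random.uniform(0.3, 0.9) for n in NEEDS}

    def most_pressing(self):
        # The lowest-valued need is the one the character acts on.
        return min(self.needs, key=self.needs.get)

    def act(self):
        need = self.most_pressing()
        # Gambling is always an option, and usually a bad one.
        if need == "money" and random.random() < 0.7:
            delta = random.uniform(-0.4, 0.3)
            self.needs["money"] = max(0.0, self.needs["money"] + delta)
            self.needs["composure"] -= abs(delta) / 2
            return f"{self.name} went to the roulette table and {'won' if delta > 0 else 'lost'}."
        self.needs[need] = min(1.0, self.needs[need] + 0.2)
        return f"{self.name} sought out {need} and found a little of it."

cast = [Character(n) for n in ("Alexei", "Polina", "The General")]
for day in range(3):
    for c in cast:
        print(c.act())
```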

Romantic Folklore Generator

I'm planning to play around with the act structures of romance stories and combine them with topics from folklore. The goal is to create code that could generate a few thousand words of a short story and build an anthology, either based on the same initial seed theme or grouped by the sexuality decided for the characters during generation.

Corporation list

I am planning a catalogue of fictional corporations. Hopefully something akin to a Yellow Pages, with long and short adverts and spelling mistakes and bad graphics and all the associated stuff that goes with them.

It's partly an excuse to learn how to make printable websites with CSS and cut a lot out of my publishing workflow. :)

I'm in.

I was given the idea to combine authors with similar names to create new authors like "CS Lewis Carrol", similar to the Before & After game on Wheel of Fortune. Given this pivots on a first name being the same as another person's last name, perhaps it'd be easy to find public domain works that fit this criterion. Alternatively, the initials in an author's name could be replaced with another author's full name that shares those initials. Then I could update my Sherlock Shuffle technique to fit. (Someday I hope to make Sherlock Shuffle intelligent enough to swap out character names throughout a text to make the story more consistent.)

I'll also be working on updating pos2tracery with ideas from last year.

other ideas:

  1. Improve Beatles Bible w/ better typography/page layout
  2. Hacker's Dictionary: Regular dictionary, but the words are l337sp34k
  3. Every l33tsp34k: brute-force "words" by translating iterated numbers into l33t & checking against a dict file (see the sketch after this list).
  4. Something w/ Debord/Foucault
  5. Improve Art Zine
    1. https://triptograph.glitch.me images
    2. Triptographs, but convert the images to SVG to stroke paths & blur for colors behind them
    3. Not just 'Dada Manifestos'
  6. Songbooks ala https://midi-ditty.glitch.me, generative lyrics (lyrics might be better suited to NaPoGenMo?)
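
For idea 3, the brute force is only a few lines; a sketch below, where the digit-to-letter map and the word-list path (/usr/share/dict/words, common on Linux/macOS but not universal) are assumptions.

```python
# Walk the integers, map each digit to a letter, keep the ones that spell a real word.
L33T = {"0": "o", "1": "l", "3": "e", "4": "a", "5": "s", "6": "g", "7": "t", "8": "b", "9": "g"}

with open("/usr/share/dict/words", encoding="utf-8") as f:
    WORDS = {w.strip().lower() for w in f}

for n in range(1, 100_000):
    digits = str(n)
    if any(d not in L33T for d in digits):
        continue  # 2 has no common l33t letter; skip numbers containing it
    word = "".join(L33T[d] for d in digits)
    if word in WORDS:
        print(n, word)
```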

Cassandra: Generated novel using Sed and bash scripting

The plan here is to use fairly basic repeated string substitutions to form a "novel", using public domain texts as inputs, and command line bash tools to organise the source material in various ways.

Starting point: Jane Austen's 'THE BEAUTIFULL CASSANDRA'
https://www.pemberley.com/janeinfo/juviscrp.html#beaucassand

Obviously it needs expanding to meet the 50K word limit, but it will provide the basic 'plot' structure. Each of the 12 chapters needs to be expanded to ~4167 words.

I might use online search APIs to give a bit of random topic expansion, or the source material may be pre-chosen. Unsure if I want a completely deterministic result, or one that changes at each run.

Currently experimenting with using 18th and 19th century scientific publications to add colour.
Example: https://archive.org/details/cbarchive_53241_onthemycologyandvisceralanatom1884

Early sample of some substitutions in chapter one:

CASSANDRA was the Daughter & the only Daughter of a celebrated olochiru in Page Street. Her father was of noble Birth, being the near relation of the Dutchess of Actinocucumi's genera.

The challenge is to see how far I can get using limited tools. If this goes according to plan, the script will be generated using multiple layers of string substitutions.
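
The project itself sticks to sed and bash; purely for illustration, the same layered-substitution idea looks like this in Python (file names and the target words are placeholders, not the actual substitution tables).

```python
import random
import re

# Each "layer" maps a pattern in the source text to a replacement drawn
# from another corpus; layers are applied in order, as in a sed pipeline.
source = open("beautifull_cassandra.txt", encoding="utf-8").read()
jargon = open("mycology_terms.txt", encoding="utf-8").read().split()

layers = [
    (r"\bMilliner\b", lambda m: random.choice(jargon)),
    (r"\bbonnet\b", lambda m: random.choice(jargon)),
]

text = source
for pattern, repl in layers:
    text = re.sub(pattern, repl, text)
print(text)
```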

The Collected Works of Writing Prompt Prompter and Writing Prompt Responder

I will generate a novel inspired by my favorite subreddit, Writing Prompts. It will consist of a series of GPT-2-generated writing prompts and GPT-2-generated responses to those writing prompts, all grouped by weekly themes.

HackerNoon described the problem that has kept me from trying NaNoGenMo until now: “One of the open problems in the procedural generation of fiction is how to maintain reader interest at scale.” With writing prompts and responses, I can generate shorter (and possibly more interesting) sections within a larger NaNoGenMo manuscript.

OpenAI’s GPT-2 gave me access to a superpowered language model trained with a data set of 8 million webpages, all human-curated outbound links from Reddit. So much of the internet flows in and out of Reddit that it provides the scale of information needed to make a really sensible-sounding AI text generator.

Over at Reddit, the great u/disumbrationist trained an army of bots to create an entire Subreddit Simulator populated entirely by GPT-2 language models that were trained on the most popular subreddits. Following that example, I will create two large data sets.

Thanks to Max Woolf, I learned how to fine-tune two different GPT-2 language models with Google Colab. Here are the two GPT-2 models I will train and fine-tune to generate my NaNoGenMo novel:

  1. A Writing Prompt Prompter trained on thousands of writing prompts posted on r/WritingPrompts
  2. A Writing Prompt Responder trained on thousands of responses to writing prompts on r/WritingPrompts

I think the great storytellers at Writing Prompts can provide some narrative structure to GPT-2's uncanny ability to write human-sounding text. I'll run every prompt and response through a plagiarism detector to make sure my bot doesn’t steal from its human teachers.
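
A minimal sketch of the fine-tune-and-generate loop with Max Woolf's gpt-2-simple package; file names and run names are placeholders, and in practice fine-tuning and generation would be done in separate runs, once per model.

```python
import gpt_2_simple as gpt2

# One model per corpus; "prompts.txt" and the run name stand in for the
# scraped r/WritingPrompts datasets described above.
gpt2.download_gpt2(model_name="124M")

sess = gpt2.start_tf_sess()
gpt2.finetune(sess, "prompts.txt", model_name="124M",
              run_name="prompter", steps=1000)

# In a later run, load the fine-tuned weights and sample a batch of prompts.
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="prompter")
prompts = gpt2.generate(sess, run_name="prompter", length=60,
                        temperature=0.8, nsamples=5, return_as_list=True)
print("\n\n".join(prompts))
```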

I developed this project while writing a magazine article about NaNoGenMo, getting advice from some great members of the community, especially Darius Kazemi and Janelle Shane.

Generative Cities

I'm working on a program that will generate new chapters of writing like Italo Calvino's "Invisible Cities," which feels like a novel about generativity, especially in the way Marco Polo describes his adventures.

One problem I anticipate is that the source text is pretty small. It's only about 26,000 words. I've been experimenting with Markov chains, but I've also thought a lot about doing a tracery grammar and modelling specific chapters. I'm also interested in pairing this with sentiment analysis and trying to swap out some words for others to create new texts that are still thematically related. Another approach is to model Calvino's writing by loading in his other texts and creating a NN that emulates his style.
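
For the Tracery route, a toy grammar in the shape of a Calvino chapter might look like the sketch below (using the pytracery package); the rules are illustrative, not drawn from the actual text.

```python
import tracery
from tracery.modifiers import base_english

# A toy grammar in the shape of an Invisible Cities chapter.
rules = {
    "origin": "Cities & #genre#: #city#. "
              "In #city# the traveller finds #feature#, and #feature#.",
    "genre": ["Memory", "Desire", "Signs", "the Dead"],
    "city": ["Dorotea", "Zaira", "Anastasia", "Velleia"],
    "feature": ["streets paved with #material#",
                "a #adjective# market of #material#",
                "canals that remember every #noun#"],
    "material": ["zinc", "agate", "rope", "glass"],
    "adjective": ["spiral", "submerged", "weightless"],
    "noun": ["farewell", "debt", "season"],
}

grammar = tracery.Grammar(rules)
grammar.add_modifiers(base_english)
for _ in range(3):
    print(grammar.flatten("#origin#"))
```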

Another thing I've been thinking about is the superstructure of Invisible Cities as a text. Calvino's form is very particular -- sort of like a cascading atlas, or a canon, adding "genres" of cities, each with a specific number of cities per genre.

Additionally, there's the narrative aspect of the text. This exists primarily as a framing device, and I wasn't thinking of working with it, but it could be interesting to toy around with it.

I'm excited to see what develops from this!

Sound Bites and Interactive Fiction

First time participant in NaNoGenMo. I have previously casually participated in a couple NaNoWriMo's.

I am currently improving my Python and have been looking for a good project - I'm thinking about something interactive but not quite sure what this will look like just yet.

Terms && Conditions

Intent to submit! Still thinking this one through, but wanting to do something with a corpus of Terms and Conditions.

The 5,000 Translations of "mama pi me mute"

So going into this year's NaNoGenMo, the only idea I had in a notes file was "Translate a book into Toki Pona." It only has 120 words, can't be hard, right?

I found a decent dictionary, and decided to try translating the Lord's Prayer (on the Wikipedia page) using said dictionary. I realized that just an exhaustive translation of the opening line could easily hit 50,000 words (and then some, more like 350,000 words). So I picked out enough translations to hit the 50,000 word limit, and thus have The 5,000 Translations.
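
The combinatorics are just a Cartesian product over the glosses of each word; a sketch below, with a heavily abridged, illustrative gloss table rather than the full dictionary.

```python
from itertools import product

# Abridged, illustrative glosses; the real dictionary gives many more per word.
GLOSSES = {
    "mama": ["father", "mother", "parent", "creator"],
    "pi": ["of"],
    "mi": ["I", "we", "my", "our"],
    "mute": ["many", "much", "several"],
}

phrase = ["mama", "pi", "mi", "mute"]
translations = [" ".join(t) for t in product(*(GLOSSES[w] for w in phrase))]
print(len(translations), "translations of the opening phrase")
for t in translations[:5]:
    print(t)
```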

I'm turning this one in just in case I don't do something else this month.

Edit: The code repo

Present-day Hero's Quest

I'm going for a Hero's Quest plot set in the present day. The main character has to find an item.

I'm aiming for a forward sense of narrative. As stretch goals, I aim to include simile and/or other things like imagery, plus have the character deliver the item to a specific place or location, but I'm not sure if I can get any of that done in time.

In lieu of a dev diary, I'm having my project boards be publicly viewable, see https://github.com/users/verachell/projects/3

I'm programming it in Common Lisp.

Survivors (abandoned)

Hello! I'm struggling to figure out what I'll write this year, but something potion related, either a cookbook or a list of potion properties or the diary of a potion seller/alchemist...

UPDATE: I'll be doing the story of a bunch of survivors on an island, each acting independently, with some sort of group AI so they know which chores to do. They will also have different specialisations, which will be how they decide which tasks to do.

The book will be the log of this simulation, hopefully they will all survive (there will be events...)

I hope this will be cool, I want to do this with Python with lots of async stuff happening.
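
A toy sketch of the async simulation loop; the chores, names, and the "group AI" (reduced here to a preference roll toward each survivor's specialisation) are placeholders for the real thing.

```python
import asyncio
import random

CHORES = ["fetch water", "gather wood", "fish", "mend the shelter"]
log = []

async def survivor(name, skill):
    for day in range(3):
        # Prefer the chore matching this survivor's specialisation.
        chore = skill if random.random() < 0.7 else random.choice(CHORES)
        await asyncio.sleep(random.random() * 0.1)  # stagger the agents
        log.append(f"Day {day + 1}: {name} chose to {chore}.")

async def main():
    await asyncio.gather(
        survivor("Ada", "fish"),
        survivor("Brik", "gather wood"),
        survivor("Cato", "fetch water"),
    )

asyncio.run(main())
print("\n".join(log))
```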

Ducks: The Diary - completed

Hi! I'm new to GitHub, but I've been coding for some time. (Also, possibly the youngest participant here, but I can't say for sure as I don't know about the ages of everyone else.)

As I've been thinking about this for quite some time already, I've decided to follow this idea that incorporates my favorite animal, which (as you may have guessed) is the duck. It will involve scraping some Tweets containing the word "duck", Markov chain-ing the dataset into oblivion, and formatting the result as a diary.

(My other idea was to use machine learning and a handful of other tricks to improve the interestingness and decrease the repetitiveness of dariusk/NaNoGenMo#2. However, neural nets are apparently really difficult to work with, so I'm putting this on hold.)

Romeo and Juliet, but then in WhatsApp dialogue style

This project's goal is to generate a collection of WhatsApp conversations between two people, anonymously named Romeo and Juliet, with left-out media gaps filled in with images generated from keywords or sentiment from the text.

-LT TRAOEUL

This is a less formal entry that I've kicked around for a while: the idea of translating all of Kafka's The Trial into stenographic transcription. Given that the Plover dataset is public and I've cobbled together a few other resources, I think I might be able to knock this one out.

Future plans to turn this into a book arts object, too.

PlotGEN - Unsupervised learning of plots

Hello all! I am very excited to be joining NaNoGenMo this year. I'm currently in a Writing with Algorithms class at my college, and some of our readings have been past submissions on NaNoGenMo. Lots of really cool projects in the previous years, I'm looking forward to seeing what everyone does this time!

My repo is located here: https://github.com/theairdemon/PlotGEN

The idea is more detailed in my README.md, but to summarize: I want to analyze a number of texts, store information about the plot and the world within an m by n matrix, and cluster the texts according to a straightforward clustering model. This would allow me to generate plotlines within the parameters of the n-dimensional matrix and group them with their closest neighboring clusters.
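
A minimal sketch of the clustering step with scikit-learn's KMeans; the feature matrix here is made up purely to show the shape of the data (rows are texts, columns are plot/world features).

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row is one analysed text, each column a hand-chosen plot/world feature
# (e.g. number of locations, protagonist deaths, romance weight, scene count).
X = np.array([
    [3, 0, 0.8, 12],
    [7, 2, 0.1, 40],
    [2, 0, 0.9, 9],
    [6, 3, 0.2, 35],
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(model.labels_)           # which cluster each text falls into
print(model.cluster_centers_)  # "average" plots to generate new ones around
```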

Thanks for your time, best of luck with your projects!
-Hunter

50,000 Nyaas

This will be my low-effort entry for the year.

There's a particular style used in English translations of light novels & anime to indicate that the person speaking is a humanoid cat (which, in Japanese, is rendered by turning all instances of 'na' into 'nya'). It ought to be straightforward to write a filter to automatically mangle text this way.
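
The filter really is a one-liner; a sketch that preserves capitalisation:

```python
import re

def nyaify(text):
    """Turn every 'na' into 'nya', keeping the original capitalisation."""
    return re.sub(r"[Nn]a", lambda m: "Nya" if m.group()[0] == "N" else "nya", text)

print(nyaify("Banana! Nothing is natural anymore."))
# -> "Banyanya! Nothing is nyatural anymore."
```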

In!

Will do something, first off something easy.

NaNoGenMo

What else could it stand for?

National Novel Generosity Month?

National Novelty Generosity Month?

Will just go through all the possibilities, and generate a catalogue of all different kinds of NaNoGenMo.

A NaNaNoGenMoGenMo?


Go through a book, eg. Moby Dick, check each word in turn, find another work containing that word, and add that work as a citation for that single word. "Moby Dick, as written by [list of cited authors]".


I may also check some unrealised ideas from 2018, 2017, 2016 and 2015 and 2014.

Call number poetry

Staking a flag in here to signal that I'm doing this! No idea how far I'll pursue the theme of LCCS and poetry, but stay tuned...

Grindr Ipsum Generator

I have been thinking of writing an ipsum generator based on frequent Grindr message content, and when I saw this it motivated me to actually do it. I was thinking of writing it in Python but I'm not very good with Python yet, so it looks like it will be Java.

(Im)possible Worlds

(will add more later)
-- World Making and Story Generation (Site & Plot)
-- Curiosity-driven learning ("Sandbox Cognition")
-- Textworlds (Interactive Fiction)
-- Narrative Networks (Narrative Framing)
-- Simulacra and Nonsense (Planning, Simulation, and Hallucinating Worlds)

Annual General Meeting

Undecided on the tech side of things, but as a story structure I think I will be following the example agenda provided by Co-operatives UK for an AGM. And in these minutes of the meeting we will reveal the characters and their group, their motives, the world they exist in, their successes and failures, and plans for the future.

Welcome
Confirmation of the required quorum
Appointment of a Chair (if necessary)
Minutes of the previous AGM
Matters arising from the Minutes
Presentation of the Annual Report
Adoption of the Annual Report
Presentation of the Accounts
Adoption of the Accounts
Appointment of Auditors (if any)
Election of directors (or confirmation of the results if the ballot has been held beforehand)
Motions to be put before the AGM
Any other business
Closing Remarks

Variations on bunny rabbit noir parody

I have ~ 500 words of hand-written bunny rabbit noir parody. I will create code that by some means creates variations on this text until it is a proper novel.

Original source:
"""
I should’ve known it wasn’t going to work: Jess is the kind of doe who expects a lot of lettuce, and I’m not the kind of buck who’s ever going to be able to give her that. Not like Roger or Big Pete. My sire was a small-time vegetable pusher: swedes and carrots mainly, not the big stuff. Most of what he sold was even legal if you had the money and you knew where to find it. But if you do what he does, you can’t be choosy about the company you keep. So it wasn’t much of a surprise when he fell out with the Big P. over a kohlrabi shipment and had to leave town.

My dam coped as best she could, though it wasn’t easy for her. We were a decent family, and if the bucks she brought back were Zane and Dane and Kane and Paine (never Boris or Donald or Barack or Emmett), and if they never stayed for more than a week, at least they were kind to me and had their own front teeth. Jane, Laine, Flopsy, and Cottontail were good to me too. Decent does, whatever you think of their lifestyle. Everybody in our end of the burrow used to let you be whoever you needed to be. The only rough year was fifth-grade when that damn Mopsy made Mom go organic, hyperlocal, non-binary AND non-Fibonacci. They tried to make my teachers stop calling us Buck or Doe, and use Duck or Boe instead. Quack! Quack! I tell ya. And Mopsy was a biter.

By sophomore year my incisors were coming down nicely. I had a big cheek and a thumper of a back left foot. I may not have had brains but I had a growing body. And an adult nose for fermented vegetables. I volunteered as a burrow guard and liked the touch of authority it gave me - that and the view as all those rumps went down the burrow.
.
These days, I’m ex-police, I know how to handle myself, and have my own tuxedo, so I can always get work as a bouncer in some toney fruit and seed bar. That’s what I’m doing when Jess comes in. “Wanna get some celery with me, big boy?” “Sure”. Course, she’s outta my league, big eyes, long legs, jewelry, fur coat, none of them cheap. “I want you to find my husband,” she says. “OK” says I, I’m a sucker for a pretty doe in trouble. That’s how I got to be swimming in a lake, at night, on the run from Big Pete’s squad of demented badgers in night vision goggles. Anyway, long story short, I got caught, bitten, tuberculosis, and vet bills like you wouldn’t believe. Don’t let it happen to you.

I’m not as smart as I think I am, I should have known Jessica Rabbit was a Rabbit only by marriage.
"""

I will do something

I will actually do this this year.
I'm more into the idea of dialogues and descriptions than free-form novels.
I was working on an Obama speech generator last year, so maybe I'll do a presidential dialogue this year.

Grimoire: procedural occultism [COMPLETED]

Grimoire will be a procedurally generated occult handbook, inspired by late medieval spellbooks and demonological texts.

Expect a frightening hierarchy of demonic nobility, descriptions of their realms, images of their seals and instructions for conjuration.

Generate at your own risk

Conway's Novel of Life

Hi, all! 👋 Long-time/first-time, etc.

This year, I'd like to make a novel generator using Conway's Game of Life (I searched to see if this has been done already, and I don't think that it has, but please correct me if I'm wrong!).

Concept:

  • Accept input text of whatever length
  • Split the text into an array that gives us a "grid" of appropriate length/width
  • Overlay that array with Conway's Game of Life
  • Each generation, read the grid from left-to-right and top-to-bottom, adding the word under each active cell to the output, creating a sentence
  • Repeat the above until the word count is over 50,000 (a rough sketch follows after the stretch goals)

Stretch Goals:

  • Add a visualiser to let the user watch the program write
  • Allow user to select from a number of popular Conway patterns
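
A rough sketch of the Concept steps above. It seeds the Life board randomly (the issue doesn't specify the initial overlay) and reseeds if the board dies out; both of those are assumptions on my part, as is the input file name.

```python
import random

def step(grid):
    """One Game of Life generation on a 2D list of 0/1 cells (wrapping edges)."""
    h, w = len(grid), len(grid[0])
    nxt = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            n = sum(grid[(y + dy) % h][(x + dx) % w]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
            nxt[y][x] = 1 if n == 3 or (grid[y][x] and n == 2) else 0
    return nxt

words = open("input.txt", encoding="utf-8").read().split()  # any input text
w = max(1, int(len(words) ** 0.5))
rows = [words[i:i + w] for i in range(0, len(words) - len(words) % w, w)]
grid = [[random.randint(0, 1) for _ in row] for row in rows]  # random initial overlay

output, target = [], 50_000
while len(output) < target:
    for y, row in enumerate(rows):      # read active cells left-to-right, top-to-bottom
        for x, word in enumerate(row):
            if grid[y][x]:
                output.append(word)
    grid = step(grid)
    if not any(any(r) for r in grid):   # reseed if the board dies, so the loop terminates
        grid = [[random.randint(0, 1) for _ in row] for row in rows]
print(" ".join(output[:target]))
```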

I See Red

A surveilled conversation about the personal. Making private, public. A monitored relationship and its effects. What's real and what's not?


Book Of Obscure Poetry

Wanted to join in last year, but in the end couldn't finish it. This year I might do better!

Details coming soon!

"Starship Trader Monthly" Magazine

Last year I wanted to generate a catalog of "Strange Items" - procedurally generated objects, rendered in POVRay, along with wacky descriptions, laid out in magazine fashion and rendered to PDF.

I didn't get it done. Part of the problem was that I took a vacation mid-month that ate a lot of time; the other issue was that "items" was too broad. The output was boring because there was nothing to optimize towards.

But it seems a shame to let all that PDF + POVRay code go to waste AND I still think the idea of a generative catalog has merit. It just needs more focus! So this year I'm taking another run at it, but with the goal of generating a Catalog of Starships. Or maybe more like a Classified Ads. All manner of used and new space vehicles, satellites, escape pods, and maybe even space stations, at low low prices!

Pulp trash

I'm going to be using Javascript to make some pulp trash. My first attempt will be to finish and then use a tool I'm working on, a condition-consequence system that is similar in principle to Samovar. If that fails, I'll use Tracery to throw together something akin to my Twitter bot @garos_v_fodos but longer.
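
A toy condition-consequence loop in the spirit described (not Samovar itself, and in Python rather than the planned Javascript): each rule pairs a precondition on the world state with a line of pulp prose and a state change. The rules here are placeholders.

```python
import random

state = {"location": "alley", "has_gun": False}

# (precondition, prose, consequence) triples.
RULES = [
    (lambda s: s["location"] == "alley" and not s["has_gun"],
     "A snub-nosed revolver glinted in the trash.",
     lambda s: s.update(has_gun=True)),
    (lambda s: s["location"] == "alley",
     "Rain hammered the fire escapes. I lit my last cigarette and headed for the bar.",
     lambda s: s.update(location="bar")),
    (lambda s: s["location"] == "bar" and s["has_gun"],
     "The bartender saw the bulge under my coat and went pale.",
     lambda s: s.update(location="alley")),
    (lambda s: s["location"] == "bar" and not s["has_gun"],
     "I needed iron before I went back out there.",
     lambda s: s.update(location="alley")),
]

story = []
while len(story) < 12:
    applicable = [(line, effect) for cond, line, effect in RULES if cond(state)]
    if not applicable:
        break
    line, effect = random.choice(applicable)
    story.append(line)
    effect(state)
print("\n".join(story))
```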
