Generating Mario Levels with GPT2. Code for the paper: 'MarioGPT: Open-Ended Text2Level Generation through Large Language Models', https://arxiv.org/abs/2302.05981
How does it work?
MarioGPT is a finetuned GPT2 model (specifically, distilgpt2), trained on a subset of levels from Super Mario Bros and Super Mario Bros: The Lost Levels, provided by The Video Game Level Corpus. MarioGPT is able to generate levels, guided by a simple text prompt. This generation is not perfect, but we believe this is a great first step toward more controllable and diverse level / environment generation.
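To make levels digestible by a language model, each level is serialized as a string of tile characters. The sketch below is illustrative only: the tile alphabet and column-wise, bottom-to-top reading order are assumptions for demonstration, not a guaranteed match for MarioGPT's exact encoding of the VGLC data.

```python
# Sketch: serializing a tile-based Mario level into a token string.
# Tile alphabet here is illustrative: 'X' = ground, '-' = empty, 'E' = enemy.
level_rows = [
    "--------------",
    "----------E---",
    "---------XXX--",
    "--E-----------",
    "XXXXXXXXXXXXXX",
]

def flatten_columns(rows):
    """Read the level column by column, bottom to top, producing one
    string of tile characters a GPT-style model can consume as tokens."""
    height, width = len(rows), len(rows[0])
    return "".join(
        rows[height - 1 - r][c]
        for c in range(width)   # left-to-right across columns
        for r in range(height)  # bottom-to-top within each column
    )

seq = flatten_columns(level_rows)
print(len(seq))   # 14 columns x 5 rows = 70 tiles
print(seq[:5])    # first column, bottom to top
```

Serializing column by column keeps spatially adjacent tiles close together in the token sequence, which is what lets an autoregressive model extend a level left to right.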
Requirements
- python3.8+
Installation
from pypi
pip install mario-gpt
or from source
git clone git@github.com:shyamsn97/mario-gpt.git
python setup.py install
Generating Levels
Since our models are built on top of the amazing transformers library, we host our model at https://huggingface.co/shyamsn97/Mario-GPT2-700-context-length
This code snippet is the minimal code you need to generate a Mario level!
from mario_gpt.lm import MarioLM
from mario_gpt.utils import view_level, convert_level_to_png

# pretrained_model = shyamsn97/Mario-GPT2-700-context-length
mario_lm = MarioLM()

prompts = ["many pipes, many enemies, some blocks, high elevation"]

# generate level of size 700
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=699,
    temperature=2.0,
    use_tqdm=True
)

# show string list
view_level(generated_level, mario_lm.tokenizer)
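The `temperature` argument above controls how random the sampling is: higher values flatten the next-token distribution and yield more diverse (and riskier) levels, while lower values concentrate probability on the likeliest tiles. This standalone sketch of temperature-scaled sampling is illustrative of the general technique, not MarioGPT's internals:

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=None):
    """Divide logits by the temperature, softmax, then sample one index.
    Higher temperature -> flatter distribution -> more diverse output."""
    rng = rng or random.Random(0)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i, probs
    return len(probs) - 1, probs

logits = [2.0, 1.0, 0.1]
_, sharp = sample_with_temperature(logits, temperature=0.5)
_, flat = sample_with_temperature(logits, temperature=2.0)
print(max(sharp) > max(flat))  # low temperature concentrates probability
```

With `temperature=2.0`, as in the snippet above, generation is biased toward variety rather than conservative repetition of common tile patterns.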
...
See the notebook for a more in-depth tutorial on generating levels.
Future Plans
Planned additions to the codebase:
- Basic inference code
- Add MarioBert Model
- Inpainting functionality from paper
- Open-ended level generation code
- Training code from paper
- Different generation methods (e.g., constrained beam search)
Authors
Shyam Sudhakaran shyamsnair@protonmail.com, https://github.com/shyamsn97
Miguel González-Duque migd@itu.dk, https://github.com/miguelgondu
Claire Glanois clgl@itu.dk, https://github.com/claireaoi
Matthias Freiberger matfr@itu.dk, https://github.com/matfrei
Elias Najarro enaj@itu.dk, https://github.com/enajx
Sebastian Risi sebr@itu.dk, https://github.com/sebastianrisi
Citation
If you use the code for academic or commercial use, please cite the associated paper:
@misc{https://doi.org/10.48550/arxiv.2302.05981,
doi = {10.48550/ARXIV.2302.05981},
url = {https://arxiv.org/abs/2302.05981},
author = {Sudhakaran, Shyam and González-Duque, Miguel and Glanois, Claire and Freiberger, Matthias and Najarro, Elias and Risi, Sebastian},
keywords = {Artificial Intelligence (cs.AI), Computation and Language (cs.CL), FOS: Computer and information sciences},
title = {MarioGPT: Open-Ended Text2Level Generation through Large Language Models},
publisher = {arXiv},
year = {2023},
copyright = {arXiv.org perpetual, non-exclusive license}
}