
CodeGenX

CodeGenX is back online! 🎉 We are sorry for the long wait.

Existing users will need to update the extension in VS Code, and new users can sign up on our website.


CodeGenX is a code generation system powered by artificial intelligence! It is delivered as a Visual Studio Code extension and is free and open-source!


Installation

You can find installation instructions and additional information about CodeGenX in the documentation here.


About CodeGenX

1. Languages Supported

CodeGenX currently supports only Python. We plan to add more languages in future releases.

2. Modules Trained On

CodeGenX was trained on Python code covering many of its common uses. Some of the libraries on which CodeGenX was specifically trained are listed below (see the illustrative sketch after the list):

  1. TensorFlow
  2. PyTorch
  3. scikit-learn
  4. Pandas
  5. NumPy
  6. OpenCV
  7. Django
  8. Flask
  9. PyGame
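
In practice, a user typically writes a comment describing what they want, and CodeGenX suggests a completion. The snippet below is an illustrative sketch of that prompt-to-completion pattern using NumPy; the completion shown is hypothetical, and actual output varies between generations:

```python
import numpy as np

# Prompt given to CodeGenX: a comment describing the task, e.g.
# "function to normalize a numpy array to the range [0, 1]"

# Hypothetical completion of the kind the model is trained to produce:
def normalize(arr):
    """Scale the values of arr linearly into the range [0, 1]."""
    arr_min = arr.min()
    arr_max = arr.max()
    if arr_max == arr_min:
        # A constant array has no range; map it to all zeros.
        return np.zeros_like(arr, dtype=float)
    return (arr - arr_min) / (arr_max - arr_min)
```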

3. How CodeGenX Works

At the core of CodeGenX lies a large neural network called GPT-J. GPT-J is a 6-billion-parameter transformer model that was trained on hundreds of gigabytes of text from the internet. We fine-tuned this model on a dataset of open-source Python code. The fine-tuned model can then generate code when given a prompt with the right instructions.
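
As a concrete illustration, the sketch below shows how a causal language model such as GPT-J turns a prompt into a completion using the Hugging Face transformers library. It loads the publicly available base checkpoint EleutherAI/gpt-j-6B as a stand-in for our fine-tuned weights; the CodeGenX extension actually sends prompts to a hosted API rather than running the model locally, so this is not the extension's client code:

```python
# Minimal sketch: prompting a GPT-J-style model to generate code.
# Uses the public base checkpoint as a stand-in for the fine-tuned model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

# The "right instructions" are a code-style prompt: a comment describing
# the task, followed by the start of a definition.
prompt = "# function to compute the factorial of n\ndef factorial(n):"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,        # length of the generated continuation
    do_sample=True,           # sample rather than greedy decoding
    temperature=0.2,          # low temperature favors conventional code
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that loading the full 6-billion-parameter model takes substantial memory (about 24 GB in full precision), which is one reason generation is delegated to a server.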


Contributors

This project would not have been possible without the help of these wonderful people:



Arya Manjaramkar
Matthias Wijnsma
Thomas Houtrique
Dominic Rampas
Bilel Medimegh
Josh Hills
Alex
Tiimo

Acknowledgements

Many thanks to the Google TPU Research Cloud for providing the compute needed for this project.