
# Demo-Python-OpenAI-API-Project

A simple yet powerful Python project for interacting with OpenAI models through the current (v1+) Python SDK.
It demonstrates how to control model behavior using parameters such as temperature, max tokens, and system messages.

## 🚀 Features

  • πŸ” Secure API key handling using .env
  • πŸ’¬ Interactive chatbot (CLI-based)
  • 🌑️ Temperature control (randomness tuning)
  • βœ‚οΈ Max token control (response length)
  • 🧠 Custom system message (behavior control)
  • πŸ” Compare outputs across different temperatures

πŸ› οΈ Tech Stack

- Python 3.x
- `openai` >= 1.0.0
- `python-dotenv`

πŸ“ Project Structure

```
├── app.py        # Main application script
├── .env          # API key (not committed)
├── .gitignore    # Ignore sensitive files
└── README.md     # Project documentation
```

πŸ” Setup Instructions

### 1. Clone the Repository

```bash
git clone https://github.com/your-username/openai-python-playground.git
cd openai-python-playground
```

### 2. Install Dependencies

```bash
pip install openai python-dotenv
```

### 3. Add Your API Key

Create a `.env` file in the project root:

```env
OPENAI_API_KEY=your_api_key_here
```

### 4. Run the App

```bash
python app.py
```
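At its core, the app boils down to a single chat-completions call. Here is a minimal sketch, assuming the v1 SDK's `OpenAI` client; the function names and model choice are illustrative, not necessarily what this repo's `app.py` uses:

```python
import os

def build_messages(system_msg, user_prompt):
    # The Chat Completions API takes a list of role/content dicts;
    # the system message steers the model's behavior.
    return [
        {"role": "system", "content": system_msg},
        {"role": "user", "content": user_prompt},
    ]

def chat_once(prompt, temperature=0.7, max_tokens=200):
    # Imported here so build_messages stays usable without the SDK installed.
    from dotenv import load_dotenv
    from openai import OpenAI

    load_dotenv()      # reads OPENAI_API_KEY from .env into the environment
    client = OpenAI()  # picks the key up from OPENAI_API_KEY
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice; any chat model works
        messages=build_messages("You are a helpful assistant.", prompt),
        temperature=temperature,
        max_tokens=max_tokens,
    )
    return resp.choices[0].message.content

if __name__ == "__main__" and os.getenv("OPENAI_API_KEY"):
    print(chat_once("Say hello in one sentence."))
```

Note that `temperature`, `max_tokens`, and the system message are all just keyword arguments and message entries on that one call.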

### 5. Usage Modes

1. **Chat Mode** – interactive conversation; set temperature and max tokens dynamically
2. **Compare Temperatures** – enter one prompt and see outputs at:
   - 0.2 → deterministic
   - 0.5 → balanced
   - 0.9 → creative
3. **Custom System Role** – define the AI's behavior, e.g. "You are a strict teacher" or "You are a startup mentor"

## ⚙️ Key Concepts

### 🔥 Temperature

Controls randomness:

- 0.2 → factual, consistent
- 0.7 → balanced
- 0.9+ → creative, diverse

### ✂️ Max Tokens

Limits response length:

- Smaller → short answers
- Larger → detailed responses
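Usage mode 2 above amounts to sending the same prompt at several temperature values and collecting the replies. A sketch, with the API call injected as a callable so the loop itself needs no API key (the function names are hypothetical, not from this repo):

```python
def compare_temperatures(prompt, ask, temps=(0.2, 0.5, 0.9)):
    """Return {temperature: response} for a single prompt.

    `ask` is any callable (prompt, temperature) -> str, e.g. a thin
    wrapper around client.chat.completions.create.
    """
    return {t: ask(prompt, t) for t in temps}

# Example with a stand-in for the real API call:
fake_ask = lambda prompt, t: f"[t={t}] reply to {prompt!r}"
for t, reply in compare_temperatures("Write a tagline", fake_ask).items():
    print(t, reply)
```

Injecting `ask` as a parameter also makes the comparison loop easy to unit-test without spending tokens.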

### 6. Security Best Practices

- Add `.env` to `.gitignore`
- Never expose API keys in code or public repos
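A minimal `.gitignore` covering the rule above (the `__pycache__/` entry is a common extra, not required by this project):

```gitignore
.env
__pycache__/
```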

## 📈 Future Improvements

- 🌐 Streamlit UI
- ⚡ FastAPI backend
- 🧠 RAG (Retrieval-Augmented Generation)
- 🤖 Agent-based workflows
- 💾 Chat history persistence

## 📚 Learning Outcome

By completing this project, you will understand:

- How to call OpenAI models programmatically
- How LLM parameters affect outputs
- How to integrate AI into real applications

## 👨‍💻 Author

Built as part of an AI engineering learning journey 🚀

If you found this useful, consider giving it a ⭐ on GitHub!
