... with a simple Python script.
From a Linux terminal, send your natural-language query to the various LLM endpoints of the Perplexity.ai API.
A spare-time project to explore the Perplexity.ai API. This is a work in progress.
The goals are:
- Make it easy to query the Perplexity.ai web API endpoints from the command line, and to compare the results of the roughly ten available models. This means saving the API responses to a text file, then reading and inspecting them; I'm not systematically benchmarking the models.
- Demonstrate how to use a Postman collection `.json` file (and a Postman `environment.json` file) from the command line with the Newman CLI tool, which gives very fine-grained control over the API calls and preserves the responses together with rich metadata (e.g., runtime duration).
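To illustrate the first goal, here is a minimal, hypothetical Python sketch (not the repo's actual script) that sends one query and saves the raw response to a text file. The endpoint path, the `sonar` model name, and the `PERPLEXITY_API_KEY` environment variable are assumptions based on the public API docs; check the official Perplexity API reference before relying on them.

```python
import json
import os
import urllib.request

# Assumed chat-completions endpoint of the Perplexity.ai web API.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(model: str, query: str) -> dict:
    """Build an OpenAI-style chat-completions payload (assumed request schema)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }

def ask_and_save(model: str, query: str, outfile: str) -> None:
    """Send the query and write the raw JSON response to a text file."""
    api_key = os.environ["PERPLEXITY_API_KEY"]  # assumed env-var name
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, query)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode()
    with open(outfile, "w") as f:
        f.write(body)  # keep the raw response for later comparison

if __name__ == "__main__" and "PERPLEXITY_API_KEY" in os.environ:
    ask_and_save("sonar", "What is the airspeed of an unladen swallow?",
                 "response.txt")
```

Saving the raw JSON rather than just the answer text is deliberate: it keeps usage and citation metadata around for comparing models later.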
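The Newman workflow in the second goal boils down to a single CLI invocation. The sketch below assembles that command in Python and runs it only if Newman is installed; the collection, environment, and report file names are placeholders, while `-e`, `-r`, and `--reporter-json-export` are standard Newman options.

```python
import shutil
import subprocess

def newman_command(collection: str, environment: str, report: str) -> list:
    """Assemble a `newman run` invocation for a Postman collection."""
    return [
        "newman", "run", collection,
        "-e", environment,                 # Postman environment .json file
        "-r", "cli,json",                  # human-readable + JSON reporters
        "--reporter-json-export", report,  # JSON report incl. timings
    ]

if __name__ == "__main__" and shutil.which("newman"):
    cmd = newman_command(
        "perplexity.postman_collection.json",   # placeholder file names
        "perplexity.postman_environment.json",
        "newman-report.json",
    )
    subprocess.run(cmd, check=True)  # requires Newman (installed via npm)
```

The exported JSON report is where the per-request metadata (such as runtime duration) ends up, which is what makes the Newman route attractive compared with a plain HTTP call.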
Running the script requires an API key, which is available only to registered "Perplexity Pro" users, so using the script is not free of charge. See Perplexity API Pricing for more details. I think the cheapest option is a plan with US$5 of credit per month; if your API usage is low, you might be able to use the API at no extra cost.
- Python script `explore_perplexity_api.py`: see README-python.md for more details.
- External wiki: mutable.ai, AI-generated documentation of this repo.
- Shell script `explore_perplexity_api.sh`: the first prototype, no longer maintained; see README-shell.md for more details.
See INSTALL.md for installation instructions.

See TODO.md for the full list of planned work, including:
- Make setting the "custom instruction" more flexible and more interactive.
- Finish the Python rewrite of this script.
Use it as you like.