Integrating Portkey with Instructor
Caching Your Requests
Let’s now bring down the cost of running your Instructor pipeline with Portkey caching. Create a Config object that defines your cache settings (or save it in the Portkey dashboard and note its config ID). Then pass it while instantiating your OpenAI client:
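Here is a minimal sketch of the wiring described above, using the `portkey_ai` SDK's `PORTKEY_GATEWAY_URL` and `createHeaders` helpers together with `instructor.from_openai`. The cache settings shown (`mode`, `max_age`) follow Portkey's config schema; the API keys and the saved config ID are placeholders you must replace with your own.

```python
import instructor
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
from pydantic import BaseModel

# Inline cache config: "simple" caches exact-match requests;
# max_age is the cache TTL in seconds.
portkey_config = {
    "cache": {
        "mode": "simple",
        "max_age": 3600,
    }
}

# Route OpenAI traffic through the Portkey gateway, attaching the
# cache config. You can pass the dict inline as below, or pass the
# ID of a config saved in the Portkey dashboard, e.g. config="pc-xxxx".
client = OpenAI(
    base_url=PORTKEY_GATEWAY_URL,
    api_key="OPENAI_API_KEY",  # placeholder
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",  # placeholder
        config=portkey_config,
    ),
)

# Patch the client with Instructor for structured outputs.
instructor_client = instructor.from_openai(client)


class User(BaseModel):
    name: str
    age: int


# Identical requests within max_age are served from Portkey's cache.
user = instructor_client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=User,
    messages=[{"role": "user", "content": "Extract: Jason is 25 years old"}],
)
```

Repeated calls with the same payload will now hit Portkey's cache instead of the model, which is where the cost savings come from; switch `"mode"` to `"semantic"` if you want near-duplicate prompts to share cache entries as well.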