Replies: 14 comments · 12 replies
-
(I'd also like to be able to replace SimpleDirectoryReader with a splitter from langchain_text_splitters.)
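Something like this is what I have in mind (a sketch, assuming llama-index >= 0.10 with the langchain-text-splitters package installed; `"./data"` is a placeholder path):

```python
# Sketch: keep SimpleDirectoryReader for loading, but do the chunking with a
# LangChain splitter via llama-index's LangchainNodeParser adapter.
from langchain_text_splitters import RecursiveCharacterTextSplitter
from llama_index.core import SimpleDirectoryReader
from llama_index.core.node_parser import LangchainNodeParser

documents = SimpleDirectoryReader("./data").load_data()

splitter = LangchainNodeParser(
    RecursiveCharacterTextSplitter(chunk_size=512, chunk_overlap=64)
)
nodes = splitter.get_nodes_from_documents(documents)
```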
-
Thanks @dosu! However, I should have mentioned that I don't want to use Settings, because I don't want to rely on global state, so I'm looking for an example of how to pass the various variables (embedding_llm, generator_llm, etc.) directly to the methods that require them: can you show me that version, please?
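That is, something of this shape (a sketch of what I'm after; the model names and the `"./data"` path are placeholders):

```python
# Sketch: pass the models directly at each point of use instead of setting
# them on the global Settings object.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

embedding_llm = OpenAIEmbedding(model="text-embedding-3-small")
generator_llm = OpenAI(model="gpt-4o-mini")

documents = SimpleDirectoryReader("./data").load_data()

# The embedding model goes to the index, the generator to the query engine,
# so nothing is read from global state.
index = VectorStoreIndex.from_documents(documents, embed_model=embedding_llm)
query_engine = index.as_query_engine(llm=generator_llm)
```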
-
Thanks @dosu; I had to make some modifications to get the loading to work, but now I get an exception from …

Result: Traceback (most recent call last): …
-
@dosu I'm afraid you're getting confused: the idea was to rewrite the original (ServiceContext-based) code to eliminate the use of ServiceContext (but without requiring the use of Settings), i.e. to pass the embedding_llm and generator_llm to the requisite methods directly. I think it's only the generator_llm that I'm stuck on at this point...
-
@dosu unfortunately this still gives me this exception: "AttributeError: 'NoneType' object has no attribute 'context_window'". Does any human developer have any insight here? I feel like I'm practically there: passing the LLM for querying is where I got stuck in my own experiments also.
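For context, this is the shape of the failing call (a sketch; `index` and `generator_llm` as in the earlier snippet, and the query string is a placeholder):

```python
# The traceback suggests the query engine builds a PromptHelper from the
# LLM's metadata; if the LLM resolves to None, reading .context_window fails.
query_engine = index.as_query_engine(llm=generator_llm)
response = query_engine.query("What does the document say?")  # placeholder query
```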
-
@dosu in the meantime can you theorize about how I might supply a (non-null) LLM here?
-
It looks like this is an artifact of having removed the ServiceContext.
-
@dosu nope, that didn't work: actually I tried all of these: …

but I still get the same exception: it seems to be trying to fetch a default LLM from somewhere.
-
@dosu it looks like it expects an LLM instance...
-
@dosu I don't believe your last solution can be right, because …
-
@dosu however I see that …
-
Never mind: I figured it out: I had to wrap my OpenAI LLMs; with the wrapped models in place, the query works (see the sketch below).
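Roughly this (a reconstruction sketch; model names are placeholders — the "wrapping" means constructing llama-index's own OpenAI classes rather than passing raw openai-client objects):

```python
# The fix: use llama-index's OpenAI wrapper, not the raw openai client
# (a bare openai.OpenAI() client isn't a llama-index LLM, so the query
# engine can't read metadata such as context_window from it).
from llama_index.llms.openai import OpenAI  # note: NOT `from openai import OpenAI`
from llama_index.embeddings.openai import OpenAIEmbedding

generator_llm = OpenAI(model="gpt-4o-mini")                      # placeholder model
embedding_llm = OpenAIEmbedding(model="text-embedding-3-small")  # placeholder model

# With these in place, the wiring from the earlier sketch runs cleanly:
#   index = VectorStoreIndex.from_documents(documents, embed_model=embedding_llm)
#   query_engine = index.as_query_engine(llm=generator_llm)
```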
I guess I don't need the PromptHelper (or to specify the num_output, chunk_overlap_ratio, or chunk_size_limit)? Although I'm still curious to know how (where) I would specify them if I were going to use SimpleDirectoryReader...?
-
I don't see those arguments in the class's constructor.
-
Yes, but remember we're trying to migrate away from using ServiceContext: how/where do I supply those arguments in the absence of a ServiceContext (and without using Settings)?
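My best guess so far (a sketch against llama-index 0.10; `generator_llm` and `index` as in the snippets above, and the numeric values are placeholders): build the PromptHelper yourself and hand it to a response synthesizer, then assemble the query engine from parts.

```python
# Sketch: the old ServiceContext prompt knobs move onto a PromptHelper that is
# passed to the response synthesizer directly (no ServiceContext, no Settings).
from llama_index.core import PromptHelper, get_response_synthesizer
from llama_index.core.query_engine import RetrieverQueryEngine

prompt_helper = PromptHelper(
    context_window=4096,      # placeholder values
    num_output=256,
    chunk_overlap_ratio=0.1,
    chunk_size_limit=None,
)
synth = get_response_synthesizer(llm=generator_llm, prompt_helper=prompt_helper)
query_engine = RetrieverQueryEngine(
    retriever=index.as_retriever(),
    response_synthesizer=synth,
)
# (Chunk size for ingestion is a separate knob: it would go on the node parser,
# e.g. SentenceSplitter(chunk_size=512), passed via transformations=[...] when
# building the index.)
```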
-
Hi: I've just (finally!) migrated my code to llama-index 0.10, but it's not clear to me what I need to do to move away from using ServiceContext (and possibly also PromptHelper?). I've inlined a working example of my llama-index code: if someone would kindly rewrite the relevant sections to eliminate ServiceContext (and whatever else I shouldn't be calling), I'd hugely appreciate it!