
An error occurred during model execution: "RangeError: offset is out of bounds". #499

Closed
wesbos opened this issue Jan 3, 2024 · 9 comments · Fixed by #545
Labels
question Further information is requested

Comments

wesbos commented Jan 3, 2024

Question

Hello, I'm having an issue getting this code to run in the browser, using Xenova/TinyLlama-1.1B-Chat-v1.0 on "@xenova/transformers": "^2.13.2".

It runs perfectly in Node.

```js
import { pipeline } from '@xenova/transformers';

console.log('Loading model...');
const generator = await pipeline('text-generation', 'Xenova/TinyLlama-1.1B-Chat-v1.0');
console.log('Model loaded!');

const messages = [
  { role: 'system', content: 'You are a friendly Assistant' },
  { role: 'user', content: 'Explain JavaScript Scopes in simple terms' },
];

// Build the prompt string from the chat messages
const prompt = generator.tokenizer.apply_chat_template(messages, {
  tokenize: false,
  add_generation_prompt: true,
});

console.log('Generating...');
const result = await generator(prompt, {
  max_new_tokens: 256,
  temperature: 0.5,
  do_sample: true,
  top_k: 50,
});

console.dir(result);
```
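For reference, `apply_chat_template` with `tokenize: false` returns a plain prompt string. TinyLlama-1.1B-Chat-v1.0 uses a Zephyr-style chat template, so the result should look roughly like the sketch below (`buildTinyLlamaPrompt` is an illustrative helper, not a transformers.js API):

```javascript
// Illustrative sketch of the Zephyr-style prompt format used by
// TinyLlama-1.1B-Chat-v1.0. Hypothetical helper, not part of the library.
function buildTinyLlamaPrompt(messages) {
  let prompt = '';
  for (const { role, content } of messages) {
    prompt += `<|${role}|>\n${content}</s>\n`;
  }
  // add_generation_prompt: true appends the assistant turn marker
  return prompt + '<|assistant|>\n';
}

const messages = [
  { role: 'system', content: 'You are a friendly Assistant' },
  { role: 'user', content: 'Explain JavaScript Scopes in simple terms' },
];
console.log(buildTinyLlamaPrompt(messages));
```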

In Node it runs:

[Screenshot 2024-01-03 at 2:53 PM: successful generation output in Node]

But in the browser I see this:

[Screenshot 2024-01-03 at 2:54 PM: RangeError in the browser console]

Same issue in Firefox.

This issue seems to say it's memory: #8

Is this one too large to run in the browser?

@wesbos wesbos added the question Further information is requested label Jan 3, 2024
xenova (Collaborator) commented Jan 3, 2024

Hi there 👋 Indeed, this is a known issue, which originates from onnxruntime-web and was possibly fixed last week by microsoft/onnxruntime#18914 (cc @guschmue). However, this will only be fixed in transformers.js once 1.17.0 is released and we upgrade to that version (we are currently pinned to 1.14.0).
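In the meantime, the onnxruntime-web version actually resolved in a project can be checked with `npm ls onnxruntime-web`, and npm's `overrides` field can force a different one once 1.17.0 is available. This is an untested workaround sketch, not an endorsed fix: the pin to 1.14.0 exists for compatibility reasons, so overriding it may break other things.

```json
{
  "overrides": {
    "onnxruntime-web": "1.17.0"
  }
}
```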

guschmue (Contributor) commented Jan 3, 2024

@satyajandhyala is looking at it

Ayushman0Singh commented

Facing the same issue. Do we know when the latest version will be out?

dctanner commented

Same issue here. Is there a previous version that is known to work with the TinyLlama and Phi Demos?

onepunch-009 commented

+1!

alextanhongpin commented

This is still happening. 😞 Is the fix released?

SrilalS commented Jul 10, 2024

I'm also having the same issue, and there is a small data-type issue I noticed:

[Screenshot: data type shown in the console]

JackBlair87 commented

```
An error occurred during model execution: "RangeError: offset is out of bounds".
sessionRun @ models.js:215
await in sessionRun (async)
encoderForward @ models.js:519
forward @ models.js:812
_call @ models.js:801
closure @ core.js:62
_call @ pipelines.js:1163
closure @ core.js:62
generate @ chromadb.mjs:2411
await in generate (async)
query @ chromadb.mjs:2092
searchBrowsingHistory @ vector_databases.ts:99
await in searchBrowsingHistory (async)
fetchResults @ ContextNavigator.tsx:25
(anonymous) @ ContextNavigator.tsx:38
commitHookEffectListMount @ react-dom.development.js:23189
commitPassiveMountOnFiber @ react-dom.development.js:24965
commitPassiveMountEffects_complete @ react-dom.development.js:24930
commitPassiveMountEffects_begin @ react-dom.development.js:24917
commitPassiveMountEffects @ react-dom.development.js:24905
flushPassiveEffectsImpl @ react-dom.development.js:27078
flushPassiveEffects @ react-dom.development.js:27023
commitRootImpl @ react-dom.development.js:26974
commitRoot @ react-dom.development.js:26721
performSyncWorkOnRoot @ react-dom.development.js:26156
flushSyncCallbacks @ react-dom.development.js:12042
flushSync @ react-dom.development.js:26240
finishEventHandler @ react-dom.development.js:3976
batchedUpdates @ react-dom.development.js:3994
dispatchEventForPluginEventSystem @ react-dom.development.js:9287
dispatchEventWithEnableCapturePhaseSelectiveHydrationWithoutDiscreteEventReplay @ react-dom.development.js:6465
dispatchEvent @ react-dom.development.js:6457
dispatchDiscreteEvent @ react-dom.development.js:6430

models.js:216 Inputs given to model: {input_ids: Proxy(Tensor), attention_mask: Proxy(Tensor), token_type_ids: Proxy(Tensor)}
[stack frames identical to the trace above]

vector_databases.ts:123 CHROMA_REPORT: Error searching history: RangeError: offset is out of bounds
    at Uint8Array.set (<anonymous>)
    at e.run (ort-web.min.js:6:453403)
    at e.run (ort-web.min.js:6:444202)
    at e.OnnxruntimeWebAssemblySessionHandler.run (ort-web.min.js:6:447121)
    at InferenceSession.run (inference-session-impl.js:91:1)
    at sessionRun (models.js:210:1)
    at Function.encoderForward [as _forward] (models.js:519:1)
    at Function.forward (models.js:812:1)
    at Function._call (models.js:801:1)
    at Function.closure [as model] (core.js:62:1)
```
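The trace above bottoms out in `Uint8Array.set`, which hints at the mechanism: onnxruntime-web copies tensor data into its WASM heap, and the copy target turns out to be smaller than the source. A standalone sketch of the same RangeError (illustrative only, not the actual ort-web code):

```javascript
// Standalone repro of the same class of error: Uint8Array.set throws a
// RangeError when the source would write past the end of the target buffer.
const heap = new Uint8Array(8);        // stand-in for a WASM heap region
const tensorBytes = new Uint8Array(4); // stand-in for tensor data
try {
  heap.set(tensorBytes, 6); // 4 bytes at offset 6 overruns the 8-byte target
} catch (err) {
  // In V8/Node this prints: RangeError - offset is out of bounds
  console.log(err.name, '-', err.message);
}
```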

gyagp commented Aug 13, 2024

I just checked, and I can run this model correctly with either the WASM or WebGPU backend.
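For context, by this date the WebGPU backend was available in the transformers.js v3 releases under the `@huggingface/transformers` package name. Selecting the backend looks roughly like the sketch below (browser-only, and the v3 API may differ in detail; `device` and `dtype` values are per the v3 documentation):

```js
// Sketch assuming the transformers.js v3 API (@huggingface/transformers).
import { pipeline } from '@huggingface/transformers';

const generator = await pipeline('text-generation', 'Xenova/TinyLlama-1.1B-Chat-v1.0', {
  device: 'webgpu', // or 'wasm' to stay on the WebAssembly backend
  dtype: 'q4',      // quantization level; lower-precision weights reduce browser memory use
});
```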
