labnotes

Streaming responses from ollama

really any fetch thing

tags
javascript
ollama
ai


I'm still learning modern JavaScript after all, so here's a little snippet that prints the responses from a fetch request as they come in.

  import ollama from 'ollama/browser'

  async function doCall() {
      console.log('Starting')

      const response = await ollama.generate({
          model: 'gemma:7b',
          prompt: 'why is the sky blue?  why is water wet?',
          stream: true,
      })
      for await (const part of response) {
          console.log(part)
      }

      console.log('done')
  }

  doCall()
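The ollama library hides the plumbing, but the same pattern works on any fetch response: grab a reader from `response.body` and loop over the chunks. Here's a minimal sketch of that loop; the `streamBody` helper name is mine, and a hand-built `ReadableStream` stands in for a real response body so it runs without a server.

```javascript
// Read a Response body (or any ReadableStream of bytes) chunk by chunk.
// streamBody is a hypothetical helper, not part of any library.
async function streamBody(body, onChunk) {
    const reader = body.getReader()
    const decoder = new TextDecoder()
    for (;;) {
        const { done, value } = await reader.read()
        if (done) break
        // stream: true keeps multi-byte characters split across chunks intact
        onChunk(decoder.decode(value, { stream: true }))
    }
}

// Against a real server it would look something like:
//   const response = await fetch('http://localhost:11434/api/generate', {
//       method: 'POST',
//       body: JSON.stringify({ model: 'gemma:7b', prompt: '...', stream: true }),
//   })
//   await streamBody(response.body, part => console.log(part))

// Here, a hand-built stream fakes Ollama's newline-delimited JSON output:
const fake = new ReadableStream({
    start(controller) {
        const enc = new TextEncoder()
        controller.enqueue(enc.encode('{"response":"The sky"}\n'))
        controller.enqueue(enc.encode('{"response":" is blue"}\n'))
        controller.close()
    },
})

const parts = []
await streamBody(fake, p => parts.push(p))
console.log(parts.join(''))
```

Each chunk arrives as a `Uint8Array`, so the `TextDecoder` is what turns the raw bytes back into text before you parse or print them.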
