
Google Gemma runs quickly on a local machine, with low hardware requirements and simple operation.

A previous post introduced Gemma and showed how to chat with it using code.
That code-based approach is better suited to developers and has a relatively high barrier to entry.
From a usability standpoint, it is not very convenient.
Today, a more user-friendly option is introduced: talking to Gemma directly through a graphical application.
The software used today is LM Studio, an application that supports running a large number of large models with one click.

https://lmstudio.ai (a VPN or proxy may be needed to access it in some regions)


Its latest version, 0.2.16, already supports Gemma; both the 2-billion- and 7-billion-parameter models are available.
The software is quite capable: besides letting you chat with a large number of open-source models, it can also launch an OpenAI-compatible API server.
This makes it friendly to both ordinary users and developers.
It also supports multiple platforms, currently Windows, Mac (Apple Silicon M series), and Linux.
In addition, the software appears to default to CPU inference, which greatly lowers the hardware requirements.
Even so, inference speed with small models is still very good.
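
For developers who want to use that API server, the sketch below shows one way to query the locally loaded Gemma model over plain HTTP. It assumes LM Studio's local server feature has been started and is listening on its default address (http://localhost:1234); the model name and prompt are illustrative placeholders.

# A minimal sketch of chatting with the model loaded in LM Studio
# through its OpenAI-compatible chat-completions endpoint.
# Assumes the local server is running on the default port 1234;
# adjust the URL if you changed it in the server settings.
import requests

def ask_gemma(prompt: str) -> str:
    response = requests.post(
        "http://localhost:1234/v1/chat/completions",
        json={
            # LM Studio answers with whichever model is currently loaded;
            # the name below is only a descriptive placeholder.
            "model": "gemma-2b-it",
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_gemma("Introduce yourself in one sentence."))

Any client that speaks the OpenAI chat-completions format should work the same way, which is what makes this mode convenient for existing tooling.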

After installing the software (existing users should remember to upgrade), you will see a card for Google's Gemma 2B Instruct on the main interface. Find the Download button on the card and click it to start downloading the model.
The whole process is very simple, although users behind restricted networks may run into problems downloading the model.
Once the download completes, select the Gemma model at the top of the window and wait for it to load before starting the conversation.


Type your message after "User" at the bottom of the window to start chatting.
You can write to it directly in Chinese; it understands, but it will reply in English.

