PLaMo Translation Model

PLaMo Translation Model is a specialized large-scale language model developed by Preferred Networks for translation tasks. For details, please refer to the blog post and press release.

This repository contains GGUF-converted models of PLaMo-2-Translate. Three models are available:

  • mitmul/plamo-2-translate-IQ4_XS.gguf: A GGUF-converted model of pfnet/plamo-2-translate with IQ4_XS quantization.
  • mitmul/plamo-2-translate-Q5_K_M.gguf: A GGUF-converted model of pfnet/plamo-2-translate with Q5_K_M quantization.
  • mitmul/plamo-2-translate-.gguf:

PLaMo Translation Model is released under the PLaMo Community License. Please review the license below and agree to it before downloading.

NOTE: This model has NOT been instruction-tuned for chat dialog or other downstream tasks.

For commercial users

Please check the PLaMo Community License and contact us via the following form to use the model for commercial purposes.

Usage

Build llama.cpp

git clone -b mitmul/add-plamo2 https://github.com/mitmul/llama.cpp
cd llama.cpp
cmake -B release
cmake --build release --config Release -j

(If you want to build llama.cpp with CUDA support or other options, please specify the required flags as described in the llama.cpp README.)
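For example, a CUDA build might look like the following sketch; `GGML_CUDA` is the option documented in the llama.cpp README, and you may need additional flags for your toolchain:

```shell
# Configure with CUDA enabled, then build in Release mode
# (requires the CUDA toolkit to be installed)
cmake -B release -DGGML_CUDA=ON
cmake --build release --config Release -j
```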

Download the model

git clone https://huggingface.co/mitmul/plamo-2-translate-GGUF
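Alternatively, if you only need one quantization, the Hugging Face CLI can download a single file instead of cloning the whole repository (the filename below is the IQ4_XS variant from the list above):

```shell
# Fetch just the IQ4_XS GGUF file into the current directory
huggingface-cli download mitmul/plamo-2-translate-GGUF \
  plamo-2-translate-IQ4_XS.gguf --local-dir .
```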

Run the model

$ ./release/bin/llama-cli \
-m plamo-2-translate-IQ4_XS.gguf \
--jinja --chat-template-file chat_template.jinja2 \
-p "こんにちは" -sp --verbose-prompt

(...verbose outputs...)

<|plamo:op|>dataset
translation
<|plamo:op|>input lang=English|Japanese
こんにゃくは太らない!
<|plamo:op|>output
Konjac won't make you gain weight!
<|plamo:op|>

> あのイーハトーヴォのすきとおった風、夏でも底に冷たさをもつ青いそら、うつくしい森で飾られたモリーオ市、郊外のぎらぎらひかる草の波。
That clear wind from Ihatovo, the blue sky that retains its coolness even in summer, the city of Morio city, beautifully adorned with forests, and the glittering waves of grass in the suburbs.
<|plamo:op|>
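The transcript above shows the raw prompt format that the chat template expands to. As a rough sketch, such a prompt could be assembled programmatically like this; `build_translation_prompt` is a hypothetical helper, not part of this repository, and the special tokens simply mirror the transcript above:

```python
def build_translation_prompt(text: str, direction: str = "English|Japanese") -> str:
    """Assemble a raw PLaMo-2-Translate prompt in the format shown above.

    The model is expected to emit the translation after the `output` marker.
    """
    return (
        "<|plamo:op|>dataset\n"
        "translation\n"
        f"<|plamo:op|>input lang={direction}\n"
        f"{text}\n"
        "<|plamo:op|>output\n"
    )

print(build_translation_prompt("こんにゃくは太らない!"))
```

As in the transcript, generation should be stopped at the next `<|plamo:op|>` token.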

Bias, Risks, and Limitations

PLaMo Translation Model is a new technology that carries risks with use. Testing conducted to date has been in English and Japanese, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, PLaMo Translation Model’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of PLaMo Translation Model, developers should perform safety testing and tuning tailored to their specific applications of the model.

Acknowledgement

This model is trained under the project, “Research and Development Project of the Enhanced Infrastructures for Post 5G Information and Communication System” (JPNP 20017), subsidized by the New Energy and Industrial Technology Development Organization (NEDO).

AI policies for Preferred Networks, Inc. group

Model size: 10B params
Architecture: plamo2

Base model: pfnet/plamo-2-8b