Large Language Model - Pygmalion 6B - based on GPT-J-6B - bin - 2023.01 - Resource Template
Software Introduction
Pygmalion 6B
============
https://huggingface.co/PygmalionAI/pygmalion-6b
Model description
Pygmalion 6B is a proof-of-concept dialogue model based on EleutherAI's GPT-J-6B.
Warning: This model is NOT suitable for use by minors. It will output X-rated content under certain circumstances.
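For reference, the checkpoint can be loaded like any other GPT-J-style causal language model via the Hugging Face transformers library. The snippet below is a minimal sketch: the repository id comes from the URL above, while the dtype, device placement, and generation settings are illustrative assumptions rather than values recommended by the model card.

```python
# Minimal sketch: loading Pygmalion 6B with Hugging Face transformers.
# Generation settings below are illustrative assumptions, not official recommendations.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "PygmalionAI/pygmalion-6b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 6B parameters; fp16 roughly halves memory use
    device_map="auto",          # requires the `accelerate` package
)

prompt = "Hello! How are you today?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```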
Training data
The fine-tuning dataset consisted of 56MB of dialogue data gathered from multiple sources, including both real and partially machine-generated conversations.
Training procedure
Model weights were initialized from the …
📁 文件列表/
└─📁 pygmalion-6b/
  ├─📄 added_tokens.json
├─📄 config.json
├─📄 gitattributes.txt
├─📄 merges.txt
  ├─📄 pytorch_model-00001-of-00002.bin
  ├─📄 pytorch_model-00002-of-00002.bin
  ├─📄 pytorch_model.bin.index.json
├─📄 README.md
├─📁 runs/
│ ├─📄 events.out.tfevents.1672953635.uft-2.1037250.0
│ ├─📄 events.out.tfevents.1672954410.uft-2.1044935.0
│ ├─📄 events.out.tfevents.1672955570.uft-2.1055948.0
│ ├─📄 events.out.tfevents.1672956702.uft-2.1066484.0
│ ├─📄 events.out.tfevents.1672957209.uft-2.1071189.0
│ ├─📄 events.out.tfevents.1672964235.uft-2.1134893.0
│ ├─📄 events.out.tfevents.1672964557.uft-2.1138191.0
│ ├─📄 events.out.tfevents.1672974217.uft-2.1224302.0
│ ├─📄 events.out.tfevents.1672974396.uft-2.1226068.0
│ ├─📄 events.out.tfevents.1672974541.uft-2.1227652.0
│ ├─📄 events.out.tfevents.1673061987.uft-2.2002177.0
│ └─📄 events.out.tfevents.1673062187.uft-2.2004200.0
  ├─📄 special_tokens_map.json
├─📄 tokenizer.json
├─📄 tokenizer_config.json
└─📄 vocab.json
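If the files above are downloaded into a local pygmalion-6b/ directory, the two pytorch_model-*.bin shards together with pytorch_model.bin.index.json let transformers reassemble the full checkpoint automatically. A hedged sketch follows; the local path is an assumption, not part of the original listing.

```python
# Sketch: loading from the local folder listed above instead of the Hub.
# "./pygmalion-6b" is an assumed local path; transformers resolves the
# pytorch_model-*.bin shards via pytorch_model.bin.index.json automatically.
from transformers import AutoTokenizer, AutoModelForCausalLM

local_dir = "./pygmalion-6b"
tokenizer = AutoTokenizer.from_pretrained(local_dir)
model = AutoModelForCausalLM.from_pretrained(local_dir)
```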