aaa961/finetuned-bge-base-en

Tags: Sentence Similarity · sentence-transformers · Safetensors · bert · feature-extraction · dense · Generated from Trainer · dataset_size:445 · loss:BatchSemiHardTripletLoss · Eval Results (legacy) · text-embeddings-inference
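The loss:BatchSemiHardTripletLoss tag indicates the model was fine-tuned with semi-hard triplet mining: for each anchor/positive pair, training prefers a negative that is farther from the anchor than the positive but still within the margin. As an illustrative sketch only (not the sentence-transformers implementation), the per-triplet computation looks roughly like this; the margin value and fallback strategy here are assumptions:

```python
import numpy as np

def semi_hard_triplet_loss(anchor, positive, negatives, margin=0.5):
    """Triplet loss with a semi-hard negative: a negative that is farther
    from the anchor than the positive, but still inside the margin
    (d_ap < d_an < d_ap + margin)."""
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(np.asarray(negatives) - anchor, axis=1)
    semi_hard = d_an[(d_an > d_ap) & (d_an < d_ap + margin)]
    # fall back to the farthest (easiest) negative if none is semi-hard
    d_n = semi_hard.min() if semi_hard.size else d_an.max()
    return float(max(d_ap - d_n + margin, 0.0))
```

With a semi-hard negative the loss is positive but bounded by the margin, which tends to give more stable gradients than always picking the hardest negative.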

Instructions for using aaa961/finetuned-bge-base-en with libraries, inference providers, notebooks, and local apps.

  • Libraries
  • sentence-transformers

    How to use aaa961/finetuned-bge-base-en with sentence-transformers:

    from sentence_transformers import SentenceTransformer
    
    model = SentenceTransformer("aaa961/finetuned-bge-base-en")
    
    sentences = [
        "There's a gap in issue hovers where the hover disappears Verifying: https://github.com/microsoft/vscode/issues/101495\r\n\r\n```\r\nVersion: 1.47.0-insider (user setup)\r\nCommit: 04545fa88043fd10d1f3edefd26be1b8245b516f\r\nDate: 2020-07-02T05:48:37.715Z\r\nElectron: 7.3.2\r\nChrome: 78.0.3904.130\r\nNode.js: 12.8.1\r\nV8: 7.8.279.23-electron.0\r\nOS: Windows_NT x64 10.0.18363\r\n```\r\n\r\nIf I move the mouse quickly I can get onto the issue hover, but it seems it can sometimes disappear if I move it more slowly.\r\n\r\n![](https://memes.peet.io/img/20-07-a663fc60-1be4-4183-8d6d-c0b61fd3ecac.gif)",
        "Error Maximum call stack size exceeded <!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->\r\n<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->\r\n<!-- 🕮 Read our guide about submitting issues: https://github.com/microsoft/vscode/wiki/Submitting-Bugs-and-Suggestions -->\r\n<!-- 🔎 Search existing issues to avoid creating duplicates. -->\r\n<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://code.visualstudio.com/insiders/ -->\r\n<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in VS Code to pre-fill useful information. -->\r\n<!-- 🔧 Launch with `code --disable-extensions` to check. -->\r\nDoes this issue occur when all extensions are disabled?: Yes/No\r\n\r\n<!-- 🪓 If you answered No above, use 'Help: Start Extension Bisect' from Command Palette to try to identify the cause. -->\r\n<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->\r\n- VS Code Version: 1.60.0-insider (system setup)\r\n- OS Version:  Windows_NT x64 10.0.19043\r\n\r\nSteps to Reproduce:\r\n\r\n1.  Open file vector from STL C++.\r\n2.  Push the button automatic detection\r\n\r\nExpected Behavior:\r\nVsCode will expose the C ++ language.\r\n\r\nActual Behavior: \r\nError Maximum call stack size exceeded\r\n\r\n\r\n",
        "Pasting (or sending text) in terminal can scramble the input \r\nIssue Type: <b>Bug</b>\r\n\r\nText copied to an integrated terminal tab configured to use Cygwin bash is sometimes scrambled.  I have observed this both when launching a debug task that copies a command line to a shell and manually pasting from the clipboard.\r\n\r\nI can reproduce this problem as follows.\r\n\r\nCopy the 60 character string \"echo 56789b123456789c123456789d123456789e123456789f123456789\" then paste repeatedly into a terminal tab running Cygwin bash.\r\n\r\nFirst attempt:\r\n```\r\n192.168.3.220:~> echo 56789b123456789c123456789d123456789e123456789f123456789\r\n56789b123456789c123456789d123456789e123456789f123456789\r\n```\r\nThat worked.  Second attempt:\r\n```\r\n192.168.3.220:~> f123456789echo 56789b123456789c123456789d123456789e123456789\r\n```\r\n\r\nThat failed.  The failure frequency I experience is about 1 in 10.\r\n\r\nNotice that the pasted text in the second case was reordered, with the first 50 characters rotated to the end.  That 50 character granularity is apparent in every failure I have examined, including the longer strings generated by debug task launches.  Checking the source, I see `MAX_WRITE_CHECK_SIZE = 50` in terminalProcess.ts in code addressing a similar issue, likely related to my 50 character observation.\r\n\r\nIn case it matters, the version of bash I'm running is \"4.4.12(3)-release\" and cygwin.dll version is \"3.2.0(0.340/5/3)\".\r\n\r\n\r\nVS Code version: Code 1.57.1 (507ce72a4466fbb27b715c3722558bb15afa9f48, 2021-06-17T13:28:07.755Z)\r\nOS version: Windows_NT x64 10.0.19042\r\nRestricted Mode: No\r\n\r\n<details>\r\n<summary>System Info</summary>\r\n\r\n|Item|Value|\r\n|---|---|\r\n|CPUs|Intel(R) Core(TM) i7-6700 CPU @ 3.40GHz (8 x 3408)|\r\n|GPU Status|2d_canvas: enabled<br>gpu_compositing: enabled<br>multiple_raster_threads: enabled_on<br>oop_rasterization: enabled<br>opengl: enabled_on<br>rasterization: enabled<br>skia_renderer: enabled_on<br>video_decode: enabled<br>vulkan: disabled_off<br>webgl: enabled<br>webgl2: enabled|\r\n|Load (avg)|undefined|\r\n|Memory (System)|31.90GB (26.28GB free)|\r\n|Process Argv|--disable-extensions|\r\n|Screen Reader|no|\r\n|VM|0%|\r\n</details>Extensions disabled\r\n<!-- generated by issue reporter -->",
        "SCM - switching branch from the terminal causes focus loss Copied from https://github.com/microsoft/vscode/issues/35307#issuecomment-2071044810\r\n\r\n> works pretty well so far for me. I notice that if I switch branches from the terminal, it loses focus after the editor state is restored. It would be nice if the focus stayed in the terminal. Otherwise, I like it!"
    ]
    embeddings = model.encode(sentences)
    
    similarities = model.similarity(embeddings, embeddings)
    print(similarities.shape)
    # [4, 4]
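The model.similarity call above returns pairwise cosine similarity between the embedding rows. As a minimal sketch of what that computation does, here is a standalone numpy version; the 3-dimensional vectors are hypothetical stand-ins for the model's real 768-dimensional embeddings:

```python
import numpy as np

def cosine_similarity_matrix(emb):
    """Pairwise cosine similarity between rows, mirroring model.similarity()."""
    emb = np.asarray(emb, dtype=float)
    unit = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    return unit @ unit.T

# hypothetical 3-dim vectors standing in for real 768-dim embeddings
vecs = np.array([[1.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0],
                 [0.0, 1.0, 0.0]])
sims = cosine_similarity_matrix(vecs)
print(sims.shape)  # (3, 3)
```

For N input sentences the result is an N x N matrix with ones on the diagonal; off-diagonal entries closer to 1 indicate more similar sentences.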
  • Notebooks
  • Google Colab
  • Kaggle
finetuned-bge-base-en
439 MB
  • 1 contributor
History: 3 commits
aaa961
Add new SentenceTransformer model
4f83a54 verified 9 months ago
  • 1_Pooling
    Add new SentenceTransformer model 9 months ago
  • .gitattributes
    1.52 kB
    initial commit 9 months ago
  • README.md
    85.4 kB
    Add new SentenceTransformer model 9 months ago
  • config.json
    696 Bytes
    Add new SentenceTransformer model 9 months ago
  • config_sentence_transformers.json
    283 Bytes
    Add new SentenceTransformer model 9 months ago
  • model.safetensors
    438 MB
    Add new SentenceTransformer model 9 months ago
  • modules.json
    349 Bytes
    Add new SentenceTransformer model. 9 months ago
  • sentence_bert_config.json
    56 Bytes
    Add new SentenceTransformer model 9 months ago
  • special_tokens_map.json
    695 Bytes
    Add new SentenceTransformer model. 9 months ago
  • tokenizer.json
    712 kB
    Add new SentenceTransformer model. 9 months ago
  • tokenizer_config.json
    1.27 kB
    Add new SentenceTransformer model 9 months ago
  • vocab.txt
    232 kB
    Add new SentenceTransformer model. 9 months ago