Releases: SciSharp/LLamaSharp

v0.27.0

26 Apr 17:00
7cbbc45

Bug Fixes

  • Fix flaky unit tests by @m0nsky in #1350
  • Fix failing workflow caused by new python version by @m0nsky in #1329
  • Fix/remove standalone metal shader by @m0nsky in #1331
  • Replace -DLLAMA_CURL=OFF with -DLLAMA_OPENSSL=OFF by @m0nsky in #1333
  • Linux RUNPATH by @m0nsky in #1330
  • Fix compile.yml regressions + switch windows arm64 build to clang by @m0nsky in #1334
  • Removed outdated reference to llava_shared.dll for Windows arm64 target by @martindevans in #1339
  • Disable shared object versioning by @m0nsky in #1346
  • Fix musl detection using RID instead of distro name by @m0nsky in #1347
  • bugfix: LLamaWeights.NativeHandle.LoadLoraFromFile needs read & write access to the LoRA file by @harry330 in #1368

Other Changes

  • Expand CI matrix, new platforms + renames by @m0nsky in #1348
  • Remove setting hardcoded value for n_seq_max in IContextParamsExtensions by @Ed-Pavlov in #1354
  • docs: add FAQ entry for "unknown model architecture" error by @Klivess in #1371

Full Changelog: v0.26.0...v0.27.0

v0.26.0

15 Feb 16:25
b5cd28e

Bug Fixes

  • Respect ChatOptions.Instructions in LlamaExecutorChatClient by @stephentoub in #1256
  • set RPATH to "@loader_path" / "$ORIGIN" to ensure executables and dynamic libraries search for dependencies in their origin directory by @SignalRT in #1279
  • Fix crash in extended grammar optimization mode when TopK sampling is set to zero by @SerialKicked in #1282
  • Fix issue #382 (multiple publish output files with same relative path) by @LucaMaccarini in #1234

Other Changes

  • Optimization - A queue with fixed storage size backed by a circular buffer by @SignalRT in #1262
  • Fix some warnings. by @Lamothe in #1274
  • Accept CancellationToken in ChatSession.InitializeSessionFromHistoryAsync by @Cryptoc1 in #1260
  • Downgrade compile.yml from macos-latest to macos-14 by @m0nsky in #1302
  • Update Vulkan from 1.3.261.1 to 1.4.335.0 and fix download url by @m0nsky in #1303
  • Update BinaryReleaseId by @m0nsky in #1304
  • Add KaiROS AI as example application by @avikeid2007 in #1313
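The fixed-size queue backed by a circular buffer (#1262) is a classic optimization: storage never grows, and once the buffer is full each new element overwrites the oldest one. A minimal language-agnostic sketch of the idea follows (in Python rather than LLamaSharp's C#; this is an illustration of the technique, not the actual implementation):

```python
class RingQueue:
    """Fixed-capacity FIFO queue backed by a circular buffer.

    Enqueueing beyond capacity evicts the oldest element, so memory use
    stays constant: handy for keeping a bounded history of items.
    """

    def __init__(self, capacity: int):
        if capacity <= 0:
            raise ValueError("capacity must be positive")
        self._buf = [None] * capacity
        self._head = 0    # index of the oldest element
        self._count = 0   # number of live elements

    def enqueue(self, item):
        tail = (self._head + self._count) % len(self._buf)
        self._buf[tail] = item
        if self._count == len(self._buf):
            # Buffer full: the write overwrote the oldest element,
            # so advance head instead of growing the count.
            self._head = (self._head + 1) % len(self._buf)
        else:
            self._count += 1

    def dequeue(self):
        if self._count == 0:
            raise IndexError("queue is empty")
        item = self._buf[self._head]
        self._head = (self._head + 1) % len(self._buf)
        self._count -= 1
        return item

    def __len__(self):
        return self._count


q = RingQueue(3)
for x in [1, 2, 3, 4]:  # 1 is evicted when 4 arrives
    q.enqueue(x)
print([q.dequeue() for _ in range(len(q))])  # -> [2, 3, 4]
```

Because enqueue and dequeue only move two indices modulo the capacity, both operations are O(1) and no allocation happens after construction.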

Full Changelog: v0.25.0...v0.26.0

v0.25.0

16 Aug 13:59
4ad9084

Full Changelog: v0.24.0...v0.25.0

v0.24.0

14 May 23:34
ce8eeb4

Full Changelog: v0.22.0...v0.24.0

v0.22.0

20 Mar 01:52
064b05b

Full Changelog: v0.21.0...v0.22.0

v0.21.0

02 Feb 16:03
f9b61f5

Full Changelog: v0.20.0...v0.21.0

v0.20.0

21 Jan 16:45
906d3d8

Full Changelog: v0.19.0...v0.20.0

v0.19.0

08 Nov 21:06
5ada3ae

Full Changelog: v0.18.0...v0.19.0

v0.18.0

19 Oct 22:13
40ea046

Major Changes

  • Split platform-specific binaries for NuGet backends by @m0nsky in #957

Other Changes

  • Updates to ContributingGuide for latest llama.cpp repo by @scritch1sm in #953
  • Fix README chat session example by @easis in #956

Full Changelog: v0.17.0...v0.18.0

v0.17.0

13 Oct 17:02
cd9a044

Important: The CUDA packages for 0.17.0 exceeded the maximum size for a NuGet package, so some of the 0.17.0 packages will not be available until a new way to deploy them is found. If you need one of the missing packages, use 0.16.0 instead.

Other Changes

  • Add LLama2 Chat Session example with a custom templator by @asmirnov82 in #938

Full Changelog: v0.16.0...v0.17.0