| Hash (string) | Date (string, nullable) | Author (string) | commit_message (string) | IsMerge (bool) | Additions (int64) | Deletions (int64) | Total Changes (int64) | git_diff (string) | Repository Name (string) | Owner (string) | Primary Language (string) | Language (string) | Stars (float64, nullable) | Forks (float64, nullable) | Description (string) | Repository (string) | type (string) | Comment (string, nullable) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b52a089f46c10c8184ccf9e276c6cbf3ec1977b2
| null |
boomski
|
Update us.m3u
| false
| 2
| 0
| 2
|
--- us.m3u
@@ -111,6 +111,8 @@ http://170.178.189.66:1935/live/Stream1/.m3u8
http://170.178.189.66:1935/live/Stream1/playlist.m3u8
#EXTINF:-1 tvg-id="" tvg-name="" tvg-logo="https://i.imgur.com/BgwAqlG.png" group-title="",AMGTV
https://api.locastnet.org/api/watch/stream/1094/34/-118/?token=eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJkdWRlc2VrLnBldGVyQGdtYWlsLmNvbSIsInJvbGUiOjEsImV4cCI6MTU4OTEyNjI3OH0.UmdOQBwiO5JjceZ-WpWKYXZy2v6aooUi1808NTolbIo&fbclid=IwAR3EW3T_ZSm_BwKxzpTSlethc322oxJ_Z5JYgPOkD4SSB08yvxqmsbM_zTU
+#EXTINF:-1 tvg-id="" tvg-name="" tvg-logo="https://i.imgur.com/DQSHHTS.png" group-title="",America Teve
+http://api.new.livestream.com/accounts/5730046/events/2966225/live.m3u8
#EXTINF:-1 tvg-id="" tvg-name="" tvg-logo="https://i.imgur.com/HidIMbv.png" group-title="",AntennaTV
https://api.locastnet.org/api/watch/stream/1080/34/-118/?token=eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJkdWRlc2VrLnBldGVyQGdtYWlsLmNvbSIsInJvbGUiOjEsImV4cCI6MTU4OTEyNjI3OH0.UmdOQBwiO5JjceZ-WpWKYXZy2v6aooUi1808NTolbIo&fbclid=IwAR3EW3T_ZSm_BwKxzpTSlethc322oxJ_Z5JYgPOkD4SSB08yvxqmsbM_zTU
#EXTINF:-1 tvg-id="" tvg-name="" tvg-logo="https://i.imgur.com/hSYez3V.png" group-title="",Atlanta Channel
|
iptv-org_iptv.json
| null | null | null | null | null | null |
iptv-org_iptv.json
|
NEW_FEAT
|
4, Added new channels
|
8e77fa5a0f855975febe3a6945736540d981d9f8
|
2024-03-08 19:36:40
|
Toshiaki Takeuchi
|
Reimplemented DALL-E support
| false
| 8
| 24
| 32
|
--- MatGPT.mlapp
Binary files a/MatGPT.mlapp and b/MatGPT.mlapp differ
--- contents/presets.csv
@@ -3,7 +3,7 @@ AI Assistant,You are a helpful assistant. Answer as concisely as possible. ,Wher
Read a web page,You are a helpful assistant that can read a small web page and analyze its content. The content of the page will be extracted by an external function and stored in the chat history. ,What is this page about? https://www.mathworks.com/help/matlab/text-files.html,gpt-3.5-turbo,1000,0,0,1,1
Read a local file,You are a helpful assistant that can read a small file and analyze its content. The content of the file will be extracted by an external function and stored in the chat history. ,Select a file using the paper clip icon on the left.,gpt-3.5-turbo,1000,0,0,1,1
Understand an image,You are a helpful assistant that can see an image and analyze its content.,What is in this image? https://www.mathworks.com/help/examples/matlab/win64/DisplayGrayscaleRGBIndexedOrBinaryImageExample_04.png,gpt-4-vision-preview,1000,1,0,0,0
-Generate an image,,Create a 3D avatar of a whimsical sushi on the beach. He is decorated with various sushi elements and is playfully interacting with the beach environment.,dall-e-3,Inf,0,0,0,0
+Generate an image,You are a helpful assistant that can generate an image.,Create a 3D avatar of a whimsical sushi on the beach. He is decorated with various sushi elements and is playfully interacting with the beach environment.,gpt-3.5-turbo,1000,1,0,0,0
English to MATLAB Code,You are a helpful assistant that generates MATLAB code. ,"Define two random vectors x and y, fit a linear model to the data, and plot both the data and fitted line.",gpt-3.5-turbo,1000,0,1,1,1
English to Simulink Model,"You are a helpful assistant that creates Simulink models.
You create the models by generating MATLAB code that adds all the necessary blocks and set their parameters.
--- helpers/MsgHelper.m
@@ -16,8 +16,7 @@
struct('name','gpt-4-0613','attributes',struct('contextwindow',8192,'cutoff','Sep 2021'),'legacy',false), ...
struct('name','gpt-4-1106-preview','attributes',struct('contextwindow',128000,'cutoff','Apr 2023'),'legacy',false), ...
struct('name','gpt-4-vision-preview','attributes',struct('contextwindow',128000,'cutoff','Apr 2023'),'legacy',false), ...
- struct('name','gpt-4-turbo-preview','attributes',struct('contextwindow',128000,'cutoff','Apr 2023'),'legacy',false), ...
- struct('name','dall-e-3','attributes',struct('contextwindow','n/a','cutoff','n/a'),'legacy',false), ...
+ struct('name','gpt-4-turbo-preview','attributes',struct('contextwindow',128000,'cutoff','Apr 2023'),'legacy',false) ...
];
contextwindow = models(arrayfun(@(x) string(x.name), models) == modelName).attributes.contextwindow;
cutoff = models(arrayfun(@(x) string(x.name), models) == modelName).attributes.cutoff;
@@ -66,7 +65,7 @@
args = "{""arg1"": 1 }";
funCall = struct("name", functionName, "arguments", args);
toolCall = struct("id", id, "type", "function", "function", funCall);
- toolCallPrompt = struct("role", "assistant", "content", [], "tool_calls", toolCall);
+ toolCallPrompt = struct("role", "assistant", "content", "", "tool_calls", toolCall);
messages = addResponseMessage(messages, toolCallPrompt);
% add content as the function result
messages = addToolMessage(messages,id,functionName,content);
@@ -143,5 +142,21 @@
imgTag = "<img width=512 src=" + urlEncoded + ">";
end
+ % function call to generate image
+ function [image, url, response] = generateImage(prompt)
+ mdl = openAIImages(ModelName="dall-e-3");
+ [images, response] = generate(mdl,string(prompt));
+ image = images{1};
+ url = response.Body.Data.data.url;
+ end
+
+ % function call to understand image
+ function [txt,message,response] = understandImage(prompt,image,max_tokens)
+ chat = openAIChat("You are an AI assistant","ModelName","gpt-4-vision-preview");
+ messages = openAIMessages;
+ messages = addUserMessageWithImages(messages,prompt,image);
+ [txt,message,response] = generate(chat,messages,MaxNumTokens=max_tokens);
+ end
+
end
end
\ No newline at end of file
--- helpers/llms-with-matlab/+llms/+stream/responseStreamer.m
@@ -45,8 +45,9 @@
catch ME
errID = 'llms:stream:responseStreamer:InvalidInput';
msg = "Input does not have the expected json format. " + str{i};
- ME = MException(errID,msg);
- throw(ME)
+ causeException = MException(errID,msg);
+ ME = addCause(ME,causeException);
+ rethrow(ME)
end
if ischar(json.choices.finish_reason) && ismember(json.choices.finish_reason,["stop","tool_calls"])
stop = true;
--- helpers/llms-with-matlab/openAIChat.m
@@ -311,8 +311,8 @@ function mustBeValidFunctionCall(this, functionCall)
if ~isempty(this.Tools)
toolChoice = "auto";
end
- elseif ~ismember(toolChoice,["auto","none"])
- % if toolChoice is not empty, then it must be "auto", "none" or in the format
+ elseif ToolChoice ~= "auto"
+ % if toolChoice is not empty, then it must be in the format
% {"type": "function", "function": {"name": "my_function"}}
toolChoice = struct("type","function","function",struct("name",toolChoice));
end
|
matgpt
|
toshiakit
|
MATLAB
|
MATLAB
| 218
| 33
|
MATLAB app to access ChatGPT API from OpenAI
|
toshiakit_matgpt
|
NEW_FEAT
|
Obvious
|
a0cc73263a1a8b3c6a01d6e8b9781da315d09b92
|
2024-12-29 20:51:56
|
Eser DENIZ
|
feat: default notification title (#451)
| false
| 3
| 1
| 4
|
--- src/Notification.php
@@ -12,9 +12,7 @@ class Notification
protected string $event = '';
- final public function __construct(protected Client $client) {
- $this->title = config('app.name');
- }
+ final public function __construct(protected Client $client) {}
public static function new()
{
|
laravel
|
nativephp
|
PHP
|
PHP
| 3,498
| 182
|
Laravel wrapper for the NativePHP framework
|
nativephp_laravel
|
CODE_IMPROVEMENT
|
Code change: added final modifier
|
3f523871c666bd59dbd992121f614d97480eceef
|
2025-03-20 05:52:06
|
netdatabot
|
[ci skip] Update changelog and version for nightly build: v2.3.0-3-nightly.
| false
| 10
| 3
| 13
|
--- CHANGELOG.md
@@ -1,14 +1,5 @@
# Changelog
-## [**Next release**](https://github.com/netdata/netdata/tree/HEAD)
-
-[Full Changelog](https://github.com/netdata/netdata/compare/v2.3.0...HEAD)
-
-**Merged pull requests:**
-
-- fix reliability calculation [\#19909](https://github.com/netdata/netdata/pull/19909) ([ktsaou](https://github.com/ktsaou))
-- new exit cause: shutdown timeout [\#19903](https://github.com/netdata/netdata/pull/19903) ([ktsaou](https://github.com/ktsaou))
-
## [v2.3.0](https://github.com/netdata/netdata/tree/v2.3.0) (2025-03-19)
[Full Changelog](https://github.com/netdata/netdata/compare/v2.2.6...v2.3.0)
@@ -486,6 +477,8 @@
- Revert "prevent memory corruption in dbengine" [\#19364](https://github.com/netdata/netdata/pull/19364) ([ktsaou](https://github.com/ktsaou))
- prevent memory corruption in dbengine [\#19363](https://github.com/netdata/netdata/pull/19363) ([ktsaou](https://github.com/ktsaou))
- metrics-cardinality function [\#19362](https://github.com/netdata/netdata/pull/19362) ([ktsaou](https://github.com/ktsaou))
+- avoid checking replication status all the time [\#19361](https://github.com/netdata/netdata/pull/19361) ([ktsaou](https://github.com/ktsaou))
+- respect flood protection configuration for daemon [\#19360](https://github.com/netdata/netdata/pull/19360) ([ktsaou](https://github.com/ktsaou))
## [v2.1.1](https://github.com/netdata/netdata/tree/v2.1.1) (2025-01-07)
--- packaging/version
@@ -1 +1 @@
-v2.3.0-3-nightly
+v2.3.0
|
netdata
|
netdata
|
C
|
C
| 73,681
| 6,023
|
X-Ray Vision for your infrastructure!
|
netdata_netdata
|
CONFIG_CHANGE
|
Version/release update
|
a57939a077ebaa7bbb1f14b3d8ff4bc6b44b289d
|
2023-06-06 19:12:15
|
dependabot[bot]
|
Build(deps-dev): Bump rollup from 3.23.0 to 3.23.1 (#38714) Bumps [rollup](https://github.com/rollup/rollup) from 3.23.0 to 3.23.1.
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v3.23.0...v3.23.1)
---
updated-dependencies:
- dependency-name: rollup
dependency-type: direct:development
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
| false
| 8
| 8
| 16
|
--- package-lock.json
@@ -58,7 +58,7 @@
"npm-run-all2": "^6.0.5",
"postcss": "^8.4.24",
"postcss-cli": "^10.1.0",
- "rollup": "^3.23.1",
+ "rollup": "^3.23.0",
"rollup-plugin-istanbul": "^4.0.0",
"rtlcss": "^4.1.0",
"sass": "^1.62.1",
@@ -9090,9 +9090,9 @@
}
},
"node_modules/rollup": {
- "version": "3.23.1",
- "resolved": "https://registry.npmjs.org/rollup/-/rollup-3.23.1.tgz",
- "integrity": "sha512-ybRdFVHOoljGEFILHLd2g/qateqUdjE6YS41WXq4p3C/WwD3xtWxV4FYWETA1u9TeXQc5K8L8zHE5d/scOvrOQ==",
+ "version": "3.23.0",
+ "resolved": "https://registry.npmjs.org/rollup/-/rollup-3.23.0.tgz",
+ "integrity": "sha512-h31UlwEi7FHihLe1zbk+3Q7z1k/84rb9BSwmBSr/XjOCEaBJ2YyedQDuM0t/kfOS0IxM+vk1/zI9XxYj9V+NJQ==",
"dev": true,
"bin": {
"rollup": "dist/bin/rollup"
@@ -17370,9 +17370,9 @@
}
},
"rollup": {
- "version": "3.23.1",
- "resolved": "https://registry.npmjs.org/rollup/-/rollup-3.23.1.tgz",
- "integrity": "sha512-ybRdFVHOoljGEFILHLd2g/qateqUdjE6YS41WXq4p3C/WwD3xtWxV4FYWETA1u9TeXQc5K8L8zHE5d/scOvrOQ==",
+ "version": "3.23.0",
+ "resolved": "https://registry.npmjs.org/rollup/-/rollup-3.23.0.tgz",
+ "integrity": "sha512-h31UlwEi7FHihLe1zbk+3Q7z1k/84rb9BSwmBSr/XjOCEaBJ2YyedQDuM0t/kfOS0IxM+vk1/zI9XxYj9V+NJQ==",
"dev": true,
"requires": {
"fsevents": "~2.3.2"
--- package.json
@@ -142,7 +142,7 @@
"npm-run-all2": "^6.0.5",
"postcss": "^8.4.24",
"postcss-cli": "^10.1.0",
- "rollup": "^3.23.1",
+ "rollup": "^3.23.0",
"rollup-plugin-istanbul": "^4.0.0",
"rtlcss": "^4.1.0",
"sass": "^1.62.1",
|
bootstrap
|
twbs
|
JavaScript
|
JavaScript
| 171,693
| 79,045
|
The most popular HTML, CSS, and JavaScript framework for developing responsive, mobile first projects on the web.
|
twbs_bootstrap
|
CONFIG_CHANGE
|
version changes are done
|
4e230e81e753453d7be862f11ea0fff8290e2209
|
2023-08-31 07:20:31
|
Goh Chun Lin
|
Sign Manifesto
| false
| 5
| 0
| 5
|
--- index.html
@@ -4382,11 +4382,6 @@
<td>Individual</td>
<td>DevOps; SRE; Crypto Enthusiast; open-source community efforts</td>
</tr>
- <tr>
- <td><a href="https://github.com/goh-chunlin">Goh Chun Lin</td>
- <td>Individual</td>
- <td>Development; Documentation; Outreach; open-source community efforts</td>
- </tr>
</tbody>
</table>
</div>
|
manifesto
|
opentofu
|
HTML
|
HTML
| 36,134
| 1,083
|
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source license.
|
opentofu_manifesto
|
NEW_FEAT
|
a new table row with links added in the code probably adding a new feature in the UI
|
5976310916458868b5cbd9d8c7cc7de5af418230
|
2024-12-02 06:06:16
|
Caleb White
|
worktree: refactor infer_backlink return
The previous round[1] was merged a bit early before reviewer feedback could be applied. This correctly indents a code block and updates the `infer_backlink` function to return `-1` on failure and strbuf.len on success.
[1]: https://lore.kernel.org/git/[email protected]
Signed-off-by: Caleb White <[email protected]>
Signed-off-by: Junio C Hamano <[email protected]>
| false
| 8
| 7
| 15
|
--- worktree.c
@@ -111,9 +111,9 @@ struct worktree *get_linked_worktree(const char *id,
strbuf_strip_suffix(&worktree_path, "/.git");
if (!is_absolute_path(worktree_path.buf)) {
- strbuf_strip_suffix(&path, "gitdir");
- strbuf_addbuf(&path, &worktree_path);
- strbuf_realpath_forgiving(&worktree_path, path.buf, 0);
+ strbuf_strip_suffix(&path, "gitdir");
+ strbuf_addbuf(&path, &worktree_path);
+ strbuf_realpath_forgiving(&worktree_path, path.buf, 0);
}
CALLOC_ARRAY(worktree, 1);
@@ -725,10 +725,8 @@ static int is_main_worktree_path(const char *path)
* won't know which <repo>/worktrees/<id>/gitdir to repair. However, we may
* be able to infer the gitdir by manually reading /path/to/worktree/.git,
* extracting the <id>, and checking if <repo>/worktrees/<id> exists.
- *
- * Returns -1 on failure and strbuf.len on success.
*/
-static ssize_t infer_backlink(const char *gitfile, struct strbuf *inferred)
+static int infer_backlink(const char *gitfile, struct strbuf *inferred)
{
struct strbuf actual = STRBUF_INIT;
const char *id;
@@ -749,11 +747,12 @@ static ssize_t infer_backlink(const char *gitfile, struct strbuf *inferred)
goto error;
strbuf_release(&actual);
- return inferred->len;
+ return 1;
+
error:
strbuf_release(&actual);
strbuf_reset(inferred); /* clear invalid path */
- return -1;
+ return 0;
}
/*
|
git
| null |
C
|
C
| null | null |
Version control
|
_git
|
CODE_IMPROVEMENT
|
obvious
|
5e0c28438b8aeba78cbf484c54a6140ccde1d4c9
|
2024-11-23 12:11:19
|
2dust
|
Refactor Localization for AmazTool
| false
| 504
| 98
| 602
|
--- v2rayN/AmazTool/AmazTool.csproj
@@ -10,18 +10,8 @@
</PropertyGroup>
<ItemGroup>
- <Compile Update="Resx\Resource.Designer.cs">
- <DesignTime>True</DesignTime>
- <AutoGen>True</AutoGen>
- <DependentUpon>Resource.resx</DependentUpon>
- </Compile>
- </ItemGroup>
-
- <ItemGroup>
- <EmbeddedResource Update="Resx\Resource.resx">
- <Generator>ResXFileCodeGenerator</Generator>
- <LastGenOutput>Resource.Designer.cs</LastGenOutput>
- </EmbeddedResource>
+ <EmbeddedResource Include="Assets\en-US.json" />
+ <EmbeddedResource Include="Assets\zh-CN.json" />
</ItemGroup>
</Project>
\ No newline at end of file
--- v2rayN/AmazTool/Assets/en-US.json
@@ -0,0 +1,14 @@
+{
+ "Restart_v2rayN": "Start v2rayN, please wait...",
+ "Guidelines": "Please run it from the main application.",
+ "Upgrade_File_Not_Found": "Upgrade failed, file not found.",
+ "In_Progress": "In progress, please wait...",
+ "Try_Terminate_Process": "Try to terminate the v2rayN process.",
+ "Failed_Terminate_Process": "Failed to terminate the v2rayN.Close it manually,or the upgrade may fail.",
+ "Start_Unzipping": "Start extracting the update package.",
+ "Success_Unzipping": "Successfully extracted the update package!",
+ "Failed_Unzipping": "Failed to extract the update package!",
+ "Failed_Upgrade": "Upgrade failed!",
+ "Success_Upgrade": "Upgrade success!",
+ "Information": "Information"
+}
\ No newline at end of file
--- v2rayN/AmazTool/Assets/zh-CN.json
@@ -0,0 +1,14 @@
+{
+ "Restart_v2rayN": "正在重启,请等待...",
+ "Guidelines": "请从主应用运行!",
+ "Upgrade_File_Not_Found": "升级失败,文件不存在!",
+ "In_Progress": "正在进行中,请等待...",
+ "Try_Terminate_Process": "尝试结束 v2rayN 进程...",
+ "Failed_Terminate_Process": "请手动关闭正在运行的v2rayN,否则可能升级失败。",
+ "Start_Unzipping": "开始解压缩更新包...",
+ "Success_Unzipping": "解压缩更新包成功!",
+ "Failed_Unzipping": "解压缩更新包失败!",
+ "Failed_Upgrade": "升级失败!",
+ "Success_Upgrade": "升级成功!",
+ "Information": "提示"
+}
\ No newline at end of file
--- v2rayN/AmazTool/LocalizationHelper.cs
@@ -0,0 +1,59 @@
+using System.Globalization;
+using System.Reflection;
+using System.Text.Json;
+
+namespace AmazTool
+{
+ public class LocalizationHelper
+ {
+ private static Dictionary<string, string> _languageResources = [];
+
+ static LocalizationHelper()
+ {
+ // 加载语言资源
+ LoadLanguageResources();
+ }
+
+ private static void LoadLanguageResources()
+ {
+ try
+ {
+ var currentLanguage = CultureInfo.CurrentCulture.Name;
+ if (currentLanguage != "zh-CN" && currentLanguage != "en-US")
+ {
+ currentLanguage = "en-US";
+ }
+
+ var resourceName = $"AmazTool.Assets.{currentLanguage}.json";
+ var assembly = Assembly.GetExecutingAssembly();
+
+ using var stream = assembly.GetManifestResourceStream(resourceName);
+ if (stream == null) return;
+
+ using StreamReader reader = new(stream);
+ var json = reader.ReadToEnd();
+ if (!string.IsNullOrEmpty(json))
+ {
+ _languageResources = JsonSerializer.Deserialize<Dictionary<string, string>>(json) ?? new Dictionary<string, string>();
+ }
+ }
+ catch (IOException ex)
+ {
+ Console.WriteLine($"Failed to read language resource file: {ex.Message}");
+ }
+ catch (JsonException ex)
+ {
+ Console.WriteLine($"Failed to parse JSON data: {ex.Message}");
+ }
+ catch (Exception ex)
+ {
+ Console.WriteLine($"Unexpected error occurred: {ex.Message}");
+ }
+ }
+
+ public static string GetLocalizedValue(string key)
+ {
+ return _languageResources.TryGetValue(key, out var translation) ? translation : key;
+ }
+ }
+}
\ No newline at end of file
--- v2rayN/AmazTool/Program.cs
@@ -10,7 +10,7 @@
{
if (args.Length == 0)
{
- Console.WriteLine(Resx.Resource.Guidelines);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Guidelines"));
Thread.Sleep(5000);
return;
}
--- v2rayN/AmazTool/Resx/Resource.Designer.cs
@@ -1,171 +0,0 @@
-//------------------------------------------------------------------------------
-// <auto-generated>
-// 此代码由工具生成。
-// 运行时版本:4.0.30319.42000
-//
-// 对此文件的更改可能会导致不正确的行为,并且如果
-// 重新生成代码,这些更改将会丢失。
-// </auto-generated>
-//------------------------------------------------------------------------------
-
-namespace AmazTool.Resx {
- using System;
-
-
- /// <summary>
- /// 一个强类型的资源类,用于查找本地化的字符串等。
- /// </summary>
- // 此类是由 StronglyTypedResourceBuilder
- // 类通过类似于 ResGen 或 Visual Studio 的工具自动生成的。
- // 若要添加或移除成员,请编辑 .ResX 文件,然后重新运行 ResGen
- // (以 /str 作为命令选项),或重新生成 VS 项目。
- [global::System.CodeDom.Compiler.GeneratedCodeAttribute("System.Resources.Tools.StronglyTypedResourceBuilder", "17.0.0.0")]
- [global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
- [global::System.Runtime.CompilerServices.CompilerGeneratedAttribute()]
- internal class Resource {
-
- private static global::System.Resources.ResourceManager resourceMan;
-
- private static global::System.Globalization.CultureInfo resourceCulture;
-
- [global::System.Diagnostics.CodeAnalysis.SuppressMessageAttribute("Microsoft.Performance", "CA1811:AvoidUncalledPrivateCode")]
- internal Resource() {
- }
-
- /// <summary>
- /// 返回此类使用的缓存的 ResourceManager 实例。
- /// </summary>
- [global::System.ComponentModel.EditorBrowsableAttribute(global::System.ComponentModel.EditorBrowsableState.Advanced)]
- internal static global::System.Resources.ResourceManager ResourceManager {
- get {
- if (object.ReferenceEquals(resourceMan, null)) {
- global::System.Resources.ResourceManager temp = new global::System.Resources.ResourceManager("AmazTool.Resx.Resource", typeof(Resource).Assembly);
- resourceMan = temp;
- }
- return resourceMan;
- }
- }
-
- /// <summary>
- /// 重写当前线程的 CurrentUICulture 属性,对
- /// 使用此强类型资源类的所有资源查找执行重写。
- /// </summary>
- [global::System.ComponentModel.EditorBrowsableAttribute(global::System.ComponentModel.EditorBrowsableState.Advanced)]
- internal static global::System.Globalization.CultureInfo Culture {
- get {
- return resourceCulture;
- }
- set {
- resourceCulture = value;
- }
- }
-
- /// <summary>
- /// 查找类似 Failed to terminate the v2rayN.Close it manually,or the upgrade may fail. 的本地化字符串。
- /// </summary>
- internal static string FailedTerminateProcess {
- get {
- return ResourceManager.GetString("FailedTerminateProcess", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Failed to extract the update package. 的本地化字符串。
- /// </summary>
- internal static string FailedUnzipping {
- get {
- return ResourceManager.GetString("FailedUnzipping", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Upgrade failed. 的本地化字符串。
- /// </summary>
- internal static string FailedUpgrade {
- get {
- return ResourceManager.GetString("FailedUpgrade", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Please run it from the main application. 的本地化字符串。
- /// </summary>
- internal static string Guidelines {
- get {
- return ResourceManager.GetString("Guidelines", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Information 的本地化字符串。
- /// </summary>
- internal static string Information {
- get {
- return ResourceManager.GetString("Information", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 In progress, please wait... 的本地化字符串。
- /// </summary>
- internal static string InProgress {
- get {
- return ResourceManager.GetString("InProgress", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Start v2rayN, please wait... 的本地化字符串。
- /// </summary>
- internal static string Restartv2rayN {
- get {
- return ResourceManager.GetString("Restartv2rayN", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Start extracting the update package... 的本地化字符串。
- /// </summary>
- internal static string StartUnzipping {
- get {
- return ResourceManager.GetString("StartUnzipping", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Successfully extracted the update package. 的本地化字符串。
- /// </summary>
- internal static string SuccessUnzipping {
- get {
- return ResourceManager.GetString("SuccessUnzipping", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Upgrade success. 的本地化字符串。
- /// </summary>
- internal static string SuccessUpgrade {
- get {
- return ResourceManager.GetString("SuccessUpgrade", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Try to terminate the v2rayN process... 的本地化字符串。
- /// </summary>
- internal static string TryTerminateProcess {
- get {
- return ResourceManager.GetString("TryTerminateProcess", resourceCulture);
- }
- }
-
- /// <summary>
- /// 查找类似 Upgrade failed, file not found. 的本地化字符串。
- /// </summary>
- internal static string UpgradeFileNotFound {
- get {
- return ResourceManager.GetString("UpgradeFileNotFound", resourceCulture);
- }
- }
- }
-}
--- v2rayN/AmazTool/Resx/Resource.resx
@@ -1,156 +0,0 @@
-<?xml version="1.0" encoding="utf-8"?>
-<root>
- <!--
- Microsoft ResX Schema
-
- Version 2.0
-
- The primary goals of this format is to allow a simple XML format
- that is mostly human readable. The generation and parsing of the
- various data types are done through the TypeConverter classes
- associated with the data types.
-
- Example:
-
- ... ado.net/XML headers & schema ...
- <resheader name="resmimetype">text/microsoft-resx</resheader>
- <resheader name="version">2.0</resheader>
- <resheader name="reader">System.Resources.ResXResourceReader, System.Windows.Forms, ...</resheader>
- <resheader name="writer">System.Resources.ResXResourceWriter, System.Windows.Forms, ...</resheader>
- <data name="Name1"><value>this is my long string</value><comment>this is a comment</comment></data>
- <data name="Color1" type="System.Drawing.Color, System.Drawing">Blue</data>
- <data name="Bitmap1" mimetype="application/x-microsoft.net.object.binary.base64">
- <value>[base64 mime encoded serialized .NET Framework object]</value>
- </data>
- <data name="Icon1" type="System.Drawing.Icon, System.Drawing" mimetype="application/x-microsoft.net.object.bytearray.base64">
- <value>[base64 mime encoded string representing a byte array form of the .NET Framework object]</value>
- <comment>This is a comment</comment>
- </data>
-
- There are any number of "resheader" rows that contain simple
- name/value pairs.
-
- Each data row contains a name, and value. The row also contains a
- type or mimetype. Type corresponds to a .NET class that support
- text/value conversion through the TypeConverter architecture.
- Classes that don't support this are serialized and stored with the
- mimetype set.
-
- The mimetype is used for serialized objects, and tells the
- ResXResourceReader how to depersist the object. This is currently not
- extensible. For a given mimetype the value must be set accordingly:
-
- Note - application/x-microsoft.net.object.binary.base64 is the format
- that the ResXResourceWriter will generate, however the reader can
- read any of the formats listed below.
-
- mimetype: application/x-microsoft.net.object.binary.base64
- value : The object must be serialized with
- : System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
- : and then encoded with base64 encoding.
-
- mimetype: application/x-microsoft.net.object.soap.base64
- value : The object must be serialized with
- : System.Runtime.Serialization.Formatters.Soap.SoapFormatter
- : and then encoded with base64 encoding.
-
- mimetype: application/x-microsoft.net.object.bytearray.base64
- value : The object must be serialized into a byte array
- : using a System.ComponentModel.TypeConverter
- : and then encoded with base64 encoding.
- -->
- <xsd:schema id="root" xmlns="" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
- <xsd:import namespace="http://www.w3.org/XML/1998/namespace" />
- <xsd:element name="root" msdata:IsDataSet="true">
- <xsd:complexType>
- <xsd:choice maxOccurs="unbounded">
- <xsd:element name="metadata">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" />
- </xsd:sequence>
- <xsd:attribute name="name" use="required" type="xsd:string" />
- <xsd:attribute name="type" type="xsd:string" />
- <xsd:attribute name="mimetype" type="xsd:string" />
- <xsd:attribute ref="xml:space" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="assembly">
- <xsd:complexType>
- <xsd:attribute name="alias" type="xsd:string" />
- <xsd:attribute name="name" type="xsd:string" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="data">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
- <xsd:element name="comment" type="xsd:string" minOccurs="0" msdata:Ordinal="2" />
- </xsd:sequence>
- <xsd:attribute name="name" type="xsd:string" use="required" msdata:Ordinal="1" />
- <xsd:attribute name="type" type="xsd:string" msdata:Ordinal="3" />
- <xsd:attribute name="mimetype" type="xsd:string" msdata:Ordinal="4" />
- <xsd:attribute ref="xml:space" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="resheader">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
- </xsd:sequence>
- <xsd:attribute name="name" type="xsd:string" use="required" />
- </xsd:complexType>
- </xsd:element>
- </xsd:choice>
- </xsd:complexType>
- </xsd:element>
- </xsd:schema>
- <resheader name="resmimetype">
- <value>text/microsoft-resx</value>
- </resheader>
- <resheader name="version">
- <value>2.0</value>
- </resheader>
- <resheader name="reader">
- <value>System.Resources.ResXResourceReader, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
- </resheader>
- <resheader name="writer">
- <value>System.Resources.ResXResourceWriter, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
- </resheader>
- <data name="Restartv2rayN" xml:space="preserve">
- <value>Start v2rayN, please wait...</value>
- </data>
- <data name="Guidelines" xml:space="preserve">
- <value>Please run it from the main application.</value>
- </data>
- <data name="UpgradeFileNotFound" xml:space="preserve">
- <value>Upgrade failed, file not found.</value>
- </data>
- <data name="InProgress" xml:space="preserve">
- <value>In progress, please wait...</value>
- </data>
- <data name="TryTerminateProcess" xml:space="preserve">
- <value>Try to terminate the v2rayN process...</value>
- </data>
- <data name="FailedTerminateProcess" xml:space="preserve">
- <value>Failed to terminate the v2rayN.Close it manually,or the upgrade may fail.</value>
- </data>
- <data name="StartUnzipping" xml:space="preserve">
- <value>Start extracting the update package...</value>
- </data>
- <data name="SuccessUnzipping" xml:space="preserve">
- <value>Successfully extracted the update package.</value>
- </data>
- <data name="FailedUnzipping" xml:space="preserve">
- <value>Failed to extract the update package.</value>
- </data>
- <data name="FailedUpgrade" xml:space="preserve">
- <value>Upgrade failed.</value>
- </data>
- <data name="SuccessUpgrade" xml:space="preserve">
- <value>Upgrade success.</value>
- </data>
- <data name="Information" xml:space="preserve">
- <value>Information</value>
- </data>
-</root>
\ No newline at end of file
--- v2rayN/AmazTool/Resx/Resource.zh-Hans.resx
@@ -1,156 +0,0 @@
-<?xml version="1.0" encoding="utf-8"?>
-<root>
- <!--
- Microsoft ResX Schema
-
- Version 2.0
-
- The primary goals of this format is to allow a simple XML format
- that is mostly human readable. The generation and parsing of the
- various data types are done through the TypeConverter classes
- associated with the data types.
-
- Example:
-
- ... ado.net/XML headers & schema ...
- <resheader name="resmimetype">text/microsoft-resx</resheader>
- <resheader name="version">2.0</resheader>
- <resheader name="reader">System.Resources.ResXResourceReader, System.Windows.Forms, ...</resheader>
- <resheader name="writer">System.Resources.ResXResourceWriter, System.Windows.Forms, ...</resheader>
- <data name="Name1"><value>this is my long string</value><comment>this is a comment</comment></data>
- <data name="Color1" type="System.Drawing.Color, System.Drawing">Blue</data>
- <data name="Bitmap1" mimetype="application/x-microsoft.net.object.binary.base64">
- <value>[base64 mime encoded serialized .NET Framework object]</value>
- </data>
- <data name="Icon1" type="System.Drawing.Icon, System.Drawing" mimetype="application/x-microsoft.net.object.bytearray.base64">
- <value>[base64 mime encoded string representing a byte array form of the .NET Framework object]</value>
- <comment>This is a comment</comment>
- </data>
-
- There are any number of "resheader" rows that contain simple
- name/value pairs.
-
- Each data row contains a name, and value. The row also contains a
- type or mimetype. Type corresponds to a .NET class that support
- text/value conversion through the TypeConverter architecture.
- Classes that don't support this are serialized and stored with the
- mimetype set.
-
- The mimetype is used for serialized objects, and tells the
- ResXResourceReader how to depersist the object. This is currently not
- extensible. For a given mimetype the value must be set accordingly:
-
- Note - application/x-microsoft.net.object.binary.base64 is the format
- that the ResXResourceWriter will generate, however the reader can
- read any of the formats listed below.
-
- mimetype: application/x-microsoft.net.object.binary.base64
- value : The object must be serialized with
- : System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
- : and then encoded with base64 encoding.
-
- mimetype: application/x-microsoft.net.object.soap.base64
- value : The object must be serialized with
- : System.Runtime.Serialization.Formatters.Soap.SoapFormatter
- : and then encoded with base64 encoding.
-
- mimetype: application/x-microsoft.net.object.bytearray.base64
- value : The object must be serialized into a byte array
- : using a System.ComponentModel.TypeConverter
- : and then encoded with base64 encoding.
- -->
- <xsd:schema id="root" xmlns="" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
- <xsd:import namespace="http://www.w3.org/XML/1998/namespace" />
- <xsd:element name="root" msdata:IsDataSet="true">
- <xsd:complexType>
- <xsd:choice maxOccurs="unbounded">
- <xsd:element name="metadata">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" />
- </xsd:sequence>
- <xsd:attribute name="name" use="required" type="xsd:string" />
- <xsd:attribute name="type" type="xsd:string" />
- <xsd:attribute name="mimetype" type="xsd:string" />
- <xsd:attribute ref="xml:space" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="assembly">
- <xsd:complexType>
- <xsd:attribute name="alias" type="xsd:string" />
- <xsd:attribute name="name" type="xsd:string" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="data">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
- <xsd:element name="comment" type="xsd:string" minOccurs="0" msdata:Ordinal="2" />
- </xsd:sequence>
- <xsd:attribute name="name" type="xsd:string" use="required" msdata:Ordinal="1" />
- <xsd:attribute name="type" type="xsd:string" msdata:Ordinal="3" />
- <xsd:attribute name="mimetype" type="xsd:string" msdata:Ordinal="4" />
- <xsd:attribute ref="xml:space" />
- </xsd:complexType>
- </xsd:element>
- <xsd:element name="resheader">
- <xsd:complexType>
- <xsd:sequence>
- <xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
- </xsd:sequence>
- <xsd:attribute name="name" type="xsd:string" use="required" />
- </xsd:complexType>
- </xsd:element>
- </xsd:choice>
- </xsd:complexType>
- </xsd:element>
- </xsd:schema>
- <resheader name="resmimetype">
- <value>text/microsoft-resx</value>
- </resheader>
- <resheader name="version">
- <value>2.0</value>
- </resheader>
- <resheader name="reader">
- <value>System.Resources.ResXResourceReader, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
- </resheader>
- <resheader name="writer">
- <value>System.Resources.ResXResourceWriter, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
- </resheader>
- <data name="Restartv2rayN" xml:space="preserve">
- <value>正在重启,请等待...</value>
- </data>
- <data name="Guidelines" xml:space="preserve">
- <value>请从主应用运行。</value>
- </data>
- <data name="UpgradeFileNotFound" xml:space="preserve">
- <value>升级失败,文件不存在。</value>
- </data>
- <data name="InProgress" xml:space="preserve">
- <value>正在进行中,请等待...</value>
- </data>
- <data name="TryTerminateProcess" xml:space="preserve">
- <value>尝试结束 v2rayN 进程...</value>
- </data>
- <data name="FailedTerminateProcess" xml:space="preserve">
- <value>请手动关闭正在运行的v2rayN,否则可能升级失败。</value>
- </data>
- <data name="StartUnzipping" xml:space="preserve">
- <value>开始解压缩更新包...</value>
- </data>
- <data name="SuccessUnzipping" xml:space="preserve">
- <value>解压缩更新包成功。</value>
- </data>
- <data name="FailedUnzipping" xml:space="preserve">
- <value>解压缩更新包失败。</value>
- </data>
- <data name="FailedUpgrade" xml:space="preserve">
- <value>升级失败。</value>
- </data>
- <data name="SuccessUpgrade" xml:space="preserve">
- <value>升级成功。</value>
- </data>
- <data name="Information" xml:space="preserve">
- <value>提示</value>
- </data>
-</root>
\ No newline at end of file
--- v2rayN/AmazTool/UpgradeApp.cs
@@ -8,17 +8,17 @@ namespace AmazTool
{
public static void Upgrade(string fileName)
{
- Console.WriteLine($"{Resx.Resource.StartUnzipping}\n{fileName}");
+ Console.WriteLine($"{LocalizationHelper.GetLocalizedValue("Start_Unzipping")}\n{fileName}");
Waiting(9);
if (!File.Exists(fileName))
{
- Console.WriteLine(Resx.Resource.UpgradeFileNotFound);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Upgrade_File_Not_Found"));
return;
}
- Console.WriteLine(Resx.Resource.TryTerminateProcess);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Try_Terminate_Process"));
try
{
var existing = Process.GetProcessesByName(V2rayN);
@@ -35,10 +35,10 @@ namespace AmazTool
catch (Exception ex)
{
// Access may be denied without admin right. The user may not be an administrator.
- Console.WriteLine(Resx.Resource.FailedTerminateProcess + ex.StackTrace);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Failed_Terminate_Process") + ex.StackTrace);
}
- Console.WriteLine(Resx.Resource.StartUnzipping);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Start_Unzipping"));
StringBuilder sb = new();
try
{
@@ -81,16 +81,16 @@ namespace AmazTool
}
catch (Exception ex)
{
- Console.WriteLine(Resx.Resource.FailedUpgrade + ex.StackTrace);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Failed_Upgrade") + ex.StackTrace);
//return;
}
if (sb.Length > 0)
{
- Console.WriteLine(Resx.Resource.FailedUpgrade + sb.ToString());
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Failed_Upgrade") + sb.ToString());
//return;
}
- Console.WriteLine(Resx.Resource.Restartv2rayN);
+ Console.WriteLine(LocalizationHelper.GetLocalizedValue("Restart_v2rayN"));
Waiting(9);
Process process = new()
{
|
v2rayn
|
2dust
|
C#
|
C#
| 75,986
| 12,289
|
A GUI client for Windows, Linux and macOS, support Xray and sing-box and others
|
2dust_v2rayn
|
CODE_IMPROVEMENT
|
simplify decoder draining logic
|
95ecd740aeffc8341172a93a5dc4f78ab9a9fd78
|
2024-08-28 13:25:36
|
Edward Hsing
|
Update README.md
| false
| 1
| 0
| 1
|
--- README.md
@@ -1,7 +1,6 @@
# US.KG – A FREE NAME FOR EVERYONE
[Registry Website (https://nic.us.kg/)](https://nic.us.kg/)
#### Your request is typically processed within 15 minutes after submission.
-#### Please DO NOT download any untrusted binary files from "issues" and run them, they are viruses! We will not ask you to download any files
## Domain names no longer cost
Now, regardless of your project, whether you’re an individual or an organization, you can easily register and own your own *.US.KG domain name, 100% completely free. You can host your website with any third-party DNS service you like, such as Cloudflare, FreeDNS by afraid, hostry…
|
freedomain
|
digitalplatdev
|
HTML
|
HTML
| 41,142
| 933
|
DigitalPlat FreeDomain: Free Domain For Everyone
|
digitalplatdev_freedomain
|
DOC_CHANGE
|
changes in readme
|
0f6d0a44eac8accfbef6c8940a83ea0c19c92d21
|
2023-12-30 22:43:27
|
Kingkor Roy Tirtho
|
chore: fix not closing player
| false
| 2
| 3
| 5
|
--- lib/components/player/player.dart
@@ -95,8 +95,9 @@ class PlayerView extends HookConsumerWidget {
final topPadding = MediaQueryData.fromView(View.of(context)).padding.top;
return PopScope(
- canPop: false,
- onPopInvoked: (didPop) async {
+ canPop: panelController.isPanelOpen,
+ onPopInvoked: (canPop) async {
+ if (!canPop) return;
panelController.close();
},
child: IconTheme(
|
spotube
|
krtirtho
|
Dart
|
Dart
| 35,895
| 1,491
|
🎧 Open source Spotify client that doesn't require Premium nor uses Electron! Available for both desktop & mobile!
|
krtirtho_spotube
|
BUG_FIX
|
not closing player is fixed
|
e35ab35930a837561e3604baaaeff6bc26b49863
|
2025-02-27 19:09:28
|
Niklas Mischkulnig
|
Turbopack: prevent panic in swc issue emitter (#76595)
| false
| 46
| 18
| 64
|
--- turbopack/crates/turbopack-core/src/issue/analyze.rs
@@ -23,7 +23,7 @@ pub struct AnalyzeIssue {
impl AnalyzeIssue {
#[turbo_tasks::function]
pub fn new(
- severity: IssueSeverity,
+ severity: ResolvedVc<IssueSeverity>,
source_ident: ResolvedVc<AssetIdent>,
title: ResolvedVc<RcStr>,
message: ResolvedVc<StyledString>,
@@ -31,7 +31,7 @@ impl AnalyzeIssue {
source: Option<IssueSource>,
) -> Vc<Self> {
Self {
- severity: severity.resolved_cell(),
+ severity,
source_ident,
title,
message,
--- turbopack/crates/turbopack-core/src/issue/mod.rs
@@ -30,7 +30,7 @@ use crate::{
};
#[turbo_tasks::value(shared)]
-#[derive(PartialOrd, Ord, Copy, Clone, Hash, Debug, DeterministicHash, TaskInput)]
+#[derive(PartialOrd, Ord, Copy, Clone, Hash, Debug, DeterministicHash)]
#[serde(rename_all = "camelCase")]
pub enum IssueSeverity {
Bug,
--- turbopack/crates/turbopack-ecmascript/src/references/esm/export.rs
@@ -413,7 +413,7 @@ pub async fn expand_star_exports(
async fn emit_star_exports_issue(source_ident: Vc<AssetIdent>, message: RcStr) -> Result<()> {
AnalyzeIssue::new(
- IssueSeverity::Warning,
+ IssueSeverity::Warning.cell(),
source_ident,
Vc::cell("unexpected export *".into()),
StyledString::Text(message).cell(),
--- turbopack/crates/turbopack-ecmascript/src/references/mod.rs
@@ -939,7 +939,7 @@ pub(crate) async fn analyse_ecmascript_module_internal(
analysis.set_async_module(async_module);
} else if let Some(span) = top_level_await_span {
AnalyzeIssue::new(
- IssueSeverity::Error,
+ IssueSeverity::Error.cell(),
source.ident(),
Vc::cell("unexpected top level await".into()),
StyledString::Text("top level await is only supported in ESM modules.".into())
--- turbopack/crates/turbopack-swc-utils/src/emitter.rs
@@ -27,46 +27,19 @@ impl IssueCollector {
};
for issue in issues {
- AnalyzeIssue::new(
- issue.severity,
- issue.source.ident(),
- Vc::cell(issue.title),
- issue.message.cell(),
- issue.code,
- issue.issue_source,
- )
- .to_resolved()
- .await?
- .emit();
+ issue.to_resolved().await?.emit();
}
Ok(())
}
pub fn last_emitted_issue(&self) -> Option<Vc<AnalyzeIssue>> {
let inner = self.inner.lock();
- inner.emitted_issues.last().map(|issue| {
- AnalyzeIssue::new(
- issue.severity,
- issue.source.ident(),
- Vc::cell(issue.title.clone()),
- issue.message.clone().cell(),
- issue.code.clone(),
- issue.issue_source.clone(),
- )
- })
+ inner.emitted_issues.last().copied()
}
}
struct IssueCollectorInner {
- emitted_issues: Vec<PlainAnalyzeIssue>,
-}
-struct PlainAnalyzeIssue {
- severity: IssueSeverity,
- source: ResolvedVc<Box<dyn Source>>,
- title: RcStr,
- message: StyledString,
- code: Option<RcStr>,
- issue_source: Option<IssueSource>,
+ emitted_issues: Vec<Vc<AnalyzeIssue>>,
}
pub struct IssueEmitter {
@@ -115,7 +88,7 @@ impl Emitter for IssueEmitter {
.as_ref()
.is_some_and(|d| matches!(d, DiagnosticId::Lint(_)));
- let severity = if is_lint {
+ let severity = (if is_lint {
IssueSeverity::Suggestion
} else {
match level {
@@ -128,7 +101,8 @@ impl Emitter for IssueEmitter {
Level::Cancelled => IssueSeverity::Error,
Level::FailureNote => IssueSeverity::Note,
}
- };
+ })
+ .resolved_cell();
let title;
if let Some(t) = self.title.as_ref() {
@@ -144,16 +118,14 @@ impl Emitter for IssueEmitter {
});
// TODO add other primary and secondary spans with labels as sub_issues
- // This can be invoked by swc on different threads, so we cannot call any turbo-tasks or
- // create cells here.
- let issue = PlainAnalyzeIssue {
- severity,
- source: self.source,
- title,
- message: StyledString::Text(message.into()),
+ let issue = AnalyzeIssue::new(
+ *severity,
+ self.source.ident(),
+ Vc::cell(title),
+ StyledString::Text(message.into()).cell(),
code,
- issue_source: source,
- };
+ source,
+ );
let mut inner = self.inner.lock();
inner.emitted_issues.push(issue);
|
next.js
|
vercel
|
JavaScript
|
JavaScript
| 129,891
| 27,821
|
The React Framework
|
vercel_next.js
|
CONFIG_CHANGE
|
Obvious
|
991609379c1651e8ad19f9bf9d89dc7895d86627
|
2023-05-03 10:23:12
|
Guide
|
[docs update]简单完善
| false
| 83
| 68
| 151
|
--- README.md
@@ -33,6 +33,7 @@
- [项目介绍](./docs/javaguide/intro.md)
- [贡献指南](./docs/javaguide/contribution-guideline.md)
- [常见问题](./docs/javaguide/faq.md)
+- [项目待办](./docs/javaguide/todo.md)
## Java
--- docs/.vuepress/navbar.ts
@@ -4,7 +4,7 @@ export default navbar([
{ text: "面试指南", icon: "java", link: "/home.md" },
{
text: "知识星球",
- icon: "planet",
+ icon: "code",
link: "/about-the-author/zhishixingqiu-two-years.md",
},
{ text: "开源项目", icon: "github", link: "/open-source-project/" },
@@ -23,7 +23,12 @@ export default navbar([
text: "更新历史",
icon: "history",
link: "/timeline/",
- }
+ },
+ {
+ text: "旧版入口(不推荐)",
+ icon: "java",
+ link: "https://snailclimb.gitee.io/javaguide/#/",
+ },
],
},
]);
--- docs/.vuepress/sidebar/about-the-author.ts
@@ -3,7 +3,7 @@ import { arraySidebar } from "vuepress-theme-hope";
export const aboutTheAuthor = arraySidebar([
{
text: "个人经历",
- icon: "experience",
+ icon: "zuozhe",
collapsible: false,
children: [
"internet-addiction-teenager",
--- docs/.vuepress/sidebar/index.ts
@@ -11,6 +11,7 @@ export default sidebar({
"/books/": books,
"/about-the-author/": aboutTheAuthor,
"/high-quality-technical-articles/": highQualityTechnicalArticles,
+ "/javaguide/": ["intro", "history", "contribution-guideline", "faq", "todo"],
"/zhuanlan/": [
"java-mian-shi-zhi-bei",
"handwritten-rpc-framework",
@@ -18,17 +19,6 @@ export default sidebar({
],
// 必须放在最后面
"/": [
- {
- text: "必看",
- icon: "star",
- collapsible: true,
- prefix: "javaguide/",
- children: [
- "intro",
- "contribution-guideline",
- "faq",
- ],
- },
{
text: "面试准备",
icon: "interview",
@@ -374,7 +364,7 @@ export default sidebar({
{
text: "常用框架",
prefix: "system-design/framework/",
- icon: "component",
+ icon: "framework",
collapsible: true,
children: [
{
@@ -403,7 +393,7 @@ export default sidebar({
},
{
text: "系统设计",
- icon: "design",
+ icon: "xitongsheji",
prefix: "system-design/",
collapsible: true,
children: [
@@ -453,7 +443,6 @@ export default sidebar({
text: "理论&算法&协议",
icon: "suanfaku",
prefix: "protocol/",
- collapsible: true,
children: [
"cap-and-base-theorem",
"paxos-algorithm",
@@ -461,8 +450,13 @@ export default sidebar({
"gossip-protocl",
],
},
+ "api-gateway",
+ "distributed-id",
+ "distributed-lock",
+ "distributed-transaction",
+ "distributed-configuration-center",
{
- text: "RPC详解",
+ text: "RPC(远程调用)详解",
prefix: "rpc/",
icon: "network",
collapsible: true,
@@ -479,11 +473,6 @@ export default sidebar({
"zookeeper/zookeeper-in-action",
],
},
- "api-gateway",
- "distributed-id",
- "distributed-lock",
- "distributed-transaction",
- "distributed-configuration-center",
],
},
{
--- docs/.vuepress/sidebar/open-source-project.ts
@@ -14,7 +14,7 @@ export const openSourceProject = arraySidebar([
{
text: "系统设计",
link: "system-design",
- icon: "design",
+ icon: "xitongsheji",
},
{
text: "工具类库",
--- docs/.vuepress/theme.ts
@@ -10,7 +10,7 @@ export default hopeTheme({
logo: "/logo.png",
hostname: "https://javaguide.cn/",
- iconAssets: "//at.alicdn.com/t/c/font_2922463_kweia6fbo9.css",
+ iconAssets: "//at.alicdn.com/t/c/font_2922463_9aayheyb3v7.css",
author: {
name: "Guide",
--- docs/about-the-author/zhishixingqiu-two-years.md
@@ -127,7 +127,7 @@ star: 2

-**方式二(推荐)** :添加我的个人微信(**javaguide1024**)领取一个 **30** 元的星球专属优惠券(一定要备注“优惠卷”)。
+**方式二(推荐)** :添加我的个人微信(**guidege666**)领取一个 **30** 元的星球专属优惠券(一定要备注“优惠卷”)。
**一定要备注“优惠卷”**,不然通过不了。
--- docs/cs-basics/network/other-network-questions2.md
@@ -137,8 +137,6 @@ NAT 不光可以缓解 IPv4 地址资源短缺的问题,还可以隐藏内部

-相关阅读:[NAT 协议详解(网络层)](./nat.md)。
-
## ARP
### 什么是 Mac 地址?
--- docs/high-performance/cdn.md
@@ -1,7 +1,7 @@
---
title: CDN(内容分发网络)详解
category: 高性能
-icon: cdn
+icon: "cdn"
head:
- - meta
- name: keywords
--- docs/high-performance/load-balancing.md
@@ -1,7 +1,7 @@
---
title: 负载均衡详解
category: 高性能
-icon: fuzaijunheng
+icon: "fuzaijunheng"
head:
- - meta
- name: keywords
--- docs/high-performance/read-and-write-separation-and-library-subtable.md
@@ -1,7 +1,7 @@
---
title: 读写分离和分库分表详解
category: 高性能
-icon: mysql
+icon: "mysql"
head:
- - meta
- name: keywords
--- docs/high-performance/sql-optimization.md
@@ -1,7 +1,7 @@
---
-title: 常见SQL优化手段总结(付费)
+title: 常见 SQL 优化手段总结(付费)
category: 高性能
-icon: SQL
+icon: "mysql"
head:
- - meta
- name: keywords
--- docs/home.md
@@ -6,7 +6,7 @@ title: JavaGuide(Java学习&&面试指南)
::: tip 友情提示
- **面试专版** :准备 Java 面试的小伙伴可以考虑面试专版:**[《Java 面试指北 》](./zhuanlan/java-mian-shi-zhi-bei.md)** (质量很高,专为面试打造,配合 JavaGuide 食用)。
-- **知识星球** :专属面试小册/一对一交流/简历修改/专属求职指南,欢迎加入 **[JavaGuide 知识星球](./about-the-author/zhishixingqiu-two-years.md)**(点击链接即可查看星球的详细介绍,一定确定自己真的需要再加入)。
+- **知识星球** :专属面试小册/一对一交流/简历修改/专属求职指南,欢迎加入 **[JavaGuide 知识星球](./about-the-author/zhishixingqiu-two-years.md)**(点击链接即可查看星球的详细介绍,一定一定一定确定自己真的需要再加入,一定一定要看完详细介绍之后再加我)。
- **转载须知** :以下所有文章如非文首说明为转载皆为 JavaGuide 原创,转载在文首注明出处,如发现恶意抄袭/搬运,会动用法律武器维护自己的权益。让我们一起维护一个良好的技术创作环境!
:::
@@ -26,6 +26,13 @@ title: JavaGuide(Java学习&&面试指南)
[](https://javaguide.cn/about-the-author/zhishixingqiu-two-years.html)
+## 项目相关
+
+- [项目介绍](./javaguide/intro.md)
+- [贡献指南](./javaguide/contribution-guideline.md)
+- [常见问题](./javaguide/faq.md)
+- [项目代办](./javaguide/todo.md)
+
## Java
### 基础
@@ -140,7 +147,7 @@ JVM 这部分内容主要参考 [JVM 虚拟机规范-Java8](https://docs.oracle.
- [TCP 三次握手和四次挥手(传输层)](./cs-basics/network/tcp-connection-and-disconnection.md)
- [TCP 传输可靠性保障(传输层)](./cs-basics/network/tcp-reliability-guarantee.md)
- [ARP 协议详解(网络层)](./cs-basics/network/arp.md)
-- [NAT 协议详解(网络层)](./cs-basics/network/nat.md)
+- [NAT 协议详解(网络层)](./docs/cs-basics/network/nat.md)
- [网络攻击常见手段总结(安全)](./cs-basics/network/network-attack-means.md)
### 数据结构
--- docs/interview-preparation/interview-experience.md
@@ -1,7 +1,6 @@
---
title: 优质面经汇总(付费)
category: 知识星球
-icon: experience
---
古人云:“**他山之石,可以攻玉**” 。善于学习借鉴别人的面试的成功经验或者失败的教训,可以让自己少走许多弯路。
--- docs/interview-preparation/key-points-of-interview.md
@@ -1,7 +1,6 @@
---
-title: Java面试重点总结(重要)
+title: Java面试重点总结
category: 面试准备
-icon: star
---
::: tip 友情提示
--- docs/interview-preparation/project-experience-guide.md
@@ -1,7 +1,6 @@
---
title: 项目经验指南
category: 面试准备
-icon: project
---
::: tip 友情提示
--- docs/interview-preparation/resume-guide.md
@@ -1,7 +1,6 @@
---
-title: 程序员简历编写指南(重要)
+title: 程序员简历编写指南
category: 面试准备
-icon: jianli
---
::: tip 友情提示
--- docs/interview-preparation/self-test-of-common-interview-questions.md
@@ -1,7 +1,6 @@
---
title: 常见面试题自测(付费)
category: 知识星球
-icon: security-fill
---
面试之前,强烈建议大家多拿常见的面试题来进行自测,检查一下自己的掌握情况,这是一种非常实用的备战技术面试的小技巧。
--- docs/interview-preparation/teach-you-how-to-prepare-for-the-interview-hand-in-hand.md
@@ -1,7 +1,6 @@
---
-title: 手把手教你如何准备Java面试(重要)
+title: 手把手教你如何准备Java面试
category: 知识星球
-icon: path
---
::: tip 友情提示
--- docs/javaguide/contribution-guideline.md
@@ -1,7 +1,6 @@
---
title: 贡献指南
category: 走近项目
-icon: guide
---
欢迎参与 JavaGuide 的维护工作,这是一件非常有意义的事情。详细信息请看:[JavaGuide 贡献指南](https://zhuanlan.zhihu.com/p/464832264) 。
--- docs/javaguide/faq.md
@@ -1,13 +1,8 @@
---
title: 常见问题
category: 走近项目
-icon: help
---
-## JavaGuide 是否支持 RSS?
-
-必须支持!推荐 RSS 订阅本网站获取最新更新。
-
## JavaGuide 有没有 PDF 版本?
由于 JavaGuide 内容在持续完善,所以并没有一个完全与之同步的 PDF 版本提供。如果你想要 PDF 版本的话,可以考虑 **《JavaGuide 面试突击版》** ,这是对 JavaGuide 内容的浓缩总结。
@@ -39,7 +34,7 @@ JavaGuide 这个项目诞生一年左右就有出版社的老师联系我了,
- JavaGuide 的很多内容我还不是很满意,也一直在维护中,细心的小伙伴看我的提交记录就明白了。
- 开源版本更容易维护和修改,也能让更多人更方便地参与到项目的建设中,这也是我最初做这个项目的初衷。
- 我觉得出书是一件神圣的事情,自认能力还不够。
-- 个人精力有限,不光有本职工作,还弄了一个[知识星球](https://javaguide.cn/about-the-author/zhishixingqiu-two-years.html)赚点外快,还要维护完善 JavaGuide。
+- 个人精力有限,不光有本职工作,还弄了一个[知识星球](http://localhost:8080/about-the-author/zhishixingqiu-two-years.html)赚点外快,还要维护完善 JavaGuide。
- ......
这几年一直在默默完善,真心希望 JavaGuide 越来越好,帮助到更多朋友!也欢迎大家参与进来!
--- docs/javaguide/intro.md
@@ -1,10 +1,9 @@
---
title: 项目介绍
category: 走近项目
-icon: about
---
-我是 19 年大学毕业的,在大三准备面试的时候,我开源了 JavaGuide 。我把自己准备面试过程中的一些总结都毫不保留地通过 JavaGuide 分享了出来。
+在大三准备面试的时候,我开源了 JavaGuide 。我把自己准备面试过程中的一些总结都毫不保留地通过 JavaGuide 分享了出来。
开源 JavaGuide 初始想法源于自己的个人那一段比较迷茫的学习经历。主要目的是为了通过这个开源平台来帮助一些在学习 Java 或者面试过程中遇到问题的小伙伴。
@@ -13,23 +12,7 @@ icon: about
相比于其他通过 JavaGuide 学到东西或者说助力获得 offer 的朋友来说 , JavaGuide 对我的意义更加重大。不夸张的说,有时候真的感觉像是自己的孩子一点一点长大一样,我一直用心呵护着它。虽然,我花了很长时间来维护它,但是,我觉得非常值得!非常有意义!
-希望大家对面试不要抱有侥幸的心理,打铁还需自身硬! 我希望这个文档是为你学习 Java 指明方向,而不仅仅是用来应付面试用的。
-
-另外,JavaGuide 不可能把面试中的所有内容都给涵盖住,尤其是阿里、美团这种挖的比较深入的面试。你可以根据你的目标公司进行针对性的深入学习,多看一些目标公司的面经进行查漏补缺,没事就自测一下,多多思考总结。
-
-加油!奥利给!
-
-## 学习建议
-
-JavaGuide 整体目录规划已经非常清晰了,你可以从头开始学习,也可以根据自身情况来选择性地学习。
-
-## 知识星球
-
-对于准备面试的同学来说,强烈推荐我创建的一个纯粹的[Java 面试知识星球](../about-the-author/zhishixingqiu-two-years.md),干货非常多,学习氛围也很不错!
-
-下面是星球提供的部分服务(点击下方图片即可获取知识星球的详细介绍):
-
-[](../about-the-author/zhishixingqiu-two-years.md)
+希望大家对面试不要抱有侥幸的心理,打铁还需自身硬! 我希望这个文档是为你学习 Java 指明方向,而不是用来应付面试用的。加油!奥利给!
## 项目说明
@@ -40,3 +23,5 @@ JavaGuide 整体目录规划已经非常清晰了,你可以从头开始学习
## 贡献者
[你可以点此链接查看 JavaGuide 的所有贡献者。](https://github.com/Snailclimb/JavaGuide/graphs/contributors) 感谢你们让 JavaGuide 变得更好!如果你们来到武汉一定要找我,我请你们吃饭玩耍。
+
+欢迎参与 [JavaGuide 的维护工作](https://zhuanlan.zhihu.com/p/464832264),这是一件非常有意义的事情。
--- docs/javaguide/todo.md
@@ -0,0 +1,13 @@
+---
+title: 项目待办
+category: 走近项目
+---
+
+- [x] 数据结构内容完善
+- [x] Java 基础内容完善
+- [x] 大篇幅文章拆分
+- [ ] JVM 内容更新完善
+- [ ] 计算机网络知识点完善
+- [ ] 分布式常见理论和算法总结完善
+
+欢迎参与 JavaGuide 的维护工作,这是一件非常有意义的事情。详细信息请看:[JavaGuide 贡献指南](./contribution-guideline.md) 。
--- docs/open-source-project/readme.md
@@ -16,7 +16,7 @@ category: 开源项目
另外,我的公众号还会定期分享优质开源项目,每月一期,每一期我都会精选 5 个高质量的 Java 开源项目。
-目前已经更新到了第 19 期:
+目前已经更新到了第 16 期:
1. [一款基于 Spring Boot + Vue 的一站式开源持续测试平台](http://mp.weixin.qq.com/s?__biz=Mzg2OTA0Njk0OA==&mid=2247515383&idx=1&sn=ba7244020c05d966b483d8c302d54e85&chksm=cea1f33cf9d67a2a111bcf6cadc3cc1c44828ba2302cd3e13bbd88349e43d4254808e6434133&scene=21#wechat_redirect)。
2. [用 Java 写个沙盒塔防游戏!已上架 Steam,Apple Store](https://mp.weixin.qq.com/s?__biz=Mzg2OTA0Njk0OA==&mid=2247515981&idx=1&sn=e4b9c06af65f739bdcdf76bdc35d59f6&chksm=cea1f086f9d679908bd6604b1c42d67580160d9789951f3707ad2f5de4d97aa72121d8fe777e&token=435278690&lang=zh_CN&scene=21#wechat_redirect)
@@ -35,8 +35,6 @@ category: 开源项目
15. [31.2k!这是我见过最强的后台管理系统 !!](https://mp.weixin.qq.com/s/esaivn2z_66CcrRJlDYLEA)
16. [14.3k star,这是我见过最强的第三方登录工具库!!](https://mp.weixin.qq.com/s/6-TnCHUMEIFWQVl-pIWBOA)
17. [3.2k!这是我见过最强的消息推送平台!!](https://mp.weixin.qq.com/s/heag76H4UwZmr8oBY_2gcw)
-18. [好家伙,又一本技术书籍开源了!!](https://mp.weixin.qq.com/s/w-JuBlcqCeAZR0xUFWzvHQ)
-19. [开箱即用的 ChatGPT Java SDK!支持 GPT3.5、 GPT4 API](https://mp.weixin.qq.com/s/WhI2K1VF0h_57TEVGCwuCA)
推荐你在我的公众号“**JavaGuide**”回复“**开源**”在线阅读[「优质开源项目推荐」](https://mp.weixin.qq.com/mp/appmsgalbum?__biz=Mzg2OTA0Njk0OA==&action=getalbum&album_id=1345382825083895808&scene=173&from_msgid=2247516459&from_itemidx=1&count=3&nolastread=1#wechat_redirect)系列。
--- docs/readme.md
@@ -21,6 +21,7 @@ footer: |-
## 关于网站
- [项目介绍](./javaguide/intro.md)
+- [网站历史](./javaguide/history.md)
- [贡献指南](./javaguide/contribution-guideline.md)
- [常见问题](./javaguide/faq.md)
@@ -31,13 +32,13 @@ footer: |-
- [我的知识星球快 3 岁了!](./about-the-author/zhishixingqiu-two-years.md)
- [坚持写技术博客六年了](./about-the-author/writing-technology-blog-six-years.md)
-## 知识星球
+## PDF
-对于准备面试的同学来说,强烈推荐我创建的一个纯粹的[Java 面试知识星球](../about-the-author/zhishixingqiu-two-years.md),干货非常多,学习氛围也很不错!
-
-下面是星球提供的部分服务(点击下方图片即可获取知识星球的详细介绍):
-
-[](../about-the-author/zhishixingqiu-two-years.md)
+- [《JavaGuide 面试突击版》](https://mp.weixin.qq.com/s?__biz=Mzg2OTA0Njk0OA==&mid=100029614&idx=1&sn=62993c5cf10265cb7018db7f1ec67250&chksm=4ea1fb6579d67273499b7243641d4ef372decd08047bfbb6dfb5843ef81c7ccba209086cf345#rd)
+- [《消息队列常见知识点&面试题总结》](https://t.1yb.co/Fy0u)
+- [《Java 工程师进阶知识完全扫盲》](https://t.1yb.co/GXLF)
+- [《分布式相关面试题汇总》](https://t.1yb.co/GXLF)
+- [《图解计算机基础》](https://mp.weixin.qq.com/s?__biz=Mzg2OTA0Njk0OA==&mid=100021725&idx=1&sn=2db9664ca25363139a81691043e9fd8f&chksm=4ea19a1679d61300d8990f7e43bfc7f476577a81b712cf0f9c6f6552a8b219bc081efddb5c54#rd)
## 公众号
--- docs/snippets/planet.snippet.md
@@ -1,10 +1,10 @@
[《Java 面试指北》](https://javaguide.cn/zhuanlan/java-mian-shi-zhi-bei.html)(点击链接即可查看详细介绍)的部分内容展示如下,你可以将其看作是 [JavaGuide](https://javaguide.cn/#/) 的补充完善,两者可以配合使用。
-
+
## 星球介绍
-为了帮助更多同学准备 Java 面试以及学习 Java ,我创建了一个纯粹的[ Java 面试知识星球](https://javaguide.cn/about-the-author/zhishixingqiu-two-years.html)。虽然收费只有培训班/训练营的百分之一,但是知识星球里的内容质量更高,提供的服务也更全面,非常适合准备 Java 面试和学习 Java 的同学。
+为了帮助更多同学准备 Java 面试以及学习 Java ,我创建了一个纯粹的[知识星球](https://javaguide.cn/about-the-author/zhishixingqiu-two-years.html)。虽然收费只有培训班/训练营的百分之一,但是知识星球里的内容质量更高,提供的服务也更全面。
**欢迎准备 Java 面试以及学习 Java 的同学加入我的 [知识星球](https://javaguide.cn/about-the-author/zhishixingqiu-two-years.html),干货非常多,学习氛围也很不错!收费虽然是白菜价,但星球里的内容或许比你参加上万的培训班质量还要高。**
@@ -22,7 +22,7 @@

-**方式二(推荐)** :添加我的个人微信(**javaguide1024**)领取一个 **30** 元的星球专属优惠券(一定要备注“优惠卷”)。
+**方式二(推荐)** :添加我的个人微信(**guidege666**)领取一个 **30** 元的星球专属优惠券(一定要备注“优惠卷”)。
**一定要备注“优惠卷”**,不然通过不了。
@@ -30,7 +30,7 @@
**无任何套路,无任何潜在收费项。用心做内容,不割韭菜!**
-进入星球之后,记得查看 **[星球使用指南](https://t.zsxq.com/0d18KSarv)** (一定要看!) 。
+进入星球之后,记得查看[星球使用指南](https://t.zsxq.com/0d18KSarv)(一定要看!) 。
随着时间推移,星球积累的干货资源越来越多,我花在星球上的时间也越来越多,星球的价格会逐步向上调整,想要加入的同学一定要尽早。
|
javaguide
|
snailclimb
|
Java
|
Java
| 148,495
| 45,728
|
「Java学习+面试指南」一份涵盖大部分 Java 程序员所需要掌握的核心知识。准备 Java 面试,首选 JavaGuide!
|
snailclimb_javaguide
|
DOC_CHANGE
|
Obvious
|
c3c2ad1454a5df39b10596692934081797da206b
| null |
Martin Probst
|
fix: improve type of TreeNode.children.
| false
| 1
| 1
| 0
|
--- element_injector.ts
@@ -150,7 +150,7 @@ export class TreeNode<T extends TreeNode<any>> {
get parent() { return this._parent; }
// TODO(rado): replace with a function call, does too much work for a getter.
- get children(): TreeNode<any>[] {
+ get children(): T[] {
var res = [];
var child = this._head;
while (child != null) {
|
angular_angular.json
| null | null | null | null | null | null |
angular_angular.json
|
BUG_FIX
|
5, fix is described in the commit message
|
1c50612559a78dce9c108f7e7b816d1b84540fe4
|
2023-09-10 21:09:16
|
Kingkor Roy Tirtho
|
fix: limit cover image upload to allowed 256kb size
| false
| 78
| 30
| 108
|
--- lib/components/playlist/playlist_create_dialog.dart
@@ -1,5 +1,4 @@
import 'dart:convert';
-import 'dart:io';
import 'package:collection/collection.dart';
import 'package:flutter/material.dart';
@@ -155,85 +154,39 @@ class PlaylistCreateDialog extends HookConsumerWidget {
child: ListView(
shrinkWrap: true,
children: [
- FormField<XFile?>(
- initialValue: image.value,
- onSaved: (newValue) {
- image.value = newValue;
- },
- validator: (value) {
- if (value == null) return null;
- final file = File(value.path);
-
- if (file.lengthSync() > 256000) {
- return "Image size should be less than 256kb";
- }
- return null;
- },
- builder: (field) {
- return Center(
- child: Stack(
- children: [
- UniversalImage(
- path: field.value?.path ??
- TypeConversionUtils.image_X_UrlString(
- updatingPlaylist?.images,
- placeholder:
- ImagePlaceholder.collection,
- ),
- height: 200,
+ Center(
+ child: Stack(
+ children: [
+ UniversalImage(
+ path: image.value?.path ??
+ TypeConversionUtils.image_X_UrlString(
+ updatingPlaylist?.images,
+ placeholder: ImagePlaceholder.collection,
),
- Positioned(
- bottom: 20,
- right: 20,
- child: IconButton.filled(
- icon: const Icon(SpotubeIcons.edit),
- style: IconButton.styleFrom(
- backgroundColor:
- theme.colorScheme.surface,
- foregroundColor:
- theme.colorScheme.primary,
- elevation: 2,
- shadowColor: theme.colorScheme.onSurface,
- ),
- onPressed: () async {
- final imageFile = await ImagePicker()
- .pickImage(
- source: ImageSource.gallery);
+ height: 200,
+ ),
+ Positioned(
+ bottom: 20,
+ right: 20,
+ child: IconButton.filled(
+ icon: const Icon(SpotubeIcons.edit),
+ style: IconButton.styleFrom(
+ backgroundColor: theme.colorScheme.surface,
+ foregroundColor: theme.colorScheme.primary,
+ elevation: 2,
+ shadowColor: theme.colorScheme.onSurface,
+ ),
+ onPressed: () async {
+ final imageFile = await ImagePicker()
+ .pickImage(source: ImageSource.gallery);
- if (imageFile != null) {
- field.didChange(imageFile);
- field.validate();
- field.save();
- }
- },
- ),
- ),
- if (field.hasError)
- Positioned(
- bottom: 20,
- left: 20,
- child: Container(
- padding: const EdgeInsets.symmetric(
- horizontal: 8,
- vertical: 4,
- ),
- decoration: BoxDecoration(
- color: theme.colorScheme.error,
- borderRadius: BorderRadius.circular(4),
- ),
- child: Text(
- field.errorText ?? "",
- style: theme.textTheme.bodyMedium!
- .copyWith(
- color: theme.colorScheme.onError,
- ),
- ),
- ),
- ),
- ],
+ image.value = imageFile ?? image.value;
+ },
),
- );
- }),
+ ),
+ ],
+ ),
+ ),
const SizedBox(height: 10),
TextFormField(
controller: playlistName,
@@ -250,7 +203,6 @@ class PlaylistCreateDialog extends HookConsumerWidget {
hintText: context.l10n.description,
),
keyboardType: TextInputType.multiline,
- validator: ValidationBuilder().required().build(),
maxLines: 5,
),
const SizedBox(height: 10),
|
spotube
|
krtirtho
|
Dart
|
Dart
| 35,895
| 1,491
|
🎧 Open source Spotify client that doesn't require Premium nor uses Electron! Available for both desktop & mobile!
|
krtirtho_spotube
|
BUG_FIX
|
This commit fixes and polishes an earlier feature
|
b676bf3d4ccce96ce33a4b54b6236e0ae9b4526f
| null |
Tom Robinson
|
Remove unused babel-loader from babel-preset-react-app (#6780)
| false
| 0
| 1
| -1
|
--- package.json
@@ -32,7 +32,6 @@
"@babel/preset-react": "7.0.0",
"@babel/preset-typescript": "7.3.3",
"@babel/runtime": "7.4.3",
- "babel-loader": "8.0.5",
"babel-plugin-dynamic-import-node": "2.2.0",
"babel-plugin-macros": "2.5.1",
"babel-plugin-transform-react-remove-prop-types": "0.4.24"
|
facebook_create-react-app.json
| null | null | null | null | null | null |
facebook_create-react-app.json
|
CODE_IMPROVEMENT
|
4, removed redundant code
|
d9280bdfa9c67dcaf1909e6408ab43d209f2f800
|
2023-08-31 20:10:30
|
李国冬
|
fix-2023-08-31T22:40+08:00
| false
| 190
| 14
| 204
|
--- README.md
@@ -64,7 +64,7 @@
### LLM微调实战
-下面给大家分享**大模型参数高效微调技术实战**,该系列共6篇文章。
+下面给大家分享**大模型参数高效微调技术实战**系列文章,该系列共6篇文章。
| 教程 | 代码 | 框架 |
| ----------------------------- | ----------------------------- | ----------------------------- |
--- docs/llm-base/ai-algo/README.md
@@ -26,12 +26,10 @@
- 通常 seq_length 与 max_position_embeddings 相等。
- key_value头数:This is the number of key_value heads that should be used to implement Grouped Query Attention. If
`num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA), if
- `num_key_value_heads=1` the model will use Multi Query Attention (MQA) otherwise GQA is used. When
+ `num_key_value_heads=1 the model will use Multi Query Attention (MQA) otherwise GQA is used. When
converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be constructed
by meanpooling all the original heads within that group.
-
-
## LLaMA
| 模型 | LLaMA-7B | LLaMA-2-7B | LLaMA-13B | LLaMA-2-13B | LLaMA-30B | LLaMA-65B | LLaMA-2-70B |
@@ -39,7 +37,7 @@
| 词表大小(vocab_size) | 32000 | 32000 | 32000 | 32000 | 32000 | 32000 | 32000 |
| Transformer层(n_layer, num_layers, num_hidden_layers) | 32 | 32 | 40 | 40 | 60 | 80 | 80 |
| 注意力头数(num_attention_heads, n_head) | 32 | 32 | 40 | 40 | 52 | 64 | 64 |
-| key_value头数(num_key_value_heads) | N/A | 32 | N/A | 40 | N/A | N/A | 8 |
+| key_value头数(num_key_value_heads) | N/A | N/A | N/A | 40 | N/A | N/A | 8 |
| 隐藏层大小(hidden_size) | 4096 | 4096 | 5120 | 5120 | 6656 | 8192 | 8192 |
| 前馈神经网络的隐藏层大小(ffn_hidden_size, intermediate_size,n_inner) | 11008 | 11008 | 13824 | 13824 | 17920 | 22016 | 28672 |
| seq_length, n_ctx | 2048(max_position_embeddings) | 2048(max_position_embeddings) | 2048 | N/A | 2048 | | N/A |
--- docs/llm-base/ai-algo/transformer.md
@@ -12,7 +12,7 @@
-- 哈佛annotated-transformer:https://github.com/harvardnlp/annotated-transformer/blob/master/AnnotatedTransformer.ipynb
+
@@ -41,14 +41,6 @@ Transformer 中除了单词的Embedding,还需要使用位置Embedding 表示
位置Embedding用 PE 表示, PE 的维度与单词Embedding相同。 PE 可以通过训练得到,也可以使用某种公式计算得到。在Transformer中采用了后者。
-- Overview: The Implemented Transformer: https://medium.com/@hunter-j-phillips/overview-the-implemented-transformer-eafd87fe9589
-- Multi-Head Attention: https://medium.com/@hunter-j-phillips/multi-head-attention-7924371d477a
-- Layer Normalization: https://medium.com/@hunter-j-phillips/layer-normalization-e9ae93eb3c9c
-- Positional Encoding: https://medium.com/@hunter-j-phillips/positional-encoding-7a93db4109e6
-
-
-
-
--- docs/llm-base/ai-algo/transformer/multi-head-attention.webp
Binary files a/docs/llm-base/ai-algo/transformer/multi-head-attention.webp and /dev/null differ
--- docs/llm-base/ai-algo/transformer/模型架构.md
@@ -3,6 +3,9 @@
+
+
+
## 绝对位置编码
--- docs/llm-base/ai-hardware/README.md
@@ -20,7 +20,3 @@
-
-
-- NVIDIA GPUDirect: https://developer.nvidia.com/gpudirect
-
--- docs/llm-base/distribution-parallelism/multidimensional-hybrid-parallel/BloombergGPT模型超参数.png
Binary files "a/docs/llm-base/distribution-parallelism/multidimensional-hybrid-parallel/BloombergGPT\346\250\241\345\236\213\350\266\205\345\217\202\346\225\260.png" and /dev/null differ
--- docs/llm-base/distribution-parallelism/multidimensional-hybrid-parallel/README.md
@@ -1,5 +1,6 @@
- https://huggingface.co/docs/transformers/perf_train_gpu_many
-- https://huggingface.co/transformers/v4.12.5/parallelism.html
+
+
@@ -9,18 +10,12 @@
-| 模型 | DP | TP | PP | ZeRO Stage | FSDP(ZeRO Stage 3) | GPUs |
+| | DP | TP | PP | ZeRO Stage | FSDP(ZeRO Stage 3) | GPUs |
| ------------ | --- | --- | --- | ---------- | ------------------ | ----------------------- |
| Bloom-176B | 8 | 4 | 12 | ZeRO-1 | - | 384 张 A100 80GB |
| CodeGeeX-13B | 192 | 8 | - | ZeRO-2 | - | 1,536 张 Ascend 910 32GB |
| GLM-130B | 24 | 4 | 8 | ZeRO-1 | - | 768 张 A100 40G |
-| OPT-175B | - | 8 | - | - | ✅ | 992 张 80GB A100 |
-| Megatron-Turing NLG(530B) | 16 | 8 | 35 | - | - | 4480 张 A100 80G |
-| GPT-NeoX-20B | 12 | 2 | 4 |ZeRO-1 | - | 96 张 A100 40G |
-
-
-
-
+| OPT-175B | - | 8 | - | - | | 992 张 80GB A100 |
@@ -91,7 +86,6 @@ ZERO_STAGE=0 # important: bf16 must use z0! it implements its own zero stage 1 e
## CodeGeeX-13B
-为了提高训练效率,我们采用8路模型并行训练和192路数据并行训练,启用 ZeRO-2 进一步减少优化器状态的内存消耗。 最后,微批量大小为每个节点 16 个,全局批量大小达到 3,072。
```
@@ -121,7 +115,7 @@ Global batch size: 3072
```
adopt 4-way tensor parallelism and 8-way pipeline parallelism
-96 台 A100(40G*8)
+ 96 台 A100(40G*8)
fp16 True
@@ -155,15 +149,13 @@ learning_rate 8e-05
```
FP16
-
trained OPT-175B on 992 80GB A100 GPUs,
by utilizing Fully Sharded Data Parallel with Megatron-LM Tensor Parallelism
-通过利用完全分片数据并行与 Megatron-LM 张量并行
-roughly ~33 days of continuous training
+roughly ~33 days of continuous training
300B tokens
@@ -171,81 +163,4 @@ roughly ~33 days of continuous training
-### BloombergGPT
-
-
-
-We use the Amazon SageMaker service provided by AWS to train and evaluate BloombergGPT.
-
-We use the latest version available at the time of training and
-train on a total of 64 p4d.24xlarge instances.
-
-Each p4d.24xlarge instance has 8 NVIDIA 40GB A100 GPUs with NVIDIA NVSwitch intra-node connections (600 GB/s) and NVIDIA GPUDirect using AWS Elastic Fabric Adapter (EFA) inter-node connections (400 Gb/s).
-
-This yields a total of 512 40GB A100 GPUs.
-
-
-
-
-we rely on stage 3 of ZeRO optimization. We utilize the proprietary SageMaker Model Parallelism (SMP) library from AWS, which enables the automatic distribution of large models across multiple GPU devices and instances
-
-
-ZeRO shards the training state (model parameters, gradients, and optimizer state) across a group of GPUs. We shard a model across 128 GPUs, and we have 4 copies of the model during training
-
-
-
-
-
-
-## Megatron-Turing NLG(530B)
-
-
-训练过程一共使用了4480块英伟达A100 GPU
-
-
-5300亿个参数的模型,每个模型副本跨越280个NVIDIA A100 GPU,节点内采用Megatron-LM的8路张量切片(tensor-slicing),节点间采用35路管道并行。
-
-
-基于NVIDIA DGX SuperPOD的Selene超级计算机上完成混合精度训练。(该超级计算机由560个DGX A100服务器提供支持,每个DGX A100有8个 NVIDIA A100 80GB Tensor Core GPU,通过NVLink 和 NVSwitch相互完全连接)。
-
-
-
-Model training is done with mixed precision using 16-bit bfloat on NVIDIA’s Selene supercomputer with 560 DGX A100 nodes.
-
-Each cluster node has 8 NVIDIA 80-GB A100 GPUs, connected to each other by NVLink and NVSwitch.
-
-Each node has eight NVIDIA Mellanox 200Gbps HDR Infiniband
-HCAs for application communication, with an additional two HCAs per node for dedicated storage.
-
-The nodes are connected in a three-level (leaf, spine, core) fat-tree topology with 850 switches.
-This topology allows efficient all-reduce communication (which is the dominant communication pattern in deep learning
-training). The cluster uses an all-NVME shared parallel filesystem for high-performance data access and
-storage. The peak device throughput of an A100 GPU with 16-bit precision is 312 teraFLOP/s, resulting in
-an aggregate of 1.4 exaFLOP/s of peak 16-bit precision performance
-
-
-
-mixed precision using 16-bit bfloat
-
-
-
-## GPT-NeoX-20B
-
-
-We trained GPT-NeoX-20B on twelve Supermicro AS-4124GO-NART servers, each with eight
-NVIDIA A100-SXM4-40GB GPUs and configured with two AMD EPYC 7532 CPUs.
-
-All GPUs can directly access the InfiniBand switched fabric through one of four ConnectX-6 HCAs for
-GPUDirect RDMA.
-
-Two NVIDIA MQM8700-HS2R switches—connected by 16 links—compose the spine of this InfiniBand network, with one link
-per node CPU socket connected to each switch.
-
-Figure 2 shows a simplified overview of a node as configured for training。
-
-
-
-
-
-
--- docs/llm-base/distribution-parallelism/并行技术.drawio
@@ -1 +0,0 @@
-<mxfile host="Electron" modified="2023-08-31T11:56:35.644Z" agent="5.0 (Macintosh; Intel Mac OS X 12_3_1) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/14.6.13 Chrome/89.0.4389.128 Electron/12.0.7 Safari/537.36" etag="8oeCXHUF3_uXAPg895Ur" version="14.6.13" type="device" pages="2"><diagram id="qC0874aR_NxkKrCqCS1e" name="第 1 页">5ZjZbuIwFIafJpegJM56OaGUSqNKaFDVa+OYxMKJU2MGMk8/TuMsJqlaJJbRlAtk/8fr//mYBAPMsuOCwyJ9ZjGmhm3GRwM8GLYdOp78roSyFtwwrIWEk7iWrE5YkT9YiaZS9yTGO62hYIwKUugiYnmOkdA0yDk76M02jOqzFjDBA2GFIB2qryQWaa0Gtt/pT5gkaTOz5an9ZbBprHayS2HMDj0JzA0w44yJupQdZ5hW3jW+1P0eP4i2C+M4F1/qEKzXE/dn+foI357zlwVZRXyiRvkN6V5teLF8kUKzZlE2RnC2z2NcjWUaIDqkROBVAVEVPUjyUktFRmXNksUNoXTGKOPvfUEMcbBBUt8Jzra4F/FQgNcbGVHrwFzg44cbtFrb5HHDLMOCl7KJ6tCuumw2rOqHDpzV0Eh70DylQXVWknbozk5ZUI6e4a49cNewPSoqIwqYy3JSlWvH25CcqR+9DggmexBRGeW7Y1xM+QHgMlycf44LOIOLdQsumwBhNJog68B1XPM6CQLse4NwzgABbgringlyfy7uCJcTl+XPWVEV0Z7TMuIQbbH43O6OjXUZ7yzv88tlzLo2Fy7unXcb7zgTUBCWy2p4ofsBOOEUBJqdE8ucevaXHAW2NbXdK5nqD0x9WBpz34giI3AG/koLhO6dnr85y/FJ7isJUpJUliJpGZZ6VBlK5OPgDxXISBxX04zC0m+eS5zuk5vB9oen2xk93VfiEAw4LL8DB+C4+i0zckPflEM44PAL5tvRB/f/BkKgM2j32mPg35JB84o6hGB9GwjAuRoEWe1ejt9jvX8YwPwv</diagram><diagram id="wMmwPg0yGGhdXhdaVQFP" name="第 2 页">ldGxDoIwEADQr+loghRRRgOoi3FgUCdTaYGawpFSFPx6MQWxYdGlub5ce70rwn7ebCUpsz1QJpBt0QbhANm257jd+oZWw8LzNKSSU03zESL+ZD1avdacsspIVABC8dLEGIqCxcowIiU8zLQEhFm1JCmbQBQTMdUjpyrTurKXo+8YT7Oh8tzt+8vJkNx3UmWEwuOLcIiwLwGUjvLGZ+I9u2EuZ6feXu7t+pq0VdIEN344uTN92eafI58WJCvUr1d3wfi0bmP8Lw5f</diagram></mxfile>
\ No newline at end of file
--- docs/peft/LoRA-FA.md
@@ -1,45 +0,0 @@
-
-
-
-低秩适应方法(LoRA)可以在很大程度上减少训练参数数量,以微调大型语言模型(LLM),然而,仍需要昂贵的激活记忆更新低秩权重。减少LoRA层数或使用激活重计算可能会损害微调性能或增加计算开销。
-
-
-在本文中,我们提出了LoRA-FA,一种高效的微调方法,可以减少激活记忆的需求,而无需性能下降和昂贵的重计算。LoRA-FA选择冻结A的投影下权重和更新B的投影上权重在每个LoRA层中。
-
-它确保在LLM微调期间,模型权重的变化存在于低秩空间中,同时消除存储全秩输入激活的要求。我们跨越多个模型类型(RoBERTa、T5、LLaMA)和模型大小进行广泛的实验。
-
-我们的结果显示,LoRA-FA可以在不同任务中比全参数微调和LoRA更准确地微调,同时LoRA-FA相对于LoRA可以减少总体内存成本高达1.4×。
-
-
-----
-
-
-本文提出了一种名为LoRA-FA的内存高效的细调方法,通过冻结A的投影下权重并仅更新B的投影上权重,将模型权重的变化限制在低秩空间中,从而减少了激活内存的需求。
-
-
-
-方法的详细步骤:
-
- (1). 设计LoRA-FA方法:LoRA-FA方法通过冻结A的投影下权重并仅更新B的投影上权重,将模型权重的变化限制在低秩空间中,从而减少了激活内存的需求。
- (2). LoRA-FA的集成:LoRA-FA可以与其他内存优化技术相结合,提高其利用率。
- (3). LoRA-FA与梯度压缩的关系:LoRA方法通过更新A和B两个低秩矩阵,并使用AB作为预训练和冻结权重W的变化,即W + α∆W = W + αAB。
-
- LoRA-FA方法冻结了W和A,并仅在细调过程中更新B。在模型适应过程中,权重的变化将被限制在低秩空间中。
-
- (4). 低秩模型适应:在细调过程中,冻结初始化的A和预训练的W,并更新投影上权重B。
- 因此,权重的变化将限制在由A的列空间定义的低秩空间中。
-
- (5). 内存复杂度:对LoRA-FA的内存复杂度进行详细研究。LoRA-FA模块仅计算B的梯度,其具有d_out × r个元素。在GPT类型的模型中,总的可训练参数是n_r/2,即LoRA中可训练参数数量的一半。因此,在16位混合精度训练中,模型权重和适配器相关状态的内存成本为2n + 8n_r字节。相比于全参数细调,LoRA-FA通过显著减少可训练参数和输入激活的数量,具有内存效率。LoRA-FA可以与先进的内存优化方法相结合,如权重量化、权重分片和选择性激活重计算。
-
- (6). 权重量化:LoRA-FA可以将模型权重量化为较低的位宽,以减少模型权重的内存开销,而不影响细调性能。
-
- (7). 权重分片:在使用数据并行ism的多个GPU上训练LLM时,可以将权重分片或使用ZeRO stage-3技术与LoRA-FA相结合,将模型权重分片到不同的GPU上,从而降低每个GPU的内存开销。
-
- (8). 选择性激活重计算:可以使用选择性激活重计算来重新计算部分模型组件的输入,以减少激活内存开销。通过选择性激活重计算,可以在不需要存储LoRA层输入的情况下平衡激活成本和重计算成本。
-
-
-
-
-
-
-
--- docs/peft/README.md
@@ -1,9 +0,0 @@
-
-
-
-
--
-
-
-
-
--- docs/peft/ReLoRA.md
@@ -1,4 +0,0 @@
-
-
-
-
--- train/megatron/README.md
@@ -7,25 +7,4 @@
- [CodeGeeX](https://github.com/THUDM/CodeGeeX)
-- [如何使用 Megatron-LM 训练语言模型](https://huggingface.co/blog/zh/megatron-training):数据预处理,训练,模型转换,推理等
-
-
-
-
-
-
-
-### 数据加载
-
-Megatron-LM 带有一个高效的 DataLoader,其中数据在训练前被 tokenize 和 shuffle。它还将数据拆分为带有索引的编号序列,并将索引存储,因此 tokenize 只需要计算一次。为了构建索引,首先根据训练参数计算每个 epoch 的数量,并创建一个排序,然后对数据进行 shuffle 操作。这与大多数情况不同,我们通常迭代整个数据集直到其用尽,然后重复第二个 epoch 。这平滑了学习曲线并节省了训练时间。
-
-
-### 融合 CUDA 内核
-当一个计算在 GPU 上运行时,必要的数据会从内存中取出并加载到 GPU 上,然后计算结果被保存回内存。简单来说,融合内核的思想是: 将通常由 PyTorch 单独执行的类似操作组合成一个单独的硬件操作。因此可以将多个离散计算合并为一个,从而减少在多个离散计算中的内存移动次数。
-
-
-当 f、g 和 h 融合在一个内核中时,f 和 g 的中间结果 x' 和 y' 存储在 GPU 寄存器中并立即被 h 使用。但是如果不融合,x' 和 y' 就需要复制到内存中,然后由 h 加载。因此,融合 CUDA 内核显着加快了计算速度。此外,Megatron-LM 还使用 Apex 的 AdamW 融合实现,它比 PyTorch 实现更快。
-
-虽然我们可以在 transformers 中自定义 Megatron-LM 中的 DataLoader 和 Apex 的融合优化器,但自定义融合 CUDA 内核对新手来说太不友好了。
-
-
+- [如何使用 Megatron-LM 训练语言模型](https://huggingface.co/blog/zh/megatron-training)
--- train/megatron/kernel_fusion.png
Binary files a/train/megatron/kernel_fusion.png and /dev/null differ
|
llm-action
|
liguodongiot
|
HTML
|
HTML
| 15,588
| 1,812
|
本项目旨在分享大模型相关技术原理以及实战经验(大模型工程化、大模型应用落地)
|
liguodongiot_llm-action
|
NEW_FEAT
|
Code change: new inheritance
|
303f6514bd229c80f58abc87448836ede877c0e5
| null |
Vadim B. Mikheev
|
+ int SortTuplesInTree = 2560; (default value for max number of tuples in leftist tree)
| false
| 2
| 1
| 1
|
--- globals.c
@@ -7,7 +7,7 @@
*
*
* IDENTIFICATION
- * $Header: /cvsroot/pgsql/src/backend/utils/init/globals.c,v 1.12 1997/09/08 02:31:56 momjian Exp $
+ * $Header: /cvsroot/pgsql/src/backend/utils/init/globals.c,v 1.13 1997/09/18 05:23:58 vadim Exp $
*
* NOTES
* Globals used all over the place should be declared here and not
@@ -80,6 +80,7 @@ char FloatFormat[20] = "%f";
int fsyncOff = 0;
int SortMem = 512;
+int SortTuplesInTree = 2560;
char *IndexedCatalogNames[] = {
AttributeRelationName,
|
postgres_postgres.json
| null | null | null | null | null | null |
postgres_postgres.json
|
NEW_FEAT
|
3, Vague commit message, but the code change adds a new variable SortTuplesInTree, so it appears to be a new feature
|
dec5037a3d94fcf932a789df4c66f591feb5bb3d
|
2025-04-05T16:26:49Z
|
chromium-autoroll
|
Roll Perfetto Trace Processor Win from 256692b4376d to 4df384ab8fd5 https://android.googlesource.com/platform/external/perfetto.git/+log/256692b4376d..4df384ab8fd5 If this roll has caused a breakage, revert this CL and stop the roller using the controls here: https://autoroll.skia.org/r/perfetto-trace-processor-win-chromium Please CC [email protected],[email protected] on the revert to ensure that a human is aware of the problem. To file a bug in Chromium: https://bugs.chromium.org/p/chromium/issues/entry To report a problem with the AutoRoller itself, please file a bug: https://issues.skia.org/issues/new?component=1389291&template=1850622 Documentation for the AutoRoller is here: https://skia.googlesource.com/buildbot/+doc/main/autoroll/README.md Tbr: [email protected] Change-Id: I32c5170977a0045a9bff60b46668647d68fa8805 Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6434395 Commit-Queue: chromium-autoroll <[email protected]> Bot-Commit: chromium-autoroll <[email protected]> Cr-Commit-Position: refs/heads/main@{#1443110}
| false
| 2
| 2
| 4
|
--- tools/perf/core/perfetto_binary_roller/binary_deps.json
@@ -5,8 +5,8 @@
"full_remote_path": "perfetto-luci-artifacts/40b529923598b739b2892a536a7692eedbed5685/linux-arm64/trace_processor_shell"
},
"win": {
- "hash": "08a2ef55720be965acb1fde8320f50e47ebc724a",
- "full_remote_path": "chromium-telemetry/perfetto_binaries/trace_processor_shell/win/256692b4376d5454088d417f51a0b97b1b4d9065/trace_processor_shell.exe"
+ "hash": "43b81d244abca35e4d9d21c99c7ccc9379ebc442",
+ "full_remote_path": "chromium-telemetry/perfetto_binaries/trace_processor_shell/win/4df384ab8fd5cb376e392f6e245593f9ea221f2f/trace_processor_shell.exe"
},
"linux_arm": {
"hash": "28bd9c986197285caeb7e5f7e8434e8f61bd7822",
|
chromium
| null |
C
|
C
| null | null |
Browser
|
_chromium
|
CONFIG_CHANGE
|
Just some version/hash changes
|
ff757e2d6390b852614918c7bb9cb24b6f73f008
| null |
Thomas Köppe
|
Fix memory leak in THTensorRandom.c
| false
| 2
| 1
| 1
|
--- THTensorRandom.c
@@ -106,7 +106,8 @@ void THTensor_(multinomial)(THLongTensor *self, THGenerator *_generator, THTenso
sum \
);
}
- THArgCheck((sum > 0), 2, "invalid multinomial distribution (sum of probabilities <= 0)");
+ THArgCheckWithCleanup((sum > 0), THCleanup(THTensor_(free)(cum_dist);), 2,
+ "invalid multinomial distribution (sum of probabilities <= 0)");
/* normalize cumulative probability distribution so that last val is 1
i.e. dosen't assume original prob_dist row sums to one */
if ( (sum > 0) || ( ( sum < 1.00001) && (sum > 0.99999) ) )
|
pytorch_pytorch.json
| null | null | null | null | null | null |
pytorch_pytorch.json
|
BUG_FIX
|
5, obvious
|
d6bcde565f8155b6a51ce4682ada367fe62d5a18
|
2023-05-08 20:41:34
|
Romain Vimont
|
Accept .m4a and .mka These are just aliases for mp4 and mkv when there is no video stream. PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
| false
| 28
| 3
| 31
|
--- app/src/cli.c
@@ -1487,12 +1487,6 @@ get_record_format(const char *name) {
if (!strcmp(name, "mkv")) {
return SC_RECORD_FORMAT_MKV;
}
- if (!strcmp(name, "m4a")) {
- return SC_RECORD_FORMAT_M4A;
- }
- if (!strcmp(name, "mka")) {
- return SC_RECORD_FORMAT_MKA;
- }
return 0;
}
@@ -1980,12 +1974,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
LOGW("Recording does not support RAW audio codec");
return false;
}
-
- if (opts->video
- && sc_record_format_is_audio_only(opts->record_format)) {
- LOGE("Audio container does not support video stream");
- return false;
- }
}
if (opts->audio_codec == SC_CODEC_RAW) {
--- app/src/options.h
@@ -21,16 +21,8 @@ enum sc_record_format {
SC_RECORD_FORMAT_AUTO,
SC_RECORD_FORMAT_MP4,
SC_RECORD_FORMAT_MKV,
- SC_RECORD_FORMAT_M4A,
- SC_RECORD_FORMAT_MKA,
};
-static inline bool
-sc_record_format_is_audio_only(enum sc_record_format fmt) {
- return fmt == SC_RECORD_FORMAT_M4A
- || fmt == SC_RECORD_FORMAT_MKA;
-}
-
enum sc_codec {
SC_CODEC_H264,
SC_CODEC_H265,
--- app/src/recorder.c
@@ -60,14 +60,9 @@ sc_recorder_queue_clear(struct sc_recorder_queue *queue) {
static const char *
sc_recorder_get_format_name(enum sc_record_format format) {
switch (format) {
- case SC_RECORD_FORMAT_MP4:
- case SC_RECORD_FORMAT_M4A:
- return "mp4";
- case SC_RECORD_FORMAT_MKV:
- case SC_RECORD_FORMAT_MKA:
- return "matroska";
- default:
- return NULL;
+ case SC_RECORD_FORMAT_MP4: return "mp4";
+ case SC_RECORD_FORMAT_MKV: return "matroska";
+ default: return NULL;
}
}
|
scrcpy
|
genymobile
|
C
|
C
| 118,486
| 11,201
|
Display and control your Android device
|
genymobile_scrcpy
|
NEW_FEAT
|
obvious
|
54c65e6cc28f30f82c0d51b0f9169ca82149c115
|
2023-09-21 06:34:35
|
John Turner
|
Adding in my name for community support.
| false
| 6
| 1
| 7
|
--- index.html
@@ -5010,16 +5010,11 @@
<td>Individual</td>
<td>Development; Open-Source Community Efforts, Documentation, Evangelism</td>
</tr>
- <tr>
+ <tr>
<td><a href="https://github.com/johnnorton">John Norton</a></td>
<td>Individual</td>
<td>Development; open-source community efforts</td>
</tr>
- <tr>
- <td><a href="https://github.com/b1tsized">John Turner</a></td>
- <td>Individual</td>
- <td>Development; open-source community efforts</td>
- </tr>
</tbody>
</table>
</div>
|
manifesto
|
opentofu
|
HTML
|
HTML
| 36,134
| 1,083
|
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source license.
|
opentofu_manifesto
|
CONFIG_CHANGE
|
Very small changes
|
2b0d17e9cad742a6e651bc48752ff7cfd2782c32
|
2023-12-11 23:31:27
|
Kingkor Roy Tirtho
|
cd: fix json lint
| false
| 1
| 0
| 1
|
--- .github/workflows/pr-lint.yml
@@ -27,6 +27,5 @@ jobs:
- name: Lint translations & config files
run: |
- npm install -g @prantlf/jsonlint
jsonlint -q -D --enforce-double-quotes ./lib/l10n/*.arb
jsonlint -q -D --enforce-double-quotes -T .vscode/*.json
\ No newline at end of file
|
spotube
|
krtirtho
|
Dart
|
Dart
| 35,895
| 1,491
|
🎧 Open source Spotify client that doesn't require Premium nor uses Electron! Available for both desktop & mobile!
|
krtirtho_spotube
|
CONFIG_CHANGE
|
Obvious
|
827a879ae2dcc473dfcae8ae188b992f66cd07d6
|
2023-11-20 05:08:49
|
Jon Shier
|
Require Swift 5.7.1 (#3798) ### Goals :soccer:
This PR updates Alamofire to require Swift 5.7.1, as Xcode 14.1 has been
required to ship to the App Store since April.
### Implementation Details :construction:
Relevant version checks have been removed and formatting updated for
Swift 5.7.
### Testing Details :mag:
Older Swift versions have been removed from CI.
| false
| 91
| 236
| 327
|
--- .github/workflows/ci.yml
@@ -47,6 +47,30 @@ jobs:
runsOn: macOS-12
name: "macOS 12, Xcode 14.1, Swift 5.7.1"
testPlan: "macOS"
+ - xcode: "Xcode_14.0.1"
+ runsOn: macOS-12
+ name: "macOS 12, Xcode 14.0.1, Swift 5.7.0"
+ testPlan: "macOS"
+ - xcode: "Xcode_13.4.1"
+ runsOn: macOS-12
+ name: "macOS 12, Xcode 13.4.1, Swift 5.6.1"
+ testPlan: "macOS-NoTS"
+ - xcode: "Xcode_13.3.1"
+ runsOn: macOS-12
+ name: "macOS 12, Xcode 13.3.1, Swift 5.6.0"
+ testPlan: "macOS-NoTS"
+ - xcode: "Xcode_13.2.1"
+ runsOn: macOS-11
+ name: "macOS 11, Xcode 13.2.1, Swift 5.5.2"
+ testPlan: "macOS-NoTS"
+ - xcode: "Xcode_13.1"
+ runsOn: macOS-11
+ name: "macOS 11, Xcode 13.1, Swift 5.5.1"
+ testPlan: "macOS-NoTS"
+ - xcode: "Xcode_13.0"
+ runsOn: macOS-11
+ name: "macOS 11, Xcode 13.0, Swift 5.5.0"
+ testPlan: "macOS-NoTS"
steps:
- uses: actions/checkout@v4
- name: ${{ matrix.name }}
@@ -122,11 +146,11 @@ jobs:
testPlan: "iOS-NoTS"
xcode: "Xcode_14.3.1"
runsOn: macOS-13
- # - destination: "OS=14.5,name=iPhone 12 Pro"
- # name: "iOS 14.5"
- # testPlan: "iOS-NoTS"
- # xcode: "Xcode_14.3.1"
- # runsOn: macOS-13
+ - destination: "OS=14.5,name=iPhone 12 Pro"
+ name: "iOS 14.5"
+ testPlan: "iOS-NoTS"
+ xcode: "Xcode_14.3.1"
+ runsOn: macOS-13
# - destination: "OS=13.7,name=iPhone 11 Pro"
# name: "iOS 13.7"
# testPlan: "iOS-NoTS"
@@ -161,11 +185,11 @@ jobs:
testPlan: "tvOS-NoTS"
xcode: "Xcode_14.3.1"
runsOn: firebreak
- # - destination: "OS=14.5,name=Apple TV"
- # name: "tvOS 14.5"
- # testPlan: "tvOS-NoTS"
- # xcode: "Xcode_14.3.1"
- # runsOn: firebreak
+ - destination: "OS=14.5,name=Apple TV"
+ name: "tvOS 14.5"
+ testPlan: "tvOS-NoTS"
+ xcode: "Xcode_14.3.1"
+ runsOn: firebreak
# - destination: "OS=13.4,name=Apple TV"
# name: "tvOS 13.4"
# testPlan: "tvOS-NoTS"
@@ -200,11 +224,11 @@ jobs:
testPlan: "watchOS-NoTS"
xcode: "Xcode_14.3.1"
runsOn: firebreak
- # - destination: "OS=7.4,name=Apple Watch Series 6 (44mm)"
- # name: "watchOS 7.4"
- # testPlan: "watchOS-NoTS"
- # xcode: "Xcode_14.3.1"
- # runsOn: firebreak
+ - destination: "OS=7.4,name=Apple Watch Series 6 (44mm)"
+ name: "watchOS 7.4"
+ testPlan: "watchOS-NoTS"
+ xcode: "Xcode_14.3.1"
+ runsOn: firebreak
steps:
- uses: actions/checkout@v4
- name: Install Firewalk
@@ -233,6 +257,24 @@ jobs:
- xcode: "Xcode_14.1"
runsOn: macOS-12
name: "macOS 12, SPM 5.7.1 Test"
+ - xcode: "Xcode_14.0.1"
+ runsOn: macOS-12
+ name: "macOS 12, SPM 5.7.0 Test"
+ - xcode: "Xcode_13.4.1"
+ runsOn: macOS-12
+ name: "macOS 12, SPM 5.6.1 Test"
+ - xcode: "Xcode_13.3.1"
+ runsOn: macOS-12
+ name: "macOS 12, SPM 5.6.0 Test"
+ - xcode: "Xcode_13.2.1"
+ runsOn: macOS-11
+ name: "macOS 11, SPM 5.5.2 Test"
+ - xcode: "Xcode_13.1"
+ runsOn: macOS-11
+ name: "macOS 11, SPM 5.5.1 Test"
+ - xcode: "Xcode_13.0"
+ runsOn: macOS-11
+ name: "macOS 11, SPM 5.5.0 Test"
steps:
- uses: actions/checkout@v4
- name: Install Firewalk
--- .swiftformat
@@ -1,7 +1,7 @@
# file options
--symlinks ignore
---swiftversion 5.7
+--swiftversion 5.5
# rules
--enable isEmpty
@@ -22,15 +22,7 @@
--operatorfunc no-space
--nospaceoperators ..<, ...
--selfrequired validate
---someAny false
--stripunusedargs closure-only
--wraparguments preserve
--wrapcollections preserve
--wrapparameters preserve
-
-
-# rules
-
---enable isEmpty
---disable wrapMultilineStatementBraces
---disable opaqueGenericParameters
--- Example/Source/DetailViewController.swift
@@ -72,7 +72,7 @@ class DetailViewController: UITableViewController {
// MARK: IBActions
@IBAction func refresh() {
- guard let request else {
+ guard let request = request else {
return
}
@@ -85,7 +85,7 @@ class DetailViewController: UITableViewController {
let end = CACurrentMediaTime()
self.elapsedTime = end - start
- if let response {
+ if let response = response {
for (field, value) in response.allHeaderFields {
self.headers["\(field)"] = "\(value)"
}
@@ -205,7 +205,7 @@ extension DetailViewController {
}
override func tableView(_ tableView: UITableView, titleForFooterInSection section: Int) -> String? {
- if Sections(rawValue: section) == .body, let elapsedTime {
+ if Sections(rawValue: section) == .body, let elapsedTime = elapsedTime {
let elapsedTimeText = DetailViewController.numberFormatter.string(from: elapsedTime as NSNumber) ?? "???"
return "Elapsed Time: \(elapsedTimeText) sec"
}
--- [email protected]
@@ -0,0 +1,48 @@
+// swift-tools-version:5.5
+//
+// [email protected]
+//
+// Copyright (c) 2022 Alamofire Software Foundation (http://alamofire.org/)
+//
+// Permission is hereby granted, free of charge, to any person obtaining a copy
+// of this software and associated documentation files (the "Software"), to deal
+// in the Software without restriction, including without limitation the rights
+// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+// copies of the Software, and to permit persons to whom the Software is
+// furnished to do so, subject to the following conditions:
+//
+// The above copyright notice and this permission notice shall be included in
+// all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+// THE SOFTWARE.
+//
+
+import PackageDescription
+
+let package = Package(name: "Alamofire",
+ platforms: [.macOS(.v10_12),
+ .iOS(.v10),
+ .tvOS(.v10),
+ .watchOS(.v3)],
+ products: [.library(name: "Alamofire",
+ targets: ["Alamofire"])],
+ targets: [.target(name: "Alamofire",
+ path: "Source",
+ exclude: ["Info.plist"],
+ linkerSettings: [.linkedFramework("CFNetwork",
+ .when(platforms: [.iOS,
+ .macOS,
+ .tvOS,
+ .watchOS]))]),
+ .testTarget(name: "AlamofireTests",
+ dependencies: ["Alamofire"],
+ path: "Tests",
+ exclude: ["Info.plist", "Test Plans"],
+ resources: [.process("Resources")])],
+ swiftLanguageVersions: [.v5])
--- [email protected]
@@ -0,0 +1,48 @@
+// swift-tools-version:5.6
+//
+// [email protected]
+//
+// Copyright (c) 2022 Alamofire Software Foundation (http://alamofire.org/)
+//
+// Permission is hereby granted, free of charge, to any person obtaining a copy
+// of this software and associated documentation files (the "Software"), to deal
+// in the Software without restriction, including without limitation the rights
+// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+// copies of the Software, and to permit persons to whom the Software is
+// furnished to do so, subject to the following conditions:
+//
+// The above copyright notice and this permission notice shall be included in
+// all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+// THE SOFTWARE.
+//
+
+import PackageDescription
+
+let package = Package(name: "Alamofire",
+ platforms: [.macOS(.v10_12),
+ .iOS(.v10),
+ .tvOS(.v10),
+ .watchOS(.v3)],
+ products: [.library(name: "Alamofire",
+ targets: ["Alamofire"])],
+ targets: [.target(name: "Alamofire",
+ path: "Source",
+ exclude: ["Info.plist"],
+ linkerSettings: [.linkedFramework("CFNetwork",
+ .when(platforms: [.iOS,
+ .macOS,
+ .tvOS,
+ .watchOS]))]),
+ .testTarget(name: "AlamofireTests",
+ dependencies: ["Alamofire"],
+ path: "Tests",
+ exclude: ["Info.plist", "Test Plans"],
+ resources: [.process("Resources")])],
+ swiftLanguageVersions: [.v5])
--- [email protected]
@@ -1,4 +1,4 @@
-// swift-tools-version:5.7.1
+// swift-tools-version:5.7
//
// Package.swift
//
--- README.md
@@ -1,6 +1,6 @@

-[](https://img.shields.io/badge/Swift-5.7_5.8_5.9-Orange?style=flat-square)
+[](https://img.shields.io/badge/Swift-5.6_5.7_5.8_5.9-Orange?style=flat-square)
[](https://img.shields.io/badge/Platforms-macOS_iOS_tvOS_watchOS_vision_OS_Linux_Windows_Android-Green?style=flat-square)
[](https://img.shields.io/cocoapods/v/Alamofire.svg)
[](https://github.com/Carthage/Carthage)
@@ -89,7 +89,7 @@ In order to keep Alamofire focused specifically on core networking implementatio
| Platform | Minimum Swift Version | Installation | Status |
| ---------------------------------------------------- | --------------------- | -------------------------------------------------------------------------------------------------------------------- | ------------------------ |
-| iOS 10.0+ / macOS 10.12+ / tvOS 10.0+ / watchOS 3.0+ | 5.7.1 / Xcode 14.1 | [CocoaPods](#cocoapods), [Carthage](#carthage), [Swift Package Manager](#swift-package-manager), [Manual](#manually) | Fully Tested |
+| iOS 10.0+ / macOS 10.12+ / tvOS 10.0+ / watchOS 3.0+ | 5.6 | [CocoaPods](#cocoapods), [Carthage](#carthage), [Swift Package Manager](#swift-package-manager), [Manual](#manually) | Fully Tested |
| Linux | Latest Only | [Swift Package Manager](#swift-package-manager) | Building But Unsupported |
| Windows | Latest Only | [Swift Package Manager](#swift-package-manager) | Building But Unsupported |
| Android | Latest Only | [Swift Package Manager](#swift-package-manager) | Building But Unsupported |
@@ -102,7 +102,7 @@ Alamofire builds on Linux, Windows, and Android but there are missing features a
- Various methods of HTTP authentication may crash, including HTTP Basic and HTTP Digest. Crashes may occur if responses contain server challenges.
- Cache control through `CachedResponseHandler` and associated APIs is unavailable, as the underlying delegate methods aren't called.
- `URLSessionTaskMetrics` are never gathered.
-- `WebSocketRequest` is not available.
+- `WebSocketRequest` not available.
Due to these issues, Alamofire is unsupported on Linux, Windows, and Android. Please report any crashes to the [Swift bug reporter](https://bugs.swift.org).
--- Source/Alamofire.swift
@@ -29,12 +29,12 @@ import Foundation
#endif
// Enforce minimum Swift version for all platforms and build systems.
-#if swift(<5.7.1)
-#error("Alamofire doesn't support Swift versions below 5.7.1.")
+#if swift(<5.5)
+#error("Alamofire doesn't support Swift versions below 5.5.")
#endif
/// Reference to `Session.default` for quick bootstrapping and examples.
public let AF = Session.default
/// Current Alamofire version. Necessary since SPM doesn't use dynamic libraries. Plus this will be more accurate.
-let version = "5.8.1"
+let version = "5.8.0"
--- Source/AuthenticationInterceptor.swift
@@ -303,7 +303,7 @@ public class AuthenticationInterceptor<AuthenticatorType>: RequestInterceptor wh
}
// Do not attempt retry if there is no credential.
- guard let credential else {
+ guard let credential = credential else {
let error = AuthenticationError.missingCredential
completion(.doNotRetryWithError(error))
return
--- Source/Concurrency.swift
@@ -22,7 +22,7 @@
// THE SOFTWARE.
//
-#if canImport(_Concurrency)
+#if compiler(>=5.6.0) && canImport(_Concurrency)
import Foundation
@@ -181,6 +181,7 @@ extension DataRequest {
}
}
+ #if swift(>=5.7)
/// Sets an async closure returning a `Request.ResponseDisposition`, called whenever the `DataRequest` produces an
/// `HTTPURLResponse`.
///
@@ -230,6 +231,7 @@ extension DataRequest {
return self
}
+ #endif
/// Creates a `DataTask` to `await` a `Data` value.
///
@@ -701,6 +703,7 @@ extension DataStreamRequest {
}
}
+ #if swift(>=5.7)
/// Sets an async closure returning a `Request.ResponseDisposition`, called whenever the `DataStreamRequest`
/// produces an `HTTPURLResponse`.
///
@@ -748,6 +751,7 @@ extension DataStreamRequest {
return self
}
+ #endif
/// Creates a `DataStreamTask` used to `await` streams of serialized values.
///
--- Source/HTTPHeaders.swift
@@ -333,7 +333,7 @@ extension HTTPHeader {
}
}
-extension [HTTPHeader] {
+extension Array where Element == HTTPHeader {
/// Case-insensitively finds the index of an `HTTPHeader` with the provided name, if it exists.
func index(of name: String) -> Int? {
let lowercasedName = name.lowercased()
@@ -425,7 +425,7 @@ extension HTTPHeader {
}()
}
-extension Collection<String> {
+extension Collection where Element == String {
func qualityEncoded() -> String {
enumerated().map { index, encoding in
let quality = 1.0 - (Double(index) * 0.1)
--- Source/MultipartFormData.swift
@@ -320,7 +320,7 @@ open class MultipartFormData {
/// - Returns: The encoded `Data`, if encoding is successful.
/// - Throws: An `AFError` if encoding encounters an error.
public func encode() throws -> Data {
- if let bodyPartError {
+ if let bodyPartError = bodyPartError {
throw bodyPartError
}
@@ -345,7 +345,7 @@ open class MultipartFormData {
/// - Parameter fileURL: File `URL` to which to write the form data.
/// - Throws: An `AFError` if encoding encounters an error.
public func writeEncodedData(to fileURL: URL) throws {
- if let bodyPartError {
+ if let bodyPartError = bodyPartError {
throw bodyPartError
}
@@ -515,10 +515,10 @@ open class MultipartFormData {
private func contentHeaders(withName name: String, fileName: String? = nil, mimeType: String? = nil) -> HTTPHeaders {
var disposition = "form-data; name=\"\(name)\""
- if let fileName { disposition += "; filename=\"\(fileName)\"" }
+ if let fileName = fileName { disposition += "; filename=\"\(fileName)\"" }
var headers: HTTPHeaders = [.contentDisposition(disposition)]
- if let mimeType { headers.add(.contentType(mimeType)) }
+ if let mimeType = mimeType { headers.add(.contentType(mimeType)) }
return headers
}
--- Source/NetworkReachabilityManager.swift
@@ -196,7 +196,7 @@ open class NetworkReachabilityManager {
}
)
let callback: SCNetworkReachabilityCallBack = { _, flags, info in
- guard let info else { return }
+ guard let info = info else { return }
let weakManager = Unmanaged<WeakManager>.fromOpaque(info).takeUnretainedValue()
weakManager.manager?.notifyListener(flags)
--- Source/ParameterEncoder.swift
@@ -74,7 +74,7 @@ open class JSONParameterEncoder: ParameterEncoder {
open func encode<Parameters: Encodable>(_ parameters: Parameters?,
into request: URLRequest) throws -> URLRequest {
- guard let parameters else { return request }
+ guard let parameters = parameters else { return request }
var request = request
@@ -158,7 +158,7 @@ open class URLEncodedFormParameterEncoder: ParameterEncoder {
open func encode<Parameters: Encodable>(_ parameters: Parameters?,
into request: URLRequest) throws -> URLRequest {
- guard let parameters else { return request }
+ guard let parameters = parameters else { return request }
var request = request
--- Source/ParameterEncoding.swift
@@ -163,7 +163,7 @@ public struct URLEncoding: ParameterEncoding {
public func encode(_ urlRequest: URLRequestConvertible, with parameters: Parameters?) throws -> URLRequest {
var urlRequest = try urlRequest.asURLRequest()
- guard let parameters else { return urlRequest }
+ guard let parameters = parameters else { return urlRequest }
if let method = urlRequest.method, destination.encodesParametersInURL(for: method) {
guard let url = urlRequest.url else {
@@ -272,7 +272,7 @@ public struct JSONEncoding: ParameterEncoding {
public func encode(_ urlRequest: URLRequestConvertible, with parameters: Parameters?) throws -> URLRequest {
var urlRequest = try urlRequest.asURLRequest()
- guard let parameters else { return urlRequest }
+ guard let parameters = parameters else { return urlRequest }
guard JSONSerialization.isValidJSONObject(parameters) else {
throw AFError.parameterEncodingFailed(reason: .jsonEncodingFailed(error: Error.invalidJSONObject))
@@ -304,7 +304,7 @@ public struct JSONEncoding: ParameterEncoding {
public func encode(_ urlRequest: URLRequestConvertible, withJSONObject jsonObject: Any? = nil) throws -> URLRequest {
var urlRequest = try urlRequest.asURLRequest()
- guard let jsonObject else { return urlRequest }
+ guard let jsonObject = jsonObject else { return urlRequest }
guard JSONSerialization.isValidJSONObject(jsonObject) else {
throw AFError.parameterEncodingFailed(reason: .jsonEncodingFailed(error: Error.invalidJSONObject))
--- Source/Request.swift
@@ -494,7 +494,7 @@ public class Request {
func retryOrFinish(error: AFError?) {
dispatchPrecondition(condition: .onQueue(underlyingQueue))
- guard !isCancelled, let error, let delegate else { finish(); return }
+ guard !isCancelled, let error = error, let delegate = delegate else { finish(); return }
delegate.retryResult(for: self, dueTo: error) { retryResult in
switch retryResult {
@@ -518,7 +518,7 @@ public class Request {
mutableState.isFinishing = true
- if let error { self.error = error }
+ if let error = error { self.error = error }
// Start response handlers
processNextResponseSerializer()
@@ -1013,7 +1013,7 @@ extension Request {
components.append("-u \(user):\(password)")
}
} else {
- if let credential, let user = credential.user, let password = credential.password {
+ if let credential = credential, let user = credential.user, let password = credential.password {
components.append("-u \(user):\(password)")
}
}
@@ -1212,7 +1212,7 @@ public class DataRequest: Request {
@discardableResult
public func validate(_ validation: @escaping Validation) -> Self {
let validator: () -> Void = { [unowned self] in
- guard error == nil, let response else { return }
+ guard error == nil, let response = response else { return }
let result = validation(request, response, data)
@@ -1451,7 +1451,7 @@ public final class DataStreamRequest: Request {
@discardableResult
public func validate(_ validation: @escaping Validation) -> Self {
let validator: () -> Void = { [unowned self] in
- guard error == nil, let response else { return }
+ guard error == nil, let response = response else { return }
let result = validation(request, response)
@@ -1869,7 +1869,7 @@ extension DataStreamRequest.Stream {
let startTimestamp = ProcessInfo.processInfo.systemUptime
socket?.sendPing { error in
// Calls back on delegate queue / rootQueue / underlyingQueue
- if let error {
+ if let error = error {
queue.async {
onResponse(.error(error))
}
@@ -1895,7 +1895,7 @@ extension DataStreamRequest.Stream {
}
let item = DispatchWorkItem { [weak self] in
- guard let self, self.isResumed else { return }
+ guard let self = self, self.isResumed else { return }
self.sendPing(respondingOn: self.underlyingQueue) { response in
guard case .pong = response else { return }
@@ -2067,7 +2067,7 @@ extension DataStreamRequest.Stream {
completionHandler: @escaping (Result<Void, Error>) -> Void) {
guard !(isCancelled || isFinished) else { return }
- guard let socket else {
+ guard let socket = socket else {
// URLSessionWebSocketTask note created yet, enqueue the send.
socketMutableState.write { mutableState in
mutableState.enqueuedSends.append((message, queue, completionHandler))
@@ -2401,7 +2401,7 @@ public class DownloadRequest: Request {
return
}
- if let completionHandler {
+ if let completionHandler = completionHandler {
// Resume to ensure metrics are gathered.
task.resume()
task.cancel { resumeData in
@@ -2430,7 +2430,7 @@ public class DownloadRequest: Request {
@discardableResult
public func validate(_ validation: @escaping Validation) -> Self {
let validator: () -> Void = { [unowned self] in
- guard error == nil, let response else { return }
+ guard error == nil, let response = response else { return }
let result = validation(request, response, fileURL)
@@ -2534,7 +2534,7 @@ public class UploadRequest: DataRequest {
}
override func task(for request: URLRequest, using session: URLSession) -> URLSessionTask {
- guard let uploadable else {
+ guard let uploadable = uploadable else {
fatalError("Attempting to create a URLSessionUploadTask when Uploadable value doesn't exist.")
}
@@ -2558,7 +2558,7 @@ public class UploadRequest: DataRequest {
///
/// - Returns: The `InputStream`.
func inputStream() -> InputStream {
- guard let uploadable else {
+ guard let uploadable = uploadable else {
fatalError("Attempting to access the input stream but the uploadable doesn't exist.")
}
@@ -2575,7 +2575,7 @@ public class UploadRequest: DataRequest {
defer { super.cleanup() }
guard
- let uploadable,
+ let uploadable = uploadable,
case let .file(url, shouldRemove) = uploadable,
shouldRemove
else { return }
--- Source/RequestCompression.swift
@@ -114,9 +114,16 @@ public struct DeflateRequestCompressor: RequestInterceptor {
}
func adler32Checksum(of data: Data) -> UInt32 {
+ #if swift(>=5.6)
data.withUnsafeBytes { buffer in
UInt32(adler32(1, buffer.baseAddress, UInt32(buffer.count)))
}
+ #else
+ data.withUnsafeBytes { buffer in
+ let buffer = buffer.bindMemory(to: UInt8.self)
+ return UInt32(adler32(1, buffer.baseAddress, UInt32(buffer.count)))
+ }
+ #endif
}
}
--- Source/RequestTaskMap.swift
@@ -47,7 +47,7 @@ struct RequestTaskMap {
subscript(_ request: Request) -> URLSessionTask? {
get { requestsToTasks[request] }
set {
- guard let newValue else {
+ guard let newValue = newValue else {
guard let task = requestsToTasks[request] else {
fatalError("RequestTaskMap consistency error: no task corresponding to request found.")
}
@@ -68,7 +68,7 @@ struct RequestTaskMap {
subscript(_ task: URLSessionTask) -> Request? {
get { tasksToRequests[task] }
set {
- guard let newValue else {
+ guard let newValue = newValue else {
guard let request = tasksToRequests[task] else {
fatalError("RequestTaskMap consistency error: no request corresponding to task found.")
}
--- Source/Response.swift
@@ -429,7 +429,7 @@ private enum DebugDescription {
headers: HTTPHeaders,
allowingPrintableTypes printableTypes: [String] = ["json", "xml", "text"],
maximumLength: Int = 100_000) -> String {
- guard let data, !data.isEmpty else { return "[Body]: None" }
+ guard let data = data, !data.isEmpty else { return "[Body]: None" }
guard
data.count <= maximumLength,
--- Source/ResponseSerialization.swift
@@ -156,7 +156,7 @@ extension DownloadResponseSerializerProtocol where Self: DataResponseSerializerP
public func serializeDownload(request: URLRequest?, response: HTTPURLResponse?, fileURL: URL?, error: Error?) throws -> Self.SerializedObject {
guard error == nil else { throw error! }
- guard let fileURL else {
+ guard let fileURL = fileURL else {
throw AFError.responseSerializationFailed(reason: .inputFileNil)
}
@@ -247,7 +247,7 @@ extension DataRequest {
var didComplete: (() -> Void)?
defer {
- if let didComplete {
+ if let didComplete = didComplete {
self.responseSerializerDidComplete { queue.async { didComplete() } }
}
}
@@ -384,7 +384,7 @@ extension DownloadRequest {
var didComplete: (() -> Void)?
defer {
- if let didComplete {
+ if let didComplete = didComplete {
self.responseSerializerDidComplete { queue.async { didComplete() } }
}
}
@@ -520,7 +520,7 @@ public final class DataResponseSerializer: ResponseSerializer {
public func serialize(request: URLRequest?, response: HTTPURLResponse?, data: Data?, error: Error?) throws -> Data {
guard error == nil else { throw error! }
- guard var data, !data.isEmpty else {
+ guard var data = data, !data.isEmpty else {
guard emptyResponseAllowed(forRequest: request, response: response) else {
throw AFError.responseSerializationFailed(reason: .inputDataNilOrZeroLength)
}
@@ -640,7 +640,7 @@ public final class StringResponseSerializer: ResponseSerializer {
public func serialize(request: URLRequest?, response: HTTPURLResponse?, data: Data?, error: Error?) throws -> String {
guard error == nil else { throw error! }
- guard var data, !data.isEmpty else {
+ guard var data = data, !data.isEmpty else {
guard emptyResponseAllowed(forRequest: request, response: response) else {
throw AFError.responseSerializationFailed(reason: .inputDataNilOrZeroLength)
}
@@ -784,7 +784,7 @@ public final class JSONResponseSerializer: ResponseSerializer {
public func serialize(request: URLRequest?, response: HTTPURLResponse?, data: Data?, error: Error?) throws -> Any {
guard error == nil else { throw error! }
- guard var data, !data.isEmpty else {
+ guard var data = data, !data.isEmpty else {
guard emptyResponseAllowed(forRequest: request, response: response) else {
throw AFError.responseSerializationFailed(reason: .inputDataNilOrZeroLength)
}
@@ -941,7 +941,7 @@ public final class DecodableResponseSerializer<T: Decodable>: ResponseSerializer
public func serialize(request: URLRequest?, response: HTTPURLResponse?, data: Data?, error: Error?) throws -> T {
guard error == nil else { throw error! }
- guard var data, !data.isEmpty else {
+ guard var data = data, !data.isEmpty else {
guard emptyResponseAllowed(forRequest: request, response: response) else {
throw AFError.responseSerializationFailed(reason: .inputDataNilOrZeroLength)
}
--- Source/Result+Alamofire.swift
@@ -59,7 +59,7 @@ extension Result {
/// - value: A value.
/// - error: An `Error`.
init(value: Success, error: Failure?) {
- if let error {
+ if let error = error {
self = .failure(error)
} else {
self = .success(value)
--- Source/ServerTrustEvaluation.swift
@@ -456,7 +456,7 @@ public final class DisabledTrustEvaluator: ServerTrustEvaluating {
// MARK: - Extensions
-extension [ServerTrustEvaluating] {
+extension Array where Element == ServerTrustEvaluating {
#if os(Linux) || os(Windows) || os(Android)
// Add this same convenience method for Linux/Windows.
#else
@@ -614,7 +614,7 @@ extension AlamofireExtension where ExtendedType == SecTrust {
SecTrustGetCertificateAtIndex(type, index)
}
}
- #else
+ #elseif swift(>=5.5.1) // Xcode 13.1 / 2021 SDKs.
if #available(iOS 15, macOS 12, tvOS 15, watchOS 8, *) {
return (SecTrustCopyCertificateChain(type) as? [SecCertificate]) ?? []
} else {
@@ -622,6 +622,10 @@ extension AlamofireExtension where ExtendedType == SecTrust {
SecTrustGetCertificateAtIndex(type, index)
}
}
+ #else
+ (0..<SecTrustGetCertificateCount(type)).compactMap { index in
+ SecTrustGetCertificateAtIndex(type, index)
+ }
#endif
}
--- Source/URLConvertible+URLRequestConvertible.swift
@@ -57,7 +57,7 @@ extension URLComponents: URLConvertible {
/// - Returns: The `URL` from the `url` property.
/// - Throws: An `AFError.invalidURL` instance.
public func asURL() throws -> URL {
- guard let url else { throw AFError.invalidURL(url: self) }
+ guard let url = url else { throw AFError.invalidURL(url: self) }
return url
}
--- Source/URLEncodedFormEncoder.swift
@@ -756,7 +756,7 @@ extension _URLEncodedFormEncoder.KeyedContainer: KeyedEncodingContainerProtocol
}
func _encodeIfPresent<Value>(_ value: Value?, forKey key: Key) throws where Value: Encodable {
- if let value {
+ if let value = value {
try encode(value, forKey: key)
} else {
try encodeNil(forKey: key)
@@ -1124,7 +1124,7 @@ final class URLEncodedFormSerializer {
}
}
-extension [String] {
+extension Array where Element == String {
func joinedWithAmpersands() -> String {
joined(separator: "&")
}
--- Source/Validation.swift
@@ -97,7 +97,7 @@ extension Request {
data: Data?)
-> ValidationResult
where S.Iterator.Element == String {
- guard let data, !data.isEmpty else { return .success(()) }
+ guard let data = data, !data.isEmpty else { return .success(()) }
return validate(contentType: acceptableContentTypes, response: response)
}
--- Tests/CacheTests.swift
@@ -182,7 +182,7 @@ final class CacheTestCase: BaseTestCase {
return
}
- if let response, let timestamp = response.headers["Date"] {
+ if let response = response, let timestamp = response.headers["Date"] {
if isCachedResponse {
XCTAssertEqual(timestamp, cachedResponseTimestamp, "timestamps should be equal")
} else {
--- Tests/ConcurrencyTests.swift
@@ -22,7 +22,7 @@
// THE SOFTWARE.
//
-#if canImport(_Concurrency)
+#if compiler(>=5.6.0) && canImport(_Concurrency)
import Alamofire
import XCTest
--- Tests/MultipartFormDataTests.swift
@@ -111,7 +111,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
let expectedString = (
@@ -150,7 +150,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
let expectedString = (
@@ -192,7 +192,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
var expectedData = Data()
@@ -231,7 +231,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
var expectedData = Data()
@@ -281,7 +281,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
var expectedData = Data()
@@ -333,7 +333,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
var expectedData = Data()
@@ -389,7 +389,7 @@ class MultipartFormDataEncodingTestCase: BaseTestCase {
// Then
XCTAssertNotNil(encodedData, "encoded data should not be nil")
- if let encodedData {
+ if let encodedData = encodedData {
let boundary = multipartFormData.boundary
var expectedData = Data()
--- Tests/RequestTests.swift
@@ -1328,7 +1328,7 @@ final class RequestInvalidURLTestCase: BaseTestCase {
}
}
-#if canImport(zlib) // Same condition as `DeflateRequestCompressor`.
+#if canImport(zlib) && swift(>=5.6) // Same condition as `DeflateRequestCompressor`.
@available(macOS 10.15, iOS 13, tvOS 13, watchOS 6, *)
final class RequestCompressionTests: BaseTestCase {
func testThatRequestsCanBeCompressed() async {
--- Tests/RetryPolicyTests.swift
@@ -137,7 +137,7 @@ class BaseRetryPolicyTestCase: BaseTestCase {
func request(method: HTTPMethod = .get, statusCode: Int? = nil) -> Request {
var response: HTTPURLResponse?
- if let statusCode {
+ if let statusCode = statusCode {
response = HTTPURLResponse(url: url, statusCode: statusCode, httpVersion: nil, headerFields: nil)
}
--- Tests/SessionTests.swift
@@ -149,7 +149,7 @@ final class SessionTestCase: BaseTestCase {
retryErrors.append(error)
if retryCount < 2 {
- if let retryDelay {
+ if let retryDelay = retryDelay {
completion(.retryWithDelay(retryDelay))
} else {
completion(.retry)
--- Tests/ValidationTests.swift
@@ -842,7 +842,7 @@ final class CustomValidationTestCase: BaseTestCase {
AF.download(endpoint)
.validate { _, _, fileURL in
- guard let fileURL else { return .failure(ValidationError.missingFile) }
+ guard let fileURL = fileURL else { return .failure(ValidationError.missingFile) }
do {
_ = try Data(contentsOf: fileURL)
|
alamofire
|
alamofire
|
Swift
|
Swift
| 41,720
| 7,598
|
Elegant HTTP Networking in Swift
|
alamofire_alamofire
|
CODE_IMPROVEMENT
|
Dropped support for older Swift versions, removed legacy version checks, and updated CI and README accordingly.
|
620a2ffcfb0a47dc3cd69f098845c4efba245ce3
| null |
pgibler
|
Added cmdh to community section in README
| false
| 1
| 0
| 1
|
--- README.md
@@ -261,6 +261,7 @@ See the [API documentation](./docs/api.md) for all endpoints.
- [ogpt.nvim](https://github.com/huynle/ogpt.nvim)
- [gptel Emacs client](https://github.com/karthink/gptel)
- [Oatmeal](https://github.com/dustinblackman/oatmeal)
+- [cmdh](https://github.com/pgibler/cmdh)
### Database
|
ollama_ollama.json
| null | null | null | null | null | null |
ollama_ollama.json
|
CONFIG_CHANGE
|
5, README or comment change
|
d11f7fadd68b30785e591569ccaa308a813b18dd
|
2022-01-25 22:18:56
|
Sewook Han
|
Add Korean translation (#767) Co-authored-by: Oleksii Trekhleb <[email protected]>
| false
| 18
| 7
| 25
|
--- src/data-structures/hash-table/README.ko-KR.md
@@ -1,23 +1,12 @@
-# Hash Table
+# 해시 테이블
-_Read this in other languages:_
-[_简体中文_](README.zh-CN.md),
-[_Русский_](README.ru-RU.md),
-[_日本語_](README.ja-JP.md),
-[_Français_](README.fr-FR.md),
-[_Português_](README.pt-BR.md)
+컴퓨터 과학에서 **해시 테이블** (해시맵)은 키를 값에 매핑할 수 있는 *연관배열(associative array)* 기능을 가진 데이터 구조입니다. 해시 테이블은 해시함수를 사용해 버킷이나 슬롯 배열에 대한 인덱스를 계산하고 원하는 값을 찾을 수 있습니다. (사실상 해시함수에서 배열은 bucket 또는 slot으로 부릅니다.)
+따라서 해시테이블은 특정 키에 값을 매핑 시킨 형태의 자료구조로 키값에 매핑된 값을 검색 할 때 매우 유용합니다. 이때 해시함수는 찾고자 하는 키가 버킷 또는 슬롯의 몇 번째 인덱스에 해당하는지를 확인할 때 사용합니다.
-컴퓨팅에서, **해시 테이블**(해시 맵)은 키를 값에 매핑할 수 있는 구조인 *연관 배열*을 구현하는 자료 구조입니다. 해시 테이블은 *해시 함수*를 사용해 원하는 값을 담을 수 있는 버킷 또는 슬롯 배열의 인덱스를 계산합니다.
-
-이상적으로, 해시 함수는 각 키들을 고유 버킷에 할당하지만 대부분의 해시 테이블은 불완전한 해시 함수를 사용하기 때문에 해시 함수를 통해 두 개 이상의 키에 대해 동일한 인덱스를 생성하는 해시 충돌이 발생할 수 있습니다. 이러한 해시 충돌은 어떠한 방법으로든 해결되어야 합니다.
+이상적으로는 해시함수는 각 키를 하나의 버킷에 할당하지만 대부분의 해시 테이블은 불완전한 해시함수를 채택하고 있기 때문에 복수의 키에 대해 같은 인덱스를 생성했을 때 해시의 충돌이 발생합니다. 이러한 충돌은 어떤 방법으로든 대처할 필요가 있습니다.

-다음은 분리 연결법을 통해 해시 충돌을 해결한 예시입니다.
+체이닝에 따른 해시충돌(Hash Collision)의 해결법

-
-## 참고
-
-- [Wikipedia](https://en.wikipedia.org/wiki/Hash_table)
-- [YouTube](https://www.youtube.com/watch?v=shs0KM3wKv8&index=4&list=PLLXdhg_r2hKA7DPDsunoDZ-Z769jWn4R8)
\ No newline at end of file
--- src/data-structures/hash-table/README.md
@@ -5,8 +5,8 @@ _Read this in other languages:_
[_Русский_](README.ru-RU.md),
[_日本語_](README.ja-JP.md),
[_Français_](README.fr-FR.md),
-[_Português_](README.pt-BR.md), [_한국어_](README.ko-KR.md)
-
+[_Português_](README.pt-BR.md)
+
In computing, a **hash table** (hash map) is a data
structure which implements an *associative array*
abstract data type, a structure that can *map keys
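
The record above documents a translation of the hash-table README. For readers of this dump, here is a minimal, hypothetical Java sketch of the separate-chaining idea that README describes (a hash function maps a key to a bucket index, and colliding keys are appended to that bucket's chain). It is illustrative only and is not code from the repository:

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

// Illustrative hash table that resolves collisions by separate chaining.
public class ChainedHashTable {
    private static class Entry {
        final String key;
        String value;
        Entry(String key, String value) { this.key = key; this.value = value; }
    }

    private final List<LinkedList<Entry>> buckets;

    public ChainedHashTable(int capacity) {
        buckets = new ArrayList<>(capacity);
        for (int i = 0; i < capacity; i++) {
            buckets.add(new LinkedList<>());
        }
    }

    // The hash function maps a key to a bucket (slot) index.
    private int indexFor(String key) {
        return Math.floorMod(key.hashCode(), buckets.size());
    }

    public void put(String key, String value) {
        LinkedList<Entry> bucket = buckets.get(indexFor(key));
        for (Entry e : bucket) {
            if (e.key.equals(key)) { e.value = value; return; } // update existing key
        }
        bucket.add(new Entry(key, value)); // new key (or collision): append to the chain
    }

    public String get(String key) {
        for (Entry e : buckets.get(indexFor(key))) {
            if (e.key.equals(key)) return e.value;
        }
        return null; // key absent
    }

    public static void main(String[] args) {
        ChainedHashTable table = new ChainedHashTable(8);
        table.put("one", "1");
        table.put("two", "2");
        System.out.println(table.get("two")); // prints 2
    }
}
```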
Repository Name: javascript-algorithms | Owner: trekhleb | Primary Language: JavaScript | Language: JavaScript | Stars: 190,336 | Forks: 30,518
Description: 📝 Algorithms and data structures implemented in JavaScript with explanations and links to further readings
Repository: trekhleb_javascript-algorithms | Type: DOC_CHANGE
Comment: Obvious

Hash: f3747c80db280f8311120c5985ba933fddd04036 | Date: 2023-09-19 07:43:42 | Author: macro
Commit message: 修改支付状态查询逻辑 (Modify the payment status query logic)
IsMerge: false | Additions: 4 | Deletions: 1 | Total Changes: 5
Diff:
--- mall-portal/src/main/java/com/macro/mall/portal/service/AlipayService.java
@@ -23,7 +23,6 @@ public interface AlipayService {
String notify(Map<String, String> params);
/**
- * 查询支付宝交易状态
* @param outTradeNo 商户订单编号
* @param tradeNo 支付宝交易编号
* @return 支付宝交易状态
--- mall-portal/src/main/java/com/macro/mall/portal/service/impl/AlipayServiceImpl.java
@@ -119,9 +119,7 @@ public class AlipayServiceImpl implements AlipayService {
}
if(response.isSuccess()){
log.info("查询支付宝账单成功!");
- if("TRADE_SUCCESS".equals(response.getTradeStatus())){
- portalOrderService.paySuccessByOrderSn(outTradeNo,1);
- }
+ portalOrderService.paySuccessByOrderSn(outTradeNo,1);
} else {
log.error("查询支付宝账单失败!");
}
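
The diff above concerns whether an order should be marked as paid only when Alipay reports `TRADE_SUCCESS`. A hedged Java sketch of that guard, using hypothetical stand-in types rather than the project's `AlipayServiceImpl`:

```java
// Hedged sketch of the status check discussed above: only mark an order paid when
// Alipay reports TRADE_SUCCESS. The names below are invented stand-ins.
public class PaymentStatusCheck {

    /** Returns true only when the queried trade has actually completed. */
    static boolean shouldMarkPaid(boolean querySucceeded, String tradeStatus) {
        // Other Alipay statuses such as WAIT_BUYER_PAY or TRADE_CLOSED must not
        // flip the local order to "paid".
        return querySucceeded && "TRADE_SUCCESS".equals(tradeStatus);
    }

    public static void main(String[] args) {
        System.out.println(shouldMarkPaid(true, "TRADE_SUCCESS"));  // true
        System.out.println(shouldMarkPaid(true, "WAIT_BUYER_PAY")); // false
        System.out.println(shouldMarkPaid(false, "TRADE_SUCCESS")); // false
    }
}
```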
Repository Name: mall | Owner: macrozheng | Primary Language: Java | Language: Java | Stars: 79,319 | Forks: 29,052
Description (translated from Chinese): The mall project is an e-commerce system with a storefront and an admin back end, built on Spring Boot + MyBatis and deployed in Docker containers. The storefront covers the home portal, product recommendation, product search, product display, shopping cart, order flow, member center, customer service, and help center modules; the admin system covers product, order, member, promotion, operations, content, reporting, finance, permission, and settings modules.
Repository: macrozheng_mall | Type: BUG_FIX
Comment: Obvious

Hash: e3d84575237bcbeb11da709b057d8522905a76de | Date: 2024-12-09 21:41:56 | Author: Gregor Vostrak
Commit message: add week_start default for unauthenticated shared reports view
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
Diff:
--- resources/js/app.ts
@@ -32,7 +32,7 @@ createInertiaApp({
user: User;
};
}>();
- return page.props.auth.user.week_start ?? 'monday';
+ return page.props.auth.user.week_start;
};
window.getTimezoneSetting = function () {
const page = usePage<{
Repository Name: solidtime | Owner: solidtime-io | Primary Language: PHP | Language: PHP | Stars: 5,267 | Forks: 278
Description: Modern open-source time-tracking app
Repository: solidtime-io_solidtime | Type: CONFIG_CHANGE
Comment: Very small changes

Hash: d9b63875ab9e5decc9e2cec720cd8ffeeb887e17 | Date: 2024-03-21 07:58:14 | Author: Kieran Eglin
Commit message: Renamed 'music' in README to 'audio content'
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
Diff:
--- README.md
@@ -43,7 +43,7 @@ If it doesn't work for your use case, please make a feature request! You can als
- First-class support for media center apps like Plex, Jellyfin, and Kodi
- Automatically downloads new content from channels and playlists
- Uses a novel approach to download new content more quickly than other apps
-- Supports downloading audio content
+- Supports downloading music
- Custom rules for handling YouTube Shorts and livestreams
- Advanced options like setting cutoff dates and filtering by title
- Reliable hands-off operation
Repository Name: pinchflat | Owner: kieraneglin | Primary Language: Elixir | Language: Elixir | Stars: 2,779 | Forks: 59
Description: Your next YouTube media manager
Repository: kieraneglin_pinchflat | Type: DOC_CHANGE
Comment: The `fix:` prefix suggests a bug fix, but the actual change does not alter code behavior; it improves documentation rendering

Hash: e8801cc3c0493f72ece976f2b1d3a3bdef8237ac | Date: 2023-11-16 01:43:12 | Author: Romain Vimont
Commit message: Upgrade AGP (8.1.3) and Gradle to 8.4 Android Gradle Plugin 8.1.3. Gradle 8.4. From now on, Java 17 is required.
IsMerge: false | Additions: 12 | Deletions: 12 | Total Changes: 24
Diff:
--- build.gradle
@@ -7,7 +7,7 @@ buildscript {
mavenCentral()
}
dependencies {
- classpath 'com.android.tools.build:gradle:8.1.3'
+ classpath 'com.android.tools.build:gradle:7.4.0'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
@@ -23,3 +23,7 @@ allprojects {
options.compilerArgs << "-Xlint:deprecation"
}
}
+
+task clean(type: Delete) {
+ delete rootProject.buildDir
+}
--- doc/build.md
@@ -58,7 +58,7 @@ sudo apt install gcc git pkg-config meson ninja-build libsdl2-dev \
libswresample-dev libusb-1.0-0-dev
# server build dependencies
-sudo apt install openjdk-17-jdk
+sudo apt install openjdk-11-jdk
```
On old versions (like Ubuntu 16.04), `meson` is too old. In that case, install
@@ -100,7 +100,7 @@ sudo apt install mingw-w64 mingw-w64-tools
You also need the JDK to build the server:
```bash
-sudo apt install openjdk-17-jdk
+sudo apt install openjdk-11-jdk
```
Then generate the releases:
@@ -168,13 +168,13 @@ brew install sdl2 ffmpeg libusb
brew install pkg-config meson
```
-Additionally, if you want to build the server, install Java 17 from Caskroom, and
+Additionally, if you want to build the server, install Java 8 from Caskroom, and
make it available from the `PATH`:
```bash
brew tap homebrew/cask-versions
-brew install adoptopenjdk/openjdk/adoptopenjdk17
-export JAVA_HOME="$(/usr/libexec/java_home --version 1.17)"
+brew install adoptopenjdk/openjdk/adoptopenjdk11
+export JAVA_HOME="$(/usr/libexec/java_home --version 1.11)"
export PATH="$JAVA_HOME/bin:$PATH"
```
--- gradle/wrapper/gradle-wrapper.properties
@@ -1,5 +1,5 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-8.4-bin.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-7.5-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
--- server/build.gradle
@@ -2,7 +2,7 @@ apply plugin: 'com.android.application'
android {
namespace 'com.genymobile.scrcpy'
- compileSdk 33
+ compileSdkVersion 33
defaultConfig {
applicationId "com.genymobile.scrcpy"
minSdkVersion 21
@@ -17,10 +17,6 @@ android {
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
- buildFeatures {
- buildConfig true
- aidl true
- }
}
dependencies {
Repository Name: scrcpy | Owner: genymobile | Primary Language: C | Language: C | Stars: 118,486 | Forks: 11,201
Description: Display and control your Android device
Repository: genymobile_scrcpy | Type: CONFIG_CHANGE
Comment: version updates are done

Hash: 6068053c228cbfdeb0ead9fcf8683847c11d33af | Date: 2023-11-19 17:05:46 | Author: Serhiy Mytrovtsiy
Commit message: v2.9.12
IsMerge: false | Additions: 5 | Deletions: 4 | Total Changes: 9
Diff:
--- Stats.xcodeproj/project.pbxproj
@@ -2392,7 +2392,7 @@
"@executable_path/../Frameworks",
);
MACOSX_DEPLOYMENT_TARGET = 10.15;
- MARKETING_VERSION = 2.9.12;
+ MARKETING_VERSION = 2.9.11;
OTHER_LDFLAGS = "";
PRODUCT_BUNDLE_IDENTIFIER = eu.exelban.Stats;
PRODUCT_NAME = "$(TARGET_NAME)";
@@ -2430,7 +2430,7 @@
"@executable_path/../Frameworks",
);
MACOSX_DEPLOYMENT_TARGET = 10.15;
- MARKETING_VERSION = 2.9.12;
+ MARKETING_VERSION = 2.9.11;
OTHER_LDFLAGS = "";
PRODUCT_BUNDLE_IDENTIFIER = eu.exelban.Stats;
PRODUCT_NAME = "$(TARGET_NAME)";
--- Stats/Supporting Files/Info.plist
@@ -17,7 +17,7 @@
<key>CFBundleShortVersionString</key>
<string>$(MARKETING_VERSION)</string>
<key>CFBundleVersion</key>
- <string>508</string>
+ <string>506</string>
<key>Description</key>
<string>Simple macOS system monitor in your menu bar</string>
<key>LSApplicationCategoryType</key>
--- Stats/Supporting Files/hi.lproj/Localizable.strings
@@ -179,7 +179,6 @@
"Color of download" = "डाउनलोड का रंग";
"Color of upload" = "अपलोड का रंग";
"Monospaced font" = "मोनोस्पेस्ड फ़ॉन्ट";
-"Reverse order" = "Reverse order";
// Module Kit
"Open module settings" = "ओपन मॉड्यूल सेटिंग्स";
--- Stats/Supporting Files/zh-Hant.lproj/Localizable.strings
@@ -179,7 +179,7 @@
"Color of download" = "下載的顯示色彩";
"Color of upload" = "上傳的顯示色彩";
"Monospaced font" = "等寬字體";
-"Reverse order" = "排序對調";
+"Reverse order" = 排序對調";
// Module Kit
"Open module settings" = "打開模組設定";
Repository Name: stats | Owner: exelban | Primary Language: Swift | Language: Swift | Stars: 29,655 | Forks: 950
Description: macOS system monitor in your menu bar
Repository: exelban_stats | Type: CONFIG_CHANGE
Comment: version/id change of roll media app

Hash: e18bd8efc1e786c58b99613fc0f938da4a430d17 | Date: 2025-02-07 00:01:53 | Author: Hadley Wickham
Commit message: Remove `extra_args` arg from `chat_perform` (#314) And apply `provider@extra_args` to the request body in the same way in all providers. Fixes #313
IsMerge: false | Additions: 39 | Deletions: 57 | Total Changes: 96
Diff:
--- R/httr2.R
@@ -5,7 +5,8 @@ chat_perform <- function(provider,
mode = c("value", "stream", "async-stream", "async-value"),
turns,
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
mode <- arg_match(mode)
stream <- mode %in% c("stream", "async-stream")
@@ -15,7 +16,8 @@ chat_perform <- function(provider,
turns = turns,
tools = tools,
stream = stream,
- type = type
+ type = type,
+ extra_args = extra_args
)
switch(mode,
--- R/provider-azure.R
@@ -141,7 +141,8 @@ method(chat_request, ProviderAzure) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(provider@base_url)
req <- req_url_path_append(req, "/chat/completions")
@@ -178,6 +179,7 @@ method(chat_request, ProviderAzure) <- function(provider,
messages <- compact(unlist(as_json(provider, turns), recursive = FALSE))
tools <- as_json(provider, unname(tools))
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
if (!is.null(type)) {
response_format <- list(
@@ -192,17 +194,17 @@ method(chat_request, ProviderAzure) <- function(provider,
response_format <- NULL
}
- body <- compact(list2(
+ data <- compact(list2(
messages = messages,
model = provider@model,
seed = provider@seed,
stream = stream,
stream_options = if (stream) list(include_usage = TRUE),
tools = tools,
- response_format = response_format
+ response_format = response_format,
+ !!!extra_args
))
- body <- modify_list(body, provider@extra_args)
- req <- req_body_json(req, body)
+ req <- req_body_json(req, data)
req
}
--- R/provider-bedrock.R
@@ -91,7 +91,8 @@ method(chat_request, ProviderBedrock) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(paste0(
"https://bedrock-runtime.", provider@region, ".amazonaws.com"
@@ -150,7 +151,8 @@ method(chat_request, ProviderBedrock) <- function(provider,
system = system,
toolConfig = toolConfig
)
- body <- modify_list(body, provider@extra_args)
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
+ body <- modify_list(body, extra_args)
req <- req_body_json(req, body)
req
--- R/provider-claude.R
@@ -76,7 +76,8 @@ method(chat_request, ProviderClaude) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(provider@base_url)
# https://docs.anthropic.com/en/api/messages
@@ -124,6 +125,7 @@ method(chat_request, ProviderClaude) <- function(provider,
}
tools <- as_json(provider, unname(tools))
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
body <- compact(list2(
model = provider@model,
system = system,
@@ -132,8 +134,8 @@ method(chat_request, ProviderClaude) <- function(provider,
max_tokens = provider@max_tokens,
tools = tools,
tool_choice = tool_choice,
+ !!!extra_args
))
- body <- modify_list(body, provider@extra_args)
req <- req_body_json(req, body)
req
--- R/provider-cortex.R
@@ -150,7 +150,8 @@ method(chat_request, ProviderCortex) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
if (length(tools) != 0) {
cli::cli_abort("Tools are not supported by Cortex.")
}
@@ -179,10 +180,10 @@ method(chat_request, ProviderCortex) <- function(provider,
# Cortex does not yet support multi-turn chats.
turns <- tail(turns, n = 1)
messages <- as_json(provider, turns)
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
- body <- list(messages = messages, stream = stream)
- body <- modify_list(body, provider@extra_args)
- req <- req_body_json(req, body)
+ data <- compact(list2(messages = messages, stream = stream, !!!extra_args))
+ req <- req_body_json(req, data)
req
}
--- R/provider-databricks.R
@@ -88,7 +88,8 @@ method(chat_request, ProviderDatabricks) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(provider@base_url)
# Note: this API endpoint is undocumented and seems to exist primarily for
# compatibility with the OpenAI Python SDK. The documented endpoint is
@@ -107,6 +108,7 @@ method(chat_request, ProviderDatabricks) <- function(provider,
messages <- compact(unlist(as_json(provider, turns), recursive = FALSE))
tools <- as_json(provider, unname(tools))
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
if (!is.null(type)) {
response_format <- list(
@@ -121,15 +123,15 @@ method(chat_request, ProviderDatabricks) <- function(provider,
response_format <- NULL
}
- body <- compact(list(
+ data <- compact(list2(
messages = messages,
model = provider@model,
stream = stream,
tools = tools,
- response_format = response_format
+ response_format = response_format,
+ !!!extra_args
))
- body <- modify_list(body, provider@extra_args)
- req <- req_body_json(req, body)
+ req <- req_body_json(req, data)
req
}
--- R/provider-gemini.R
@@ -61,7 +61,8 @@ method(chat_request, ProviderGemini) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(provider@base_url)
@@ -106,15 +107,15 @@ method(chat_request, ProviderGemini) <- function(provider,
} else {
tools <- NULL
}
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
- body <- compact(list(
+ body <- compact(list2(
contents = contents,
tools = tools,
systemInstruction = system,
- generationConfig = generation_config
+ generationConfig = generation_config,
+ !!!extra_args
))
- body <- modify_list(body, provider@extra_args)
-
req <- req_body_json(req, body)
req
--- R/provider-openai.R
@@ -100,7 +100,8 @@ method(chat_request, ProviderOpenAI) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
req <- request(provider@base_url)
req <- req_url_path_append(req, "/chat/completions")
@@ -122,6 +123,7 @@ method(chat_request, ProviderOpenAI) <- function(provider,
messages <- compact(unlist(as_json(provider, turns), recursive = FALSE))
tools <- as_json(provider, unname(tools))
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
if (!is.null(type)) {
response_format <- list(
@@ -136,17 +138,17 @@ method(chat_request, ProviderOpenAI) <- function(provider,
response_format <- NULL
}
- body <- compact(list(
+ data <- compact(list2(
messages = messages,
model = provider@model,
seed = provider@seed,
stream = stream,
stream_options = if (stream) list(include_usage = TRUE),
tools = tools,
- response_format = response_format
+ response_format = response_format,
+ !!!extra_args
))
- body <- utils::modifyList(body, provider@extra_args)
- req <- req_body_json(req, body)
+ req <- req_body_json(req, data)
req
}
--- R/provider-openrouter.R
@@ -63,14 +63,16 @@ method(chat_request, ProviderOpenRouter) <- function(
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL
+ type = NULL,
+ extra_args = list()
) {
req <- chat_request(
super(provider, ProviderOpenAI),
stream = stream,
turns = turns,
tools = tools,
- type = type
+ type = type,
+ extra_args = extra_args
)
# https://openrouter.ai/docs/api-keys
--- R/provider-snowflake.R
@@ -82,7 +82,8 @@ method(chat_request, ProviderSnowflake) <- function(provider,
stream = TRUE,
turns = list(),
tools = list(),
- type = NULL) {
+ type = NULL,
+ extra_args = list()) {
if (length(tools) != 0) {
cli::cli_abort(
"Tool calling is not supported.",
@@ -113,14 +114,15 @@ method(chat_request, ProviderSnowflake) <- function(provider,
req <- req_error(req, body = function(resp) resp_body_json(resp)$message)
messages <- as_json(provider, turns)
+ extra_args <- utils::modifyList(provider@extra_args, extra_args)
- body <- list(
+ data <- compact(list2(
messages = messages,
model = provider@model,
- stream = stream
- )
- body <- modify_list(body, provider@extra_args)
- req <- req_body_json(req, body)
+ stream = stream,
+ !!!extra_args
+ ))
+ req <- req_body_json(req, data)
req
}
--- R/provider.R
@@ -29,7 +29,7 @@ Provider <- new_class(
# Create a request------------------------------------
chat_request <- new_generic("chat_request", "provider",
- function(provider, stream = TRUE, turns = list(), tools = list(), type = NULL) {
+ function(provider, stream = TRUE, turns = list(), tools = list(), type = NULL, extra_args = list()) {
S7_dispatch()
}
)
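
The change above routes `extra_args` through `chat_request` and merges them with `provider@extra_args` via `utils::modifyList`, so per-call values override provider-level defaults before being spliced into the request body. A rough Java analogue of that merge-with-override behaviour (illustrative only; the names are invented):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of merge semantics similar to modifyList(defaults, perCall): the
// per-call map wins wherever both maps define a key.
public class ExtraArgsMerge {

    static Map<String, Object> merge(Map<String, Object> providerDefaults,
                                     Map<String, Object> perCall) {
        Map<String, Object> merged = new LinkedHashMap<>(providerDefaults);
        merged.putAll(perCall); // later (per-call) entries override the defaults
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> defaults = new LinkedHashMap<>();
        defaults.put("temperature", 0.5);
        defaults.put("top_p", 0.9);

        Map<String, Object> perCall = new LinkedHashMap<>();
        perCall.put("temperature", 0.2); // overrides the provider default

        System.out.println(merge(defaults, perCall)); // {temperature=0.2, top_p=0.9}
    }
}
```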
Repository Name: ellmer | Owner: tidyverse | Primary Language: R | Language: R | Stars: 401 | Forks: 55
Description: Call LLM APIs from R
Repository: tidyverse_ellmer | Type: BUG_FIX
Comment: obvious

Hash: 315700b55b140bd5219d03eb7ca791d433c04cf6 | Date: 2023-01-11 04:36:34 | Author: Kebin Liu
Commit message: fixes cross-compilation and CI
IsMerge: false | Additions: 162 | Deletions: 79 | Total Changes: 241
Diff:
--- .github/workflows/main.yml
@@ -8,24 +8,7 @@ jobs:
runs-on: macos-10.15
steps:
-
- - name: Checkout
- uses: actions/checkout@v3
- with:
- submodules: recursive
- fetch-depth: 0
-
- - name: Build
- run: |
- brew install automake
- make debug
- make debug-dmg
- shasum -a 256 build/Debug/ShadowsocksX-NG.dmg > build/Debug/ShadowsocksX-NG.dmg.checksum
-
- - name: Upload artifacts
- uses: actions/upload-artifact@v3
- with:
- name: ShadowsocksX-NG
- path: |
- build/Debug/ShadowsocksX-NG.dmg
- build/Debug/ShadowsocksX-NG.dmg.checksum
+ - uses: actions/checkout@v1
+ - name: Run unit test
+ run: |
+ set -o pipefail && xcodebuild test -workspace ShadowsocksX-NG.xcworkspace -scheme ShadowsocksX-NG|xcpretty
--- .gitignore
@@ -76,9 +76,3 @@ deps/dist
deps/pcre*
deps/libev*
deps/privoxy*
-
-ShadowsocksX-NG/kcptun/client
-ShadowsocksX-NG/privoxy/privoxy
-ShadowsocksX-NG/simple-obfs/obfs-local
-ShadowsocksX-NG/ss-local/ss-local
-ShadowsocksX-NG/v2ray-plugin/v2ray-plugin
--- .gitmodules
@@ -28,6 +28,3 @@
[submodule "deps/kcptun"]
path = deps/kcptun
url = https://github.com/xtaci/kcptun
-[submodule "deps/simple-obfs"]
- path = deps/simple-obfs
- url = https://github.com/shadowsocks/simple-obfs
--- Makefile
@@ -1,5 +1,6 @@
.PHONY: all
all: debug
+ echo "Deps build done"
.PHONY: debug
debug: deps/dist
--- ShadowsocksX-NG/LaunchAgentUtils.swift
@@ -8,6 +8,11 @@
import Foundation
+//let SS_LOCAL_VERSION = "3.2.5"
+//let KCPTUN_CLIENT_VERSION = "v20190905_1"
+//let V2RAY_PLUGIN_VERSION = "v1.3.1-9-gddd7ab4"
+//let PRIVOXY_VERSION = "3.0.26.static"
+let SIMPLE_OBFS_VERSION = "0.0.5_1"
let APP_SUPPORT_DIR = "/Library/Application Support/ShadowsocksX-NG/"
let USER_CONFIG_DIR = "/.ShadowsocksX-NG/"
let LAUNCH_AGENT_DIR = "/Library/LaunchAgents/"
@@ -59,7 +64,7 @@ func generateSSLocalLauchAgentPlist() -> Bool {
let dyld_library_paths = [
NSHomeDirectory() + APP_SUPPORT_DIR + "ss-local/",
NSHomeDirectory() + APP_SUPPORT_DIR + "plugins/",
- ]
+ ]
let dict: NSMutableDictionary = [
"Label": "com.qiuyuzhou.shadowsocksX-NG.local",
@@ -108,31 +113,24 @@ func InstallSSLocal() {
let fileMgr = FileManager.default
let homeDir = NSHomeDirectory()
let appSupportDir = homeDir+APP_SUPPORT_DIR
- if fileMgr.fileExists(atPath: appSupportDir + "ss-local/ss-local") {
- do {
- try fileMgr.removeItem(atPath: appSupportDir + "ss-local/ss-local")
- } catch {
- NSLog("Remove old ss-local error")
+ if !fileMgr.fileExists(atPath: appSupportDir + "ss-local/ss-local") {
+ let bundle = Bundle.main
+ let installerPath = bundle.path(forResource: "install_ss_local.sh", ofType: nil)
+ let task = Process.launchedProcess(launchPath: installerPath!, arguments: [""])
+ task.waitUntilExit()
+ if task.terminationStatus == 0 {
+ NSLog("Install ss-local succeeded.")
+ } else {
+ NSLog("Install ss-local failed.")
}
}
-
- let bundle = Bundle.main
- let installerPath = bundle.path(forResource: "install_ss_local.sh", ofType: nil)
- let task = Process.launchedProcess(launchPath: installerPath!, arguments: [""])
- task.waitUntilExit()
- if task.terminationStatus == 0 {
- NSLog("Install ss-local succeeded.")
- } else {
- NSLog("Install ss-local failed.")
- }
-
}
func writeSSLocalConfFile(_ conf:[String:AnyObject]) -> Bool {
do {
let filepath = NSHomeDirectory() + APP_SUPPORT_DIR + "ss-local-config.json"
var data: Data = try JSONSerialization.data(withJSONObject: conf, options: .prettyPrinted)
-
+
// https://github.com/shadowsocks/ShadowsocksX-NG/issues/1104
// This is NSJSONSerialization.dataWithJSONObject that likes to insert additional backslashes.
// Escaped forward slashes is also valid json.
@@ -184,7 +182,7 @@ func SyncSSLocal() {
execute: {
() in
StartSSLocal()
- })
+ })
} else {
StartSSLocal()
}
@@ -206,24 +204,18 @@ func InstallSimpleObfs() {
let fileMgr = FileManager.default
let homeDir = NSHomeDirectory()
let appSupportDir = homeDir + APP_SUPPORT_DIR
- if fileMgr.fileExists(atPath: appSupportDir + "simple-obfs/obfs-local") {
- do {
- try fileMgr.removeItem(atPath: appSupportDir + "simple-obfs/obfs-local")
- } catch {
- NSLog("Remove old simple-obfs error")
+ if !fileMgr.fileExists(atPath: appSupportDir + "simple-obfs-\(SIMPLE_OBFS_VERSION)/obfs-local")
+ || !fileMgr.fileExists(atPath: appSupportDir + "plugins/obfs-local") {
+ let bundle = Bundle.main
+ let installerPath = bundle.path(forResource: "install_simple_obfs.sh", ofType: nil)
+ let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
+ task.waitUntilExit()
+ if task.terminationStatus == 0 {
+ NSLog("Install simple-obfs succeeded.")
+ } else {
+ NSLog("Install simple-obfs failed.")
}
}
-
- let bundle = Bundle.main
- let installerPath = bundle.path(forResource: "install_simple_obfs.sh", ofType: nil)
- let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
- task.waitUntilExit()
- if task.terminationStatus == 0 {
- NSLog("Install simple-obfs succeeded.")
- } else {
- NSLog("Install simple-obfs failed.")
- }
-
}
// --------------------------------------------------------------------------------
@@ -233,22 +225,17 @@ func InstallKcptun() {
let fileMgr = FileManager.default
let homeDir = NSHomeDirectory()
let appSupportDir = homeDir+APP_SUPPORT_DIR
- if fileMgr.fileExists(atPath: appSupportDir + "kcptun/client") {
- do {
- try fileMgr.removeItem(atPath: appSupportDir + "kcptun/client")
- } catch {
- NSLog("Remove old kcptun client error")
+ if !fileMgr.fileExists(atPath: appSupportDir + "kcptun/client") {
+ let bundle = Bundle.main
+ let installerPath = bundle.path(forResource: "install_kcptun", ofType: "sh")
+ let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
+ task.waitUntilExit()
+ if task.terminationStatus == 0 {
+ NSLog("Install kcptun succeeded.")
+ } else {
+ NSLog("Install kcptun failed.")
}
}
- let bundle = Bundle.main
- let installerPath = bundle.path(forResource: "install_kcptun", ofType: "sh")
- let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
- task.waitUntilExit()
- if task.terminationStatus == 0 {
- NSLog("Install kcptun succeeded.")
- } else {
- NSLog("Install kcptun failed.")
- }
}
// --------------------------------------------------------------------------------
@@ -258,29 +245,24 @@ func InstallV2rayPlugin() {
let fileMgr = FileManager.default
let homeDir = NSHomeDirectory()
let appSupportDir = homeDir+APP_SUPPORT_DIR
- if fileMgr.fileExists(atPath: appSupportDir + "v2ray-plugin/v2ray-plugin") {
- do {
- try fileMgr.removeItem(atPath: appSupportDir + "v2ray-plugin/v2ray-plugin")
- } catch {
- NSLog("Remove old v2ray-plugin error")
+ if !fileMgr.fileExists(atPath: appSupportDir + "v2ray-plugin/v2ray-plugin") {
+ let bundle = Bundle.main
+ let installerPath = bundle.path(forResource: "install_v2ray_plugin", ofType: "sh")
+ let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
+ task.waitUntilExit()
+ if task.terminationStatus == 0 {
+ NSLog("Install v2ray-plugin succeeded.")
+ } else {
+ NSLog("Install v2ray-plugin failed.")
}
}
- let bundle = Bundle.main
- let installerPath = bundle.path(forResource: "install_v2ray_plugin", ofType: "sh")
- let task = Process.launchedProcess(launchPath: "/bin/sh", arguments: [installerPath!])
- task.waitUntilExit()
- if task.terminationStatus == 0 {
- NSLog("Install v2ray-plugin succeeded.")
- } else {
- NSLog("Install v2ray-plugin failed.")
- }
}
// --------------------------------------------------------------------------------
// MARK: privoxy
func generatePrivoxyLauchAgentPlist() -> Bool {
- let privoxyPath = NSHomeDirectory() + APP_SUPPORT_DIR + "privoxy/privoxy"
+ let privoxyPath = NSHomeDirectory() + APP_SUPPORT_DIR + "privoxy"
let logFilePath = NSHomeDirectory() + "/Library/Logs/privoxy.log"
let launchAgentDirPath = NSHomeDirectory() + LAUNCH_AGENT_DIR
let plistFilepath = launchAgentDirPath + LAUNCH_AGENT_CONF_PRIVOXY_NAME
@@ -340,31 +322,25 @@ func InstallPrivoxy() {
let fileMgr = FileManager.default
let homeDir = NSHomeDirectory()
let appSupportDir = homeDir+APP_SUPPORT_DIR
- if fileMgr.fileExists(atPath: appSupportDir + "privoxy/privoxy") {
- do {
- try fileMgr.removeItem(atPath: appSupportDir + "privoxy/privoxy")
- } catch {
- NSLog("Remove old privoxy error")
+ if !fileMgr.fileExists(atPath: appSupportDir + "privoxy/privoxy") {
+ let bundle = Bundle.main
+ let installerPath = bundle.path(forResource: "install_privoxy.sh", ofType: nil)
+ let task = Process.launchedProcess(launchPath: installerPath!, arguments: [""])
+ task.waitUntilExit()
+ if task.terminationStatus == 0 {
+ NSLog("Install privoxy succeeded.")
+ } else {
+ NSLog("Install privoxy failed.")
}
}
- let bundle = Bundle.main
- let installerPath = bundle.path(forResource: "install_privoxy.sh", ofType: nil)
- let task = Process.launchedProcess(launchPath: installerPath!, arguments: [""])
- task.waitUntilExit()
- if task.terminationStatus == 0 {
- NSLog("Install privoxy succeeded.")
- } else {
- NSLog("Install privoxy failed.")
- }
-
let userConfigDir = homeDir + USER_CONFIG_DIR
// Make dir: '~/.ShadowsocksX-NG'
if !fileMgr.fileExists(atPath: userConfigDir) {
try! fileMgr.createDirectory(atPath: userConfigDir
- , withIntermediateDirectories: true, attributes: nil)
+ , withIntermediateDirectories: true, attributes: nil)
}
-
+
// Install empty `user-privoxy.config` file.
let userConfigPath = userConfigDir + "user-privoxy.config"
if !fileMgr.fileExists(atPath: userConfigPath) {
@@ -434,7 +410,7 @@ func SyncPrivoxy() {
execute: {
() in
StartPrivoxy()
- })
+ })
} else {
StartPrivoxy()
}
--- ShadowsocksX-NG/privoxy/install_privoxy.sh
@@ -11,5 +11,7 @@ cd "$(dirname "${BASH_SOURCE[0]}")"
mkdir -p "$HOME/Library/Application Support/ShadowsocksX-NG/privoxy"
cp -f privoxy "$HOME/Library/Application Support/ShadowsocksX-NG/privoxy/"
+rm -f "$HOME/Library/Application Support/ShadowsocksX-NG/privoxy"
+ln -s "$HOME/Library/Application Support/ShadowsocksX-NG/privoxy/privoxy" "$HOME/Library/Application Support/ShadowsocksX-NG/privoxy"
echo done
--- ShadowsocksX-NG/simple-obfs/install_simple_obfs.sh
@@ -5,7 +5,7 @@ FILE_DIR=`dirname "${BASH_SOURCE[0]}"`
cd "$FILE_DIR"
NGDir="$HOME/Library/Application Support/ShadowsocksX-NG"
-TargetDir="$NGDir/simple-obfs"
+TargetDir="$NGDir/simple-obfs-0.0.5_1"
PluginDir="$NGDir/plugins"
echo ngdir: ${NGDir}
--- ShadowsocksX-NG/simple-obfs/obfs-local
Binary files /dev/null and b/ShadowsocksX-NG/simple-obfs/obfs-local differ
--- deps/Makefile
@@ -1,12 +1,13 @@
ARCH ?= $(shell uname -m)
-ifeq ($(ARCH),arm64)
-ARCH := aarch64
-endif
TARGET := $(ARCH)-apple-macos10.12
JOBS := $(shell getconf _NPROCESSORS_ONLN)
GOROOT := $${PWD}/../dist/go
GO := $(GOROOT)/bin/go
+ifeq ($(ARCH),arm64)
+ARCH := aarch64
+endif
+
ifeq ($(ARCH),x86_64)
GOARCH := amd64
else
@@ -15,8 +16,8 @@ endif
.PHONY: all
all:
- $(MAKE) ARCH=x86_64 shadowsocks-libev privoxy simple-obfs v2ray-plugin kcptun
- $(MAKE) ARCH=aarch64 shadowsocks-libev privoxy simple-obfs v2ray-plugin kcptun
+ $(MAKE) ARCH=x86_64 shadowsocks-libev privoxy v2ray-plugin kcptun
+ $(MAKE) ARCH=aarch64 shadowsocks-libev privoxy v2ray-plugin kcptun
$(MAKE) universal
.PHONY: universal
@@ -39,15 +40,6 @@ universal:
-output $${PWD}/dist/universal/privoxy/sbin/privoxy
cp $${PWD}/dist/universal/privoxy/sbin/privoxy $${PWD}/../ShadowsocksX-NG/privoxy/
- # simple-obfs
- mkdir -p $${PWD}/dist/universal/simple-obfs/bin
- lipo \
- $${PWD}/dist/x86_64/simple-obfs/bin/obfs-local \
- $${PWD}/dist/aarch64/simple-obfs/bin/obfs-local \
- -create \
- -output $${PWD}/dist/universal/simple-obfs/bin/obfs-local
- cp $${PWD}/dist/universal/simple-obfs/bin/obfs-local $${PWD}/../ShadowsocksX-NG/simple-obfs/
-
# v2ray-plugin
mkdir -p $${PWD}/dist/universal/v2ray-plugin/bin
lipo \
@@ -103,17 +95,15 @@ libev:
.PHONY: privoxy
privoxy:
- [ -f privoxy.tar.gz ] || curl -L -o privoxy.tar.gz 'https://www.privoxy.org/sf-download-mirror/Sources/3.0.33%20%28stable%29/privoxy-3.0.33-stable-src.tar.gz'
+ [ -f privoxy.tar.gz ] || curl -L -o privoxy.tar.gz 'https://www.privoxy.org/sf-download-mirror/Sources/3.0.28%20%28stable%29/privoxy-3.0.28-stable-src.tar.gz'
tar -xf privoxy.tar.gz
- cd privoxy-3.0.33-stable \
- && patch -Ru configure.in < $${PWD}/../patch/privoxy/configure.in.patch \
+ cd privoxy-3.0.28-stable \
&& autoreconf -fi \
- && LDFLAGS="-target $(TARGET) -L$${PWD}/../dist/$(ARCH)/pcre/lib" \
- CPPFLAGS="-target $(TARGET) -Dunix" \
- CFLAGS="-target $(TARGET) -Dunix" \
- ./configure --prefix $${PWD}/../dist/$(ARCH)/privoxy \
+ && LDFLAGS="-target $(TARGET)" CFLAGS="-target $(TARGET)" ./configure --prefix $${PWD}/../dist/$(ARCH)/privoxy \
--host=$(TARGET) \
- --disable-debug \
+ --disable-dependency-tracking \
+ --enable-static \
+ --disable-shared \
&& make -j$(JOBS) \
&& make install \
&& make clean
@@ -169,20 +159,6 @@ shadowsocks-libev: pcre libev c-ares libsodium mbedtls
&& make install \
&& make clean
-.PHONY: simple-obfs
-simple-obfs:
- cd simple-obfs \
- && ./autogen.sh \
- && CXXFLAGS="-target $(TARGET)" CFLAGS="-target $(TARGET)" ./configure --prefix=$${PWD}/../dist/$(ARCH)/simple-obfs \
- --host=$(TARGET) \
- --disable-dependency-tracking \
- --enable-static \
- --disable-shared \
- --with-ev=$${PWD}/../dist/$(ARCH)/libev \
- && make -j$(JOBS) \
- && make install \
- && make clean
-
.PHONY: v2ray-plugin
v2ray-plugin: go
cd v2ray-plugin \
--- deps/patch/privoxy/configure.in.patch
@@ -1,11 +0,0 @@
---- configure.in 2023-01-05 14:37:57
-+++ configure.in.1 2023-01-05 14:37:40
-@@ -1090,7 +1090,7 @@
- fi
- fi
-
--use_static_pcre="yes"
-+
- # If we have libpcre and either we also have pcreposix or
- # we don't need pcreposix, then link pcre dynamically; else
- # build it and link statically
--- deps/simple-obfs
@@ -1 +0,0 @@
-Subproject commit 486bebd9208539058e57e23a12f23103016e09b4
--- deps/v2ray-plugin
@@ -1 +1 @@
-Subproject commit b9717056b251747149cacb44458fe02e420b9d9b
+Subproject commit ddd7ab46b4aeee0ca8b272efed9d7da3e3a6e52c
Repository Name: shadowsocksx-ng | Owner: shadowsocks | Primary Language: Swift | Language: Swift | Stars: 32,651 | Forks: 7,935
Description: Next Generation of ShadowsocksX
Repository: shadowsocks_shadowsocksx-ng | Type: CONFIG_CHANGE
Comment: Obvious

Hash: 13443fdc5d0226793725e47e7343b9b942a0980b | Date: 2023-08-24 00:43:11 | Author: Michael de Hoog
Commit message: Bump op-geth to new release-candidate (#100)
IsMerge: false | Additions: 3 | Deletions: 3 | Total Changes: 6
Diff:
--- Dockerfile
@@ -17,8 +17,8 @@ FROM golang:1.19 as geth
WORKDIR /app
ENV REPO=https://github.com/ethereum-optimism/op-geth
-ENV VERSION=v1.101200.1-rc.2
-ENV CHECKSUM=acdd027c85cf2edaec198f888a543445821182eaef461bc9d1a32527bd186ee3
+ENV VERSION=v1.101106.0
+ENV CHECKSUM=0273ea3226147ba5b04c1a6eff2d9da48e6bbff3a348b33fe13e7e34d88ba411
ADD --checksum=sha256:$CHECKSUM $REPO/archive/$VERSION.tar.gz ./
RUN tar -xvf ./$VERSION.tar.gz --strip-components=1 && \
--- geth-entrypoint
@@ -64,5 +64,5 @@ exec ./geth \
--nat=extip:$HOST_IP \
--networkid="$CHAIN_ID" \
--rollup.sequencerhttp="$OP_GETH_SEQUENCER_HTTP" \
- --port="$P2P_PORT" \
+ --port="$P2P_PORT" \
$ADDITIONAL_ARGS # intentionally unquoted
Repository Name: node | Owner: base | Primary Language: Shell | Language: Shell | Stars: 68,555 | Forks: 2,658
Description: Everything required to run your own Base node
Repository: base_node | Type: CONFIG_CHANGE
Comment: dependencies change

Hash: 90c2349b35b4aa8f12fbdfdcec01cc1aab189b8a | Date: 2025-02-25 04:04:18 | Author: antirez
Commit message: HNSW: binary distance: fix type for the xor var.
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
Diff:
--- hnsw.c
@@ -295,7 +295,7 @@ float vectors_distance_bin(const uint64_t *x, const uint64_t *y, uint32_t dim) {
uint32_t len = (dim+63)/64;
uint32_t opposite = 0;
for (uint32_t j = 0; j < len; j++) {
- uint64_t xor = x[j]^y[j];
+ int64_t xor = x[j]^y[j];
opposite += popcount64(xor);
}
return (float)opposite*2/dim;
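
For context on `vectors_distance_bin` in the diff above: it XORs the packed 64-bit words of two binary vectors, popcounts the differing bits, and scales the count by the dimension. A small Java restatement of the same arithmetic (illustrative only; `Long.bitCount` plays the role of `popcount64`):

```java
// Binary (Hamming-style) distance over bit-packed vectors, mirroring the
// arithmetic of the C function shown in the diff.
public class BinaryDistance {

    static float distance(long[] x, long[] y, int dim) {
        int opposite = 0;
        int len = (dim + 63) / 64;          // number of 64-bit words per vector
        for (int j = 0; j < len; j++) {
            long xor = x[j] ^ y[j];         // bits set where the two vectors differ
            opposite += Long.bitCount(xor); // popcount of the differing bits
        }
        return (float) opposite * 2 / dim;  // same scaling as vectors_distance_bin
    }

    public static void main(String[] args) {
        long[] a = {0b1010L};
        long[] b = {0b0110L};
        System.out.println(distance(a, b, 64)); // 2 differing bits -> 4/64 = 0.0625
    }
}
```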
Repository Name: redis | Owner: redis | Primary Language: C | Language: C | Stars: 68,201 | Forks: 23,916
Description: Redis is an in-memory database that persists on disk. The data model is key-value, but many different kind of values are supported: Strings, Lists, Sets, Sorted Sets, Hashes, Streams, HyperLogLogs, Bitmaps.
Repository: redis_redis | Type: BUG_FIX
Comment: Matched \bfix(e[ds]|ing)?\b in message

Hash: 8720d087ea3146c1d0f3e658c38cb1dfa661c71c | Date: 2024-04-01 09:39:24 | Author: renovate[bot]
Commit message: Update dependency com.tencent:mmkv-static to v1.3.4
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
Diff:
--- V2rayNG/app/build.gradle.kts
@@ -106,7 +106,7 @@ dependencies {
implementation("org.jetbrains.kotlinx:kotlinx-coroutines-core:1.8.0")
implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.8.0")
- implementation("com.tencent:mmkv-static:1.3.4")
+ implementation("com.tencent:mmkv-static:1.3.3")
implementation("com.google.code.gson:gson:2.10.1")
implementation("io.reactivex:rxjava:1.3.8")
implementation("io.reactivex:rxandroid:1.2.1")
Repository Name: v2rayng | Owner: 2dust | Primary Language: Kotlin | Language: Kotlin | Stars: 38,863 | Forks: 5,828
Description: A V2Ray client for Android, support Xray core and v2fly core
Repository: 2dust_v2rayng | Type: CODE_IMPROVEMENT
Comment: Code change: type annotation added

Hash: e16099adf396900c2a0493d185c154c0d84e56f4 | Date: null | Author: Ilkka Seppälä
Commit message: Update README.md
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 0
Diff:
--- README.md
@@ -66,7 +66,7 @@
* you need to use several existing subclasses, but it's impractical to adapt their interface by subclassing every one. An object adapter can adapt the interface of its parent class.
##Bridge
-**Intent:** Decouple an abstraction from its implementationso that the two can vary independently.
+**Intent:** Decouple an abstraction from its implementation so that the two can vary independently.

Repository Name: iluwatar_java-design-patterns.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: iluwatar_java-design-patterns.json | Type: CONFIG_CHANGE
Comment: 5, readme update

Hash: aac6c7055d710204cabd6e3ccec33b70447db648 | Date: 2024-11-19 06:44:19 | Author: Yuri Schimke
Commit message: Avoid NPE in ConnectPlan (#8514) * Avoid NPE in ConnectPlan * Avoid NPE in ConnectPlan * Avoid NPE in ConnectPlan * cleanup
IsMerge: false | Additions: 55 | Deletions: 36 | Total Changes: 91
Diff:
--- okhttp/src/main/kotlin/okhttp3/internal/connection/ConnectPlan.kt
@@ -93,8 +93,8 @@ class ConnectPlan(
internal var socket: Socket? = null
private var handshake: Handshake? = null
private var protocol: Protocol? = null
- private lateinit var source: BufferedSource
- private lateinit var sink: BufferedSink
+ private var source: BufferedSource? = null
+ private var sink: BufferedSink? = null
private var connection: RealConnection? = null
/** True if this connection is ready for use, including TCP, tunnels, and TLS. */
@@ -152,7 +152,7 @@ class ConnectPlan(
}
override fun connectTlsEtc(): ConnectResult {
- val rawSocket = requireNotNull(rawSocket) { "TCP not connected" }
+ check(rawSocket != null) { "TCP not connected" }
check(!isReady) { "already connected" }
val connectionSpecs = route.address.connectionSpecs
@@ -176,7 +176,7 @@ class ConnectPlan(
// that happens, then we will have buffered bytes that are needed by the SSLSocket!
// This check is imperfect: it doesn't tell us whether a handshake will succeed, just
// that it will almost certainly fail because the proxy has sent unexpected data.
- if (!source.buffer.exhausted() || !sink.buffer.exhausted()) {
+ if (source?.buffer?.exhausted() == false || sink?.buffer?.exhausted() == false) {
throw IOException("TLS tunnel buffered too many bytes!")
}
@@ -216,9 +216,9 @@ class ConnectPlan(
connectionPool = connectionPool,
route = route,
rawSocket = rawSocket,
- socket = socket!!,
+ socket = socket,
handshake = handshake,
- protocol = protocol!!,
+ protocol = protocol,
source = source,
sink = sink,
pingIntervalMillis = pingIntervalMillis,
@@ -247,7 +247,7 @@ class ConnectPlan(
user.removePlanToCancel(this)
if (!success) {
socket?.closeQuietly()
- rawSocket.closeQuietly()
+ rawSocket?.closeQuietly()
}
}
}
@@ -420,6 +420,8 @@ class ConnectPlan(
val url = route.address.url
val requestLine = "CONNECT ${url.toHostHeader(includeDefaultPort = true)} HTTP/1.1"
while (true) {
+ val source = this.source!!
+ val sink = this.sink!!
val tunnelCodec =
Http1ExchangeCodec(
// No client for CONNECT tunnels:
--- okhttp/src/main/kotlin/okhttp3/internal/connection/RealConnection.kt
@@ -53,13 +53,8 @@ import okhttp3.internal.http2.StreamResetException
import okhttp3.internal.isHealthy
import okhttp3.internal.tls.OkHostnameVerifier
import okhttp3.internal.ws.RealWebSocket
-import okio.Buffer
import okio.BufferedSink
import okio.BufferedSource
-import okio.Sink
-import okio.Source
-import okio.Timeout
-import okio.buffer
/**
* A connection to a remote web server capable of carrying 1 or more concurrent streams.
@@ -72,16 +67,16 @@ class RealConnection(
val connectionPool: RealConnectionPool,
override val route: Route,
/** The low-level TCP socket. */
- private val rawSocket: Socket,
+ private var rawSocket: Socket?,
/**
* The application layer socket. Either an [SSLSocket] layered over [rawSocket], or [rawSocket]
* itself if this connection does not use SSL.
*/
- private val socket: Socket,
- private val handshake: Handshake?,
- private val protocol: Protocol,
- private val source: BufferedSource,
- private val sink: BufferedSink,
+ private var socket: Socket?,
+ private var handshake: Handshake?,
+ private var protocol: Protocol?,
+ private var source: BufferedSource?,
+ private var sink: BufferedSink?,
private val pingIntervalMillis: Int,
internal val connectionListener: ConnectionListener,
) : Http2Connection.Listener(), Connection, ExchangeCodec.Carrier {
@@ -167,6 +162,9 @@ class RealConnection(
@Throws(IOException::class)
private fun startHttp2() {
+ val socket = this.socket!!
+ val source = this.source!!
+ val sink = this.sink!!
socket.soTimeout = 0 // HTTP/2 connection timeouts are set per-stream.
val flowControlListener = connectionListener as? FlowControlListener ?: FlowControlListener.None
val http2Connection =
@@ -255,7 +253,7 @@ class RealConnection(
}
// We have a host mismatch. But if the certificate matches, we're still good.
- return !noCoalescedConnections && handshake != null && certificateSupportHost(url, handshake)
+ return !noCoalescedConnections && handshake != null && certificateSupportHost(url, handshake!!)
}
private fun certificateSupportHost(
@@ -273,9 +271,9 @@ class RealConnection(
client: OkHttpClient,
chain: RealInterceptorChain,
): ExchangeCodec {
- val socket = this.socket
- val source = this.source
- val sink = this.sink
+ val socket = this.socket!!
+ val source = this.source!!
+ val sink = this.sink!!
val http2Connection = this.http2Connection
return if (http2Connection != null) {
@@ -290,6 +288,10 @@ class RealConnection(
@Throws(SocketException::class)
internal fun newWebSocketStreams(exchange: Exchange): RealWebSocket.Streams {
+ val socket = this.socket!!
+ val source = this.source!!
+ val sink = this.sink!!
+
socket.soTimeout = 0
noNewExchanges()
return object : RealWebSocket.Streams(true, source, sink) {
@@ -307,10 +309,10 @@ class RealConnection(
override fun cancel() {
// Close the raw socket so we don't end up doing synchronous I/O.
- rawSocket.closeQuietly()
+ rawSocket?.closeQuietly()
}
- override fun socket(): Socket = socket
+ override fun socket(): Socket = socket!!
/** Returns true if this connection is ready to host new streams. */
fun isHealthy(doExtensiveChecks: Boolean): Boolean {
@@ -318,6 +320,9 @@ class RealConnection(
val nowNs = System.nanoTime()
+ val rawSocket = this.rawSocket!!
+ val socket = this.socket!!
+ val source = this.source!!
if (rawSocket.isClosed || socket.isClosed || socket.isInputShutdown ||
socket.isOutputShutdown
) {
@@ -437,7 +442,7 @@ class RealConnection(
}
}
- override fun protocol(): Protocol = protocol
+ override fun protocol(): Protocol = protocol!!
override fun toString(): String {
return "Connection{${route.address.url.host}:${route.address.url.port}," +
@@ -462,38 +467,12 @@ class RealConnection(
taskRunner = taskRunner,
connectionPool = connectionPool,
route = route,
- rawSocket = Socket(),
+ rawSocket = null,
socket = socket,
handshake = null,
- protocol = Protocol.HTTP_2,
- source =
- object : Source {
- override fun close() = Unit
-
- override fun read(
- sink: Buffer,
- byteCount: Long,
- ): Long {
- throw UnsupportedOperationException()
- }
-
- override fun timeout(): Timeout = Timeout.NONE
- }.buffer(),
- sink =
- object : Sink {
- override fun close() = Unit
-
- override fun flush() = Unit
-
- override fun timeout(): Timeout = Timeout.NONE
-
- override fun write(
- source: Buffer,
- byteCount: Long,
- ) {
- throw UnsupportedOperationException()
- }
- }.buffer(),
+ protocol = null,
+ source = null,
+ sink = null,
pingIntervalMillis = 0,
ConnectionListener.NONE,
)
Repository Name: okhttp | Owner: square | Primary Language: Kotlin | Language: Kotlin | Stars: 46,179 | Forks: 9,194
Description: Square’s meticulous HTTP client for the JVM, Android, and GraalVM.
Repository: square_okhttp | Type: CODE_IMPROVEMENT
Comment: seems like code cleanup is done

Hash: 608e87bf8707e377f1c195ae22330e26f67de91e | Date: 2024-09-06 05:32:28 | Author: Patrick Devine
Commit message: Fix gemma2 2b conversion (#6645)
IsMerge: false | Additions: 376 | Deletions: 19 | Total Changes: 395
Diff:
--- convert/convert_gemma2.go
@@ -34,20 +34,10 @@ func (p *gemma2Model) KV(t *Tokenizer) llm.KV {
}
func (p *gemma2Model) Replacements() []string {
- return []string{
- "model.embed_tokens", "token_embd",
- "model.norm", "output_norm",
- "model.layers", "blk",
- "input_layernorm", "attn_norm",
- "self_attn.q_proj", "attn_q",
- "self_attn.k_proj", "attn_k",
- "self_attn.v_proj", "attn_v",
- "self_attn.o_proj", "attn_output",
- "mlp.gate_proj", "ffn_gate",
- "mlp.down_proj", "ffn_down",
- "mlp.up_proj", "ffn_up",
+ return append(
+ p.gemmaModel.Replacements(),
"post_attention_layernorm", "post_attention_norm",
"pre_feedforward_layernorm", "ffn_norm",
"post_feedforward_layernorm", "post_ffw_norm",
- }
+ )
}
--- convert/convert_test.go
@@ -15,7 +15,6 @@ import (
"os"
"path/filepath"
"slices"
- "strings"
"testing"
"golang.org/x/exp/maps"
@@ -23,12 +22,6 @@ import (
"github.com/ollama/ollama/llm"
)
-type tensorData struct {
- Offsets []int `json:"data_offsets"`
- Type string `json:"dtype"`
- Shape []int `json:"shape"`
-}
-
func convertFull(t *testing.T, fsys fs.FS) (*os.File, llm.KV, llm.Tensors) {
t.Helper()
@@ -103,7 +96,6 @@ func TestConvertModel(t *testing.T) {
"Mistral-7B-Instruct-v0.2",
"Mixtral-8x7B-Instruct-v0.1",
"gemma-2b-it",
- "gemma-2-2b-it",
// microsoft/Phi-3-mini-128-instruct@d548c233192db00165d842bf8edff054bb3212f8
"Phi-3-mini-128k-instruct",
"all-MiniLM-L6-v2",
@@ -148,7 +140,7 @@ func TestConvertModel(t *testing.T) {
}
}
-func TestConvertInvalidTensorNames(t *testing.T) {
+func TestConvertInvalidDatatype(t *testing.T) {
f, err := os.CreateTemp(t.TempDir(), "testmodel")
if err != nil {
t.Fatal(err)
@@ -156,40 +148,23 @@ func TestConvertInvalidTensorNames(t *testing.T) {
defer f.Close()
tempDir := t.TempDir()
-
- td := map[string]*tensorData{}
- offset := 4096
-
- td["model.layers.0.self_attn.q_proj.weight"] = &tensorData{
- Offsets: []int{0, offset},
- Type: "F32",
- Shape: []int{4096, 4096},
- }
- td["blk.0.attn_q.weight"] = &tensorData{
- Offsets: []int{offset, offset * 2},
- Type: "F32",
- Shape: []int{4096, 4096},
- }
- generateSafetensorTestData(t, tempDir, td)
+ generateSafetensorTestData(t, tempDir)
err = ConvertModel(os.DirFS(tempDir), f)
- if err == nil || !strings.HasPrefix(err.Error(), "duplicate tensor name") {
+ if err == nil || err.Error() != "unsupported safetensors model" {
t.Errorf("expected error but didn't get one")
}
}
-func TestConvertInvalidDatatype(t *testing.T) {
- f, err := os.CreateTemp(t.TempDir(), "testmodel")
- if err != nil {
- t.Fatal(err)
+func generateSafetensorTestData(t *testing.T, tempDir string) {
+ type tensorData struct {
+ Offsets []int `json:"data_offsets"`
+ Type string `json:"dtype"`
+ Shape []int `json:"shape"`
}
- defer f.Close()
-
- tempDir := t.TempDir()
-
- td := map[string]*tensorData{}
offset := 4096 * 14336
+ td := map[string]*tensorData{}
td["model.layers.0.mlp.down_proj.weight"] = &tensorData{
Offsets: []int{0, offset},
Type: "I8",
@@ -200,16 +175,8 @@ func TestConvertInvalidDatatype(t *testing.T) {
Type: "U8",
Shape: []int{},
}
- generateSafetensorTestData(t, tempDir, td)
- err = ConvertModel(os.DirFS(tempDir), f)
- if err == nil || err.Error() != "unsupported safetensors model" {
- t.Errorf("expected error but didn't get one")
- }
-}
-
-func generateSafetensorTestData(t *testing.T, tempDir string, tensorData map[string]*tensorData) {
- data, err := json.Marshal(tensorData)
+ data, err := json.Marshal(td)
if err != nil {
t.Fatal(err)
}
@@ -355,6 +322,11 @@ func TestConvertAdapter(t *testing.T) {
}
func generateLoraTestData(t *testing.T, tempDir string) {
+ type tensorData struct {
+ Offsets []int `json:"data_offsets"`
+ Type string `json:"dtype"`
+ Shape []int `json:"shape"`
+ }
offset := 4096 * 8 * 4
td := map[string]*tensorData{"__metadata__": nil}
--- convert/reader_safetensors.go
@@ -49,19 +49,12 @@ func parseSafetensors(fsys fs.FS, replacer *strings.Replacer, ps ...string) ([]T
keys := maps.Keys(headers)
slices.Sort(keys)
- names := make(map[string]struct{}, len(keys))
-
for _, key := range keys {
if value := headers[key]; value.Type != "" {
// bitsandbytes quantized models are unsupported
if len(value.Shape) == 0 {
return nil, errors.New("unsupported safetensors model")
}
- ggufName := replacer.Replace(key)
- if _, ok := names[ggufName]; ok {
- return nil, fmt.Errorf("duplicate tensor name '%s' was found for this model", ggufName)
- }
- names[ggufName] = struct{}{}
ts = append(ts, safetensor{
fs: fsys,
path: p,
@@ -69,7 +62,7 @@ func parseSafetensors(fsys fs.FS, replacer *strings.Replacer, ps ...string) ([]T
offset: safetensorsPad(n, value.Offsets[0]),
size: safetensorsPad(n, value.Offsets[1]) - safetensorsPad(n, value.Offsets[0]),
tensorBase: &tensorBase{
- name: ggufName,
+ name: replacer.Replace(key),
shape: value.Shape,
},
})
--- convert/testdata/gemma-2-2b-it.json
@@ -1,312 +0,0 @@
-{
- "general.architecture": "gemma2",
- "general.file_type": "1",
- "general.quantization_version": "2",
- "gemma2.block_count": "26",
- "gemma2.context_length": "8192",
- "gemma2.embedding_length": "2304",
- "gemma2.feed_forward_length": "9216",
- "gemma2.attention.head_count": "8",
- "gemma2.attention.head_count_kv": "4",
- "gemma2.attention.key_length": "256",
- "gemma2.attention.value_length": "256",
- "gemma2.attention.layer_norm_rms_epsilon": "1e-06",
- "tokenizer.ggml.model": "llama",
- "tokenizer.ggml.add_bos_token": "true",
- "tokenizer.ggml.add_eos_token": "false",
- "tokenizer.ggml.bos_token_id": "2",
- "tokenizer.ggml.eos_token_id": "1",
- "tokenizer.ggml.padding_token_id": "0",
- "tokenizer.ggml.unknown_token_id": "3",
- "tokenizer.ggml.scores": "0872465d173867d755d3ee728f882b9dc2057a0bfd596fe1e3d131522f1250d8",
- "tokenizer.ggml.token_type": "8d40143b3477df77beea4139420335ede458bf5e14102f01b0170197b55da8d8",
- "tokenizer.ggml.tokens": "c6e66de1841f04de8b8d236d461ab720a4c9b9b5414dc293a09c6e10eab45fda",
- "token_embd.weight": "64a9d30707e659e2e673656d71f5aef7a9fb9fd83bb9a77558dfc5abbe218a05",
- "blk.0.attn_k.weight": "d8b4437c5edb3cddf6af9987038e1bb2b191c4f0fce0e160d2abace717f5d5d7",
- "blk.0.attn_norm.weight": "1eb73e3f7aa8e502f6ca31cd19efbb8e4fd9a89692e13e48ac8205545a7fa7e8",
- "blk.0.attn_output.weight": "39e7b78e57d356a22dd89ce1c4d7163b970712ba756545e1703f97866cd2192e",
- "blk.0.attn_q.weight": "795058e23b6109febd9d55c89e1eebe6af0714ec8c56fd86a160876a6135ffe8",
- "blk.0.attn_v.weight": "0cd6e583d1887c020472e961bbb113fe5a0d23ae2f1c2c876fc366cdb7692b52",
- "blk.0.ffn_down.weight": "51eb4d962189e945a84e94e0dc1aad3f8f90cc1a11e18029670afcd0ea0acb1b",
- "blk.0.ffn_gate.weight": "9811a29b8ad48432925897ab21dfcb13c5cbd372aeccbbefca9b7866883b4ce3",
- "blk.0.ffn_norm.weight": "92cbf4652ef503c1de5b10f2be00b3fcf00100980cb3baa8f3013a8d8bf3d851",
- "blk.0.ffn_up.weight": "af87de21746879483ed1b374cdd76b19ba11ca2b6dbb1beba98efdf3be3e8077",
- "blk.0.post_attention_norm.weight": "32e135f1f258ffe407018899e39af1725d59d66d60022b9a21575ba160e0357a",
- "blk.0.post_ffw_norm.weight": "ba286f5ac11b07fbc986173708c66f1920427be5a6d108af38fa0a837c1c8eb6",
- "blk.1.attn_k.weight": "51584435552051f7fade76beca582b3f7190cf7fc07adcf527c2774d4b1c3901",
- "blk.1.attn_norm.weight": "6833104c7fbf35a7e799ae56c262b97fffa14789642aee14381b25acd21ed80a",
- "blk.1.attn_output.weight": "14c39481369087bf292ac9a3ab2ef166f9fe376a9f90c246653213ef264febdc",
- "blk.1.attn_q.weight": "443f64ae2229f857c69d6bebb7800b685786cb77884c3ae19d4286aeed081325",
- "blk.1.attn_v.weight": "0df482de2038f1e4c8a7733ac0ddb69ad90759dab5968b942af0155588de4c4a",
- "blk.1.ffn_down.weight": "66f30763a8bbbcaea609a0087ed75fadb5e771c06378dd2cea94cf17e492e8cf",
- "blk.1.ffn_gate.weight": "a7151bff00a545fa18b2c92dcd2a14572ccf9beb957a6c494f1374e8ebe174c9",
- "blk.1.ffn_norm.weight": "e197d71ea11b5276bc0167d2663b88089b3ff42b47ba91e85f6c5d95f6306435",
- "blk.1.ffn_up.weight": "57c182e0b14cccd1350d388f0c616991702e74281db54637451b70f4ccc24f9b",
- "blk.1.post_attention_norm.weight": "3c56f837168d784c2d8bac247c130bdca6610c095c8da4558c536ccad7605609",
- "blk.1.post_ffw_norm.weight": "d2a51d320fd01069dd7ccaa7082f16a7faeb671885607d7900b10a89c354d0fa",
- "blk.2.attn_k.weight": "bc103c818192de7ce36caaf89dc117be4df13fb902e0bd9a23c64edace5df9b6",
- "blk.2.attn_norm.weight": "0f2503aa126083a5d6ac72481be1ef66c6014705b573682b35bd864e4749a3d5",
- "blk.2.attn_output.weight": "05fcd4a1226e482f91803a266f72caca887a93e63c2d2ba5611ab3c68d38743a",
- "blk.2.attn_q.weight": "6a10b5c2fd423d1e4c4fd60fa8c154a0159b6b2501ea79cae2ef19f45a674e5e",
- "blk.2.attn_v.weight": "3cf891945a1f8ae7cc908a5c6b729ff5b70f4436c5ffdbf245cc0ed4cc19cd1b",
- "blk.2.ffn_down.weight": "ea204fd04e0d2fc728a9861a459216bbfec629c152004ba625f52cd8837bd51e",
- "blk.2.ffn_gate.weight": "3a3518729f1b8b64a82b8792f33987db5418fdb094be0263c68f146a5c38de54",
- "blk.2.ffn_norm.weight": "754ede678b725de41a34b82f0edf7688b5c065be7c0d46df6f7ad9430d986884",
- "blk.2.ffn_up.weight": "ffdcb88439f5828ffbd9fc844b03ff91637b790b9838097258cc3ae75935720c",
- "blk.2.post_attention_norm.weight": "4b3f53b7ba26e8c36b2dfda3b7e5fc4b1065257cefdea235fc7df9af130ac2fd",
- "blk.2.post_ffw_norm.weight": "e550369e26b8485e2b54ad34b34bc98af5494287dcc513c2c39cf1eaa5b89d07",
- "blk.3.attn_k.weight": "89f24ea450e37d9e95757651a83205c085d81b354ee9489dd6310a391d8409f3",
- "blk.3.attn_norm.weight": "24e2ea662b7cb822b4ca5cd61bc17f2709f406d990ec3b4a0dac1cc112db45cf",
- "blk.3.attn_output.weight": "ac4dad69473c6e3fac56669212cadd8c34ecc5973d945972e974d94805334967",
- "blk.3.attn_q.weight": "b6a9c9a7d4722b9096631c65de62228dfddca6e26edfe6af7fce01e116ef0f4c",
- "blk.3.attn_v.weight": "f272a960a40093942309bc342a379984cbacec2d7bc64428db3f64e6b1887ed4",
- "blk.3.ffn_down.weight": "c0188ba50d8228805982029c277fc0e87aa57473b8363037c648f6d006ff828a",
- "blk.3.ffn_gate.weight": "a04aec1561ee6c0fbb18c3db49dc62fb533619cf697fd548cbf2279761aaec3b",
- "blk.3.ffn_norm.weight": "bc053837d44087ec05eb5d9458357b2a5be787789b19cdbbdc694b57697f99a6",
- "blk.3.ffn_up.weight": "b3ce8b274f20796d3b1a7c08ba27a919066f9de89a782faa544c4a8d6bea1382",
- "blk.3.post_attention_norm.weight": "9c922dee7a7df5667289e2788e60170238239cee2dfdbbd9e435763f9f416718",
- "blk.3.post_ffw_norm.weight": "b682544ac953ad2e0b49027ed8916f2e9d1aba5d1587bb4127ac703570c7a03a",
- "blk.4.attn_k.weight": "143b0cbb4b787b95c2b6212374410e32173ccef2adb914908a2f89a7916de512",
- "blk.4.attn_norm.weight": "5668f60491b780273745192662d02c9a92a4f692b29d16aa0bbc7413fec4f85b",
- "blk.4.attn_output.weight": "b9f2bdb68be1e0cf66dd19f8fa2afb105910ad2ef394864cb32cea8f8944e0d5",
- "blk.4.attn_q.weight": "ddcf1343dafbc2dfcd0b8741225af22fe4b54b2becce29240bd01c34265d126c",
- "blk.4.attn_v.weight": "6dc7074366e7ed52d9f48c594dcc85bef738e096276cb99d28228c89eecc5b9c",
- "blk.4.ffn_down.weight": "30334ffc59ce343cf2a1b973174acb7722823463adc07e19a99bd0f404bc9906",
- "blk.4.ffn_gate.weight": "890f7c8af208d63b28db52c4b8c16c2288a382d87ff5a6a6d6b0a5b3bf27e6cd",
- "blk.4.ffn_norm.weight": "ff0316cc7847221eb86a90c1ab441d4ee61553d410c66414a7755021b3b12448",
- "blk.4.ffn_up.weight": "6af97d113f91564c636734f215e25ee602d48eb045458f300b3ec7582be0f41d",
- "blk.4.post_attention_norm.weight": "69438f231e105e68216b078bdeb35a7cdc8b12c4e2845e18ecf4c8d361d6a321",
- "blk.4.post_ffw_norm.weight": "0fd535da78bcf2b32c95b05b2b83dc49817393765be90d8cc1ed3d56f47b68ec",
- "blk.5.attn_k.weight": "0166eb3c6d20dcf3d3c169e94caa8dee057535bb525e29f698fb6f8844f18a6c",
- "blk.5.attn_norm.weight": "a7808f27f164023d5cde2be00fc23cac6c71aa0ddeb60bc23e12411b80087672",
- "blk.5.attn_output.weight": "8b65b2027a0842b68c5308f91d6a31de9599d794157d77df8418b19f9e0d9334",
- "blk.5.attn_q.weight": "966bc626ef2c2394d872087a41c126bb1b67d1d5f6de920204ef5e5b16c34003",
- "blk.5.attn_v.weight": "9a362aef3f4437fbf0ef6e1ba785f3329c3db2960f93fe36547d2795e9c254ea",
- "blk.5.ffn_down.weight": "63e53541d34197720c06f297aa8142ac6b6eec002c7987b296f26e8b1400f931",
- "blk.5.ffn_gate.weight": "d9591fdd32f783e0fc26e20d5d587ee8971ac8ae2e4c818c6eac1c125c7c7f37",
- "blk.5.ffn_norm.weight": "677334cc60ecce3a7f4ab3acda15d359353d7358872f614ad8914e3780e9fc6e",
- "blk.5.ffn_up.weight": "a63764110e1c655ffbd55af0669b2dfe4cc29d0e198d33a8e5426461b08a85f7",
- "blk.5.post_attention_norm.weight": "c55499f859b2c0a7f5cabceaae47309a5ad38bc29d0f4a8db81f1357023162a9",
- "blk.5.post_ffw_norm.weight": "82752754665f842418f3e302cb5f43d1e0504dcd124c4b8ddb77018b2c793837",
- "blk.6.attn_k.weight": "e20a5f0d6c807273c8d491439566b428497ac02097cf0aa55e33748c28e14be6",
- "blk.6.attn_norm.weight": "2c6ba42fd3c73d72073ced03a32dd28d70a89ed9bbbc8fea1ba03a7ade951e6c",
- "blk.6.attn_output.weight": "4de7c5c2f4a133a266e17ed8c14c52959466b54cc7ab9e19f789a33b4850f284",
- "blk.6.attn_q.weight": "56462d921800e6b8cd2213fef04c4ff16d728905cb2f4c58e966d0a053a3b0ae",
- "blk.6.attn_v.weight": "b758dcbff769d6240c2245ede1dbc62c4170a67c77458e866312589220fe29af",
- "blk.6.ffn_down.weight": "582247fb3c2bf687cbe9413fe18d18ad47bef4b65df7d78905e10335c6134764",
- "blk.6.ffn_gate.weight": "3035444d5286aefb7a6d04e55bc27e1fac7cf895cd5be02319a431b8e047b4ae",
- "blk.6.ffn_norm.weight": "e582d24c66e01b96faa20ce6adfda3d8583b11e809bff89969927398175e369a",
- "blk.6.ffn_up.weight": "6f4b7bbfedeacf61a4866ae0616c4ba6c9e856662e8f00ae6aaec7f52c53e7b4",
- "blk.6.post_attention_norm.weight": "8fe51b50bd677d21586aecab0b565c4bf9fa68ad50bfe366f45e8fea3c657ca8",
- "blk.6.post_ffw_norm.weight": "81ba3cb4c2bf5c546b86855b7a885d3fafededc67eb3a35cd3598b03c9e26e65",
- "blk.7.attn_k.weight": "2e044179cdcae0946708c86bfea7aa0391e1f7e2a09b33fca035d384cc3ca758",
- "blk.7.attn_norm.weight": "94b48c546b046803c60e75a3acb17a356b710735989938021b565f68df9b4985",
- "blk.7.attn_output.weight": "65709b4ad7a581f4d75793d39d4032a359f6bcc0c3835205242a0b99e5b66824",
- "blk.7.attn_q.weight": "8ded993c95d1f7caf201ceb6fa035cd6ed6d351b50b999fa9355dfee9486cb5b",
- "blk.7.attn_v.weight": "c92d5e2d2d48397542bc03bea25bf39154075e66c5bb1ead85188505aa04ae91",
- "blk.7.ffn_down.weight": "e8ba8fb57208805ef1dc23cd7c86e9a2d1fb7c52c3940d292cd5bb2eb24b3fac",
- "blk.7.ffn_gate.weight": "f0f06d6a2e06c5ac252083bc61d05c814e6289d3f4e4a87d2f06918254c02c36",
- "blk.7.ffn_norm.weight": "ebf8ef775f72624148e09d68a4332187a7a5020c521fe0623da1cd3485ad33e0",
- "blk.7.ffn_up.weight": "a554adc4fc7122c247c77670e169916ba1794c787b5be30a2b36705138f1f746",
- "blk.7.post_attention_norm.weight": "3aa6bc21d85c3a0c12b964e82b12feaedfdd13130c3cd2229228e24e0967ebdf",
- "blk.7.post_ffw_norm.weight": "508bc7b19ee8ff08f0007c890133a462fc57c7e72b16ee8f6dd64def264ef876",
- "blk.8.attn_k.weight": "363c8e74056642fe9e7c2f3f9769d57319cd3fa0a6022810189ab8d894322885",
- "blk.8.attn_norm.weight": "685b49a1f1acb169f4df0bdd8e3de6943f3033cebad14b898a72000595610d92",
- "blk.8.attn_output.weight": "7bde571e4efef1c6a6143f0526721dfb59e0a0ea0e1a3616a322b2eb937efa48",
- "blk.8.attn_q.weight": "fc993dbc1074c28a0e1d85e5ab2f4ea6a9c6c1affe7ee56027000a275daed9b6",
- "blk.8.attn_v.weight": "281e8791d3aef9b3864f1cb054da0ae0c2fef4ce0a58b1bad8bc136b2fa0f62b",
- "blk.8.ffn_down.weight": "b1164a2578a7f87ed99c2bbc76c5dfbbbc6a1a803605391acc3f320fc989ffd7",
- "blk.8.ffn_gate.weight": "6b39a3b3aaaa79aee61416b54d62160b9258042650e61c6b47bc77c2dd17daf3",
- "blk.8.ffn_norm.weight": "17ea1362c72da27f12bc936500492035bdef3fd8f940cb12b57f37d42ba8ecb1",
- "blk.8.ffn_up.weight": "bc3a7c47afc440d2bdf8fbe9ddf2c9220467472c60c8b4ded8c0f181470ec96c",
- "blk.8.post_attention_norm.weight": "5c506204e00411ef9c8b4134d40eedcc19fffe68dd0af7d7cc49dcabf2dfac7e",
- "blk.8.post_ffw_norm.weight": "002faec235c3678864e2901eed275ce4e9dc229164a91c9cd4c965142ba62305",
- "blk.9.attn_k.weight": "0bab39d8c237f1b6d0010db40467142625a9e6f2e0e4c49a56c12b41e4e0b1fa",
- "blk.9.attn_norm.weight": "de5f38e873b17f07aa7598831b89cc1cae2c9bc3eb2e042ee9af059d2563e84e",
- "blk.9.attn_output.weight": "8a8184702c25a62df9ff309c0c7badc8587208523b2be3e8fa90ce7080573e6f",
- "blk.9.attn_q.weight": "7c961b2431b09ddf95377acd07201cb91bf13d9cd3ae0f2c25c7d6a0358d9f50",
- "blk.9.attn_v.weight": "e22d240cb4743067033e659cbf210ebe2ebbab3e1dea6ccbe5eaa982382ca038",
- "blk.9.ffn_down.weight": "a426f81210f03d6ad53277416e1fdcdf37d8065e4817613edaf6c67a343426be",
- "blk.9.ffn_gate.weight": "a82eba825cb77b8e64f85ff99ede2fc71bc9b01751eeb17e9e6c246ee12ea62e",
- "blk.9.ffn_norm.weight": "1a97f9b1302a3a326d534c5c3fed2db6db0ae45fd0edd381a3e4fc1c75d81030",
- "blk.9.ffn_up.weight": "5f20bac2bbf03bb42adb92fbf99561651e1edda57e0b61935ac7f6c08c0ed7cb",
- "blk.9.post_attention_norm.weight": "9f9866d13988e1946b1e1c80d9374a92a6e3be33748f8eaed3e126d1e1a4c796",
- "blk.9.post_ffw_norm.weight": "a6896dbf698db4dbbe5dbf12417d4fd80e9cad0c539c858892ec0aa5b046bb58",
- "blk.10.attn_k.weight": "ca8446e5d21ecd4e6a70dca8d321be480be4fba94d70cba065205436feb44270",
- "blk.10.attn_norm.weight": "4f41fe290e8f21f63b82151b6cce94bf7318d121468816b0c58af0ff7c1658ab",
- "blk.10.attn_output.weight": "c626d2e9681c5c941bbde43dddfae1a8d4986bf2be4470857bc8e8bd7f869044",
- "blk.10.attn_q.weight": "1e61b210a13a429977325cf15d781ab77d604cfa862f4270329cbd94237d5835",
- "blk.10.attn_v.weight": "8ff8d3e3f058ec3b35ada1057f2ed59c06494d0e0be6a8dc3ff9edf9f0e1a115",
- "blk.10.ffn_down.weight": "bcebc04219f8081a5f483e58103c0ddbbbc631a0a54fd6dd9d55778e041f70ee",
- "blk.10.ffn_gate.weight": "7a23a1e620ef871384ddf9611ccdcfb893fbf013cc203ac8e72f745420f1eea0",
- "blk.10.ffn_norm.weight": "e3a375e43c349a1c6c66c22328e513cc1af3137fe839e43dc8e9be2f65914fd7",
- "blk.10.ffn_up.weight": "5d182e7c94369194fca5f19cbbe668a999911e57f3d363bc7fb6088428700cb9",
- "blk.10.post_attention_norm.weight": "b841c6308296e8984f3c5f549c6e3a242f4b3e19141e1f54cc08de9c46759c09",
- "blk.10.post_ffw_norm.weight": "9d66fa05b5c940208f634f5053d809094c99a2a10a1d1e8847c8281fbd99fb49",
- "blk.11.attn_k.weight": "14adf24ebb2bb17b336ca81cec3e690fd854782f4440ca6c66cc1d7e7bf1c850",
- "blk.11.attn_norm.weight": "2d2213f311f50414702b5b34f22aafb9d9a0b6787243e7578562583dc40ad195",
- "blk.11.attn_output.weight": "de1f14cc2a7fff00cf11b229f0576999205f17b9536e97abc9d6de3cc79a7884",
- "blk.11.attn_q.weight": "2bcc5c147524003109ece0be08b89ac8b25baa71416ffa76573c6c052ffc6eea",
- "blk.11.attn_v.weight": "2e6ab8573070c22dc1e0d7aebe4d52123226dacf7822dcce06fadbb38fb036a4",
- "blk.11.ffn_down.weight": "1b86902f4e36868421e5228b9445051f8290b292df22a6d1af836dcecc1f25c3",
- "blk.11.ffn_gate.weight": "e756e8081bd0a16aea4a9ef5076ad102113524f7a3d50a3a77aaa7f7938b63e8",
- "blk.11.ffn_norm.weight": "6913887267be227cf9d1991a3dd8db2e7e74bb9b5fbdfcb9ac954fd7d7b95b3b",
- "blk.11.ffn_up.weight": "619a3ac0609ebdf42c3fb2b6e4b1db48df79e6dd8418d7ab8f1bbff13d8a6a50",
- "blk.11.post_attention_norm.weight": "e4b4ba92cef7b6a78407e8ab1b0307d47dac6c3df7b6817e28038317ff662d7e",
- "blk.11.post_ffw_norm.weight": "40aceeec58cb855f0c158c9cc217168fcd5d0e735567d587217b1d78df17bc5f",
- "blk.12.attn_k.weight": "c54c5a4d4892522022d1aa2204cfc624f0b4042caa536e678967316293fe5cb1",
- "blk.12.attn_norm.weight": "7cd2ef58298569ffdf244d9b390f3917245276c8206e5780af5f96d8c0bbb446",
- "blk.12.attn_output.weight": "85495ef9cc8b3deb21f741bde463ff6493acae2be51f02ecdeef952cbdec3375",
- "blk.12.attn_q.weight": "d19383f83fd119bfb8c0280c9515705c11d8e7d502019fcf8f49efeef0d106d0",
- "blk.12.attn_v.weight": "869ac669ba49531d9128892a0e27cef15de508ff40cdf80cc1681dde50d09204",
- "blk.12.ffn_down.weight": "578f39f8f9fc2f09138afc884a952d7cc3a9a31de4216acd10e88e19e0b75f8c",
- "blk.12.ffn_gate.weight": "e29a0186bc6c4a0720246306e922d3a83f777dadcf4ac80bad468287031cc8b5",
- "blk.12.ffn_norm.weight": "e1ee95c6584b5cb57fcf1db8ce2bcc03aff91eb389238c094a61c00dde93d1f2",
- "blk.12.ffn_up.weight": "2a826f06d7cdfb3edc6ae250ff44363ef77a2a9cdf96313e23a331b99ebfa17d",
- "blk.12.post_attention_norm.weight": "4bafc7699b948d5cbc0d3e09b418b06c6abc4651a61ada9609d9a2f21c7e5607",
- "blk.12.post_ffw_norm.weight": "bbb8c34a7176bb1a49f9fe2bacca0bd26b673d52c0835b2e90fa11f2962f077f",
- "blk.13.attn_k.weight": "ffeefccfe8255d1b694382012ff4134eee5fec9d9491c8d0ff0a13832d1a37e8",
- "blk.13.attn_norm.weight": "35713726529e3887c4135a88e86e8a4d7270ba5b9f2d1ab462622fbf40a7cdce",
- "blk.13.attn_output.weight": "0d60b7c5cd71190a9ef4b873b0f516be15447c32d83914db2794b14592b0b460",
- "blk.13.attn_q.weight": "8296069e65bef794cefc61257fc65789b3cb22955e30f3df129205e5041b2222",
- "blk.13.attn_v.weight": "ca0f4ab9d16a748fc643a5c0c7a19826a811bf2a4e7316a8c935d4bf0ce8abc6",
- "blk.13.ffn_down.weight": "d5514e0c8e7b3ed1cbcc1605eb5be1733b6ab3514cf8a0508fc72f7d05ed8bcb",
- "blk.13.ffn_gate.weight": "8108e517a82e08a3aefbbd267bfa50a1668f92a76273280ce8a6bc1f6dd61521",
- "blk.13.ffn_norm.weight": "5fcb6132d2134bf1f835b904a99820fa501dbc57d2224129f7098bf3cabc1d36",
- "blk.13.ffn_up.weight": "6d744b7cd390a3cae3aa350dd379b81246acd056a2259996b6aaadece8465ccc",
- "blk.13.post_attention_norm.weight": "e08b14698912509790e9575b8676971fbb0a4d82d719367e3756c0d0c4ab8cc0",
- "blk.13.post_ffw_norm.weight": "2b196e4450fc5f1e7367b2cf7fe33a15fe919fbcdd861d11002346f16e980535",
- "blk.14.attn_k.weight": "120e5f48d7268dfd9ab5f4bc9cc57a7cec63ea9635f56b80d435eb22936e9483",
- "blk.14.attn_norm.weight": "146367bcce4db72cc894419a2e0145a6f533507dd68e4739c10ee480308c401f",
- "blk.14.attn_output.weight": "720fa0165e756876c5cb6ad9e2780dd910390933f3f8849e5add5da04266650b",
- "blk.14.attn_q.weight": "f5183466f56219ca1aca52d8b82c2d966a4198fea40fdd6b39f4d8b06ca2a6dd",
- "blk.14.attn_v.weight": "24f8ea3d5512cd37c43c8329cb0da0c90d1895aef763ac2dcee3fe5157ec50a2",
- "blk.14.ffn_down.weight": "e29960965b384ae5ab3d898a4dbaa8fddd28fa0e477ac28bcac49dec12a5ac67",
- "blk.14.ffn_gate.weight": "6d0d6a74bfe9692e8f8eedff0fc34fc4fa1c8687794f35f2e2b033ab2d7510b8",
- "blk.14.ffn_norm.weight": "f7036c1a9a71e046c9d2af16e9218fda5dbb0f7241ab44747abed1f0f9d602ca",
- "blk.14.ffn_up.weight": "7d69ea1424007ffc9c12247dd0308c616e93ac02a59ec341cfa48f92d6ce3b10",
- "blk.14.post_attention_norm.weight": "65b9712834d9445d4236bec362f3fb795c20d60c541b3dc6dbb7914d9b493e41",
- "blk.14.post_ffw_norm.weight": "9c6a8da2e4e437d5cfdf3b9097e9f8b64bf07946a048badec20f4d374613f38f",
- "blk.15.attn_k.weight": "864bc618303a0e4ee67fb1d5e751de61e936cd51e96669dd86f8cd08f2305045",
- "blk.15.attn_norm.weight": "f9f4187da6eeadc2fc5921d8fe669741697d16c13d71e4aaeb73b82f50dc577e",
- "blk.15.attn_output.weight": "ce2419a0b097036b2a31f2f4ad731d5814bcc2ef4c511786e24471e5eefd273b",
- "blk.15.attn_q.weight": "9539db5a970d11ebe99722d1e13fcd635e250033630811efe583d2f97778e4a9",
- "blk.15.attn_v.weight": "1c834b48ccd88adaeabb7d8bcb6be0bcd6d5ac1354ce88fc28f19a1a96b81ab3",
- "blk.15.ffn_down.weight": "bc1f97a65dde6fa2c1e5397afb612266944b343f2eaa868b635ddd25829f8a42",
- "blk.15.ffn_gate.weight": "1b14529d57056b79037f6cb5008132e62cc35992353b38dda59572274623103b",
- "blk.15.ffn_norm.weight": "9af77458de9ee55c66f93865759f9c2c398557f94f3fa8fa6af30543d7339cde",
- "blk.15.ffn_up.weight": "41d524a26b61a9595816b4fd53cf57ef50a702e4ef32933ff6136dca9136a267",
- "blk.15.post_attention_norm.weight": "c60a03cd0e63a7db5c80015e58e9b97ba2208caa19f66a6fef5c4447eca900ce",
- "blk.15.post_ffw_norm.weight": "34f7f9f96769215bbc3d17084df091864aef96a6645b7d0b3b7d9bd92f1a4b0b",
- "blk.16.attn_k.weight": "7e27240d9f3a8c6cf0f4a980113d43234f514eadc3e3e1792b86efb29ffb1a6d",
- "blk.16.attn_norm.weight": "af798acc0899282a30448edec48223b3e8efda177090273e612d8eca5e377301",
- "blk.16.attn_output.weight": "79df39a3709d3d53e84146291e0944a7a653d06705293d9ccb5648dceadb432c",
- "blk.16.attn_q.weight": "db58a1c3b83ad294804e5fd7321005719e200659173466df5a52a182b80b7165",
- "blk.16.attn_v.weight": "2af6d48cbaeb225b5c1a704f76abd89c8ab1521417695b112b4dcc2cbd39b74d",
- "blk.16.ffn_down.weight": "fc1c813eb5e7da3d6194569d6cb21602fc6eff2dc8e1b0eb753f2d5df148189c",
- "blk.16.ffn_gate.weight": "7a80bcbc42464bd55df4814a6edbd7b5c153e0428323bbe49de55e2d2add33e7",
- "blk.16.ffn_norm.weight": "2041685ee926d30f3f2ae4ec35b5688f1cd834167a6359a7d4057eac804c58b2",
- "blk.16.ffn_up.weight": "8da4b718973ac1d43b928829bc45e062fd101984d6c98dd825bd7c5d08ebfbe3",
- "blk.16.post_attention_norm.weight": "975c48fe680a6167438a106140a8872eee7765191f152d80e3b8ddf47693e095",
- "blk.16.post_ffw_norm.weight": "4de2d4d483acfe4fc77860ea929025df2f4e15c10729413f36a18c94eaa6d689",
- "blk.17.attn_k.weight": "f937e61f0af8c4cd98ee742648eb60e02e579683e21d421071295a3b70aebaad",
- "blk.17.attn_norm.weight": "c3270583ed28b7e423f5b170c59113234f258169b93a867d9274f4c10b7cb115",
- "blk.17.attn_output.weight": "b8c1150e81e685e539a5dcf2c19047a24eba2b281fabe166674b1d71ef4612ea",
- "blk.17.attn_q.weight": "c255100ae2011e7dc7e3bf3bc3ccd96d859fbb98581cae993d7b82c1ba8e8b39",
- "blk.17.attn_v.weight": "5830bb0a555984c6485348067f70b5d22ae337c011aa9248dac2ff4c95944551",
- "blk.17.ffn_down.weight": "8ff9a7cccaa3776434a9d895aae4fb5c36c736bf2ec98784226b4c234940fbb0",
- "blk.17.ffn_gate.weight": "1b52876739712831c272911533da206f407b46034a1a4ae8a88c1f96b6bd5747",
- "blk.17.ffn_norm.weight": "d0e16ba5e87c91b545334e022058c7d03849665c3b1a6298771b656531366b66",
- "blk.17.ffn_up.weight": "4dd6211d01dbebbe21052708eddc242b082a58b5f18ed16479e17987c1d3432e",
- "blk.17.post_attention_norm.weight": "6f49c775c7417dade77ba8268a0f8441c1e5ec28b5d7e4dc5ed07a04d04600c8",
- "blk.17.post_ffw_norm.weight": "b91a0bb2e6679e9c9be06ad323adae441d00a3d673efb19d7c4954be2aa84b27",
- "blk.18.attn_k.weight": "22b565ace1b4da8b33865a58625be1d90beea9891f29686a69fa9cf7c93217db",
- "blk.18.attn_norm.weight": "3e0160d7063c8753de65d2356a66648e47d921efdc5c917efb8209892120f8db",
- "blk.18.attn_output.weight": "e3180f0bb4ca90b31e9b08158db38e332de62dfbaefe34aa94cc316409331e09",
- "blk.18.attn_q.weight": "f3a5a83614c3ba7ea41cdd5b1b0819a241ee2a951a381ce4a9e001d3f700ed8f",
- "blk.18.attn_v.weight": "f3350a5984fb951fc738adcf78147e6d812ff1c576670c460cafc99c253c1654",
- "blk.18.ffn_down.weight": "9e9d09b13a33525e14bdaee6efc65c551ac7cf7680e534b940ab122a3a7c1ac9",
- "blk.18.ffn_gate.weight": "ebaec8b4b578a2e8d815baac12f1675c208f80c68074d5a18288a2e1a60680ee",
- "blk.18.ffn_norm.weight": "33e7687c53a242f2f8dc7093a491c97b18d4a5a8c14d183f02bd586a770f05aa",
- "blk.18.ffn_up.weight": "78a1816662378ce56cc870e705174492781897b3afd2d4d97a51f10f2f2987c1",
- "blk.18.post_attention_norm.weight": "a58dde3f12df3e94cbc27d87c8ea86f89af8a388a506446ff6758f05399b05fc",
- "blk.18.post_ffw_norm.weight": "cebf90cc143577d483cca27b032dfd82031ee59bdf17c0e2cf60a0a3ad5bf996",
- "blk.19.attn_k.weight": "4683375d0599ac9e2232196aae1e90af13a14cae26e865465de5c8e257bb2055",
- "blk.19.attn_norm.weight": "f3eba936bfb1814bbcb0a1d62739eb66daac839df8c9c836fe0e94860df88525",
- "blk.19.attn_output.weight": "51c0f01d38a9dcfe9bdbc4643576fab164c1d9e4b7168b7695c0ee55e6965667",
- "blk.19.attn_q.weight": "28d15b69b8416f2e7ddc88fe381cb1e2ef2ad705fb1c268139ba96498cc74848",
- "blk.19.attn_v.weight": "6860f1cd720638e63a981fa2c0b4db900129826bcb9823c9ddf9fb8b1b9f3383",
- "blk.19.ffn_down.weight": "bc7f2d7827ee01c2dd41401c7b3b1700ad3a4ff620e8bb734f92630d342dcc7f",
- "blk.19.ffn_gate.weight": "54d03ef69ba373fc410fbca8f1e34a565d58e4296d9a035ff7e48340b9c848e7",
- "blk.19.ffn_norm.weight": "9178fc796a340ee6e8128ca74c0cb6203d1adbed6927af4e5ac7863da57affc7",
- "blk.19.ffn_up.weight": "a77bd708026c6e83ad5c79c223278e74621bcf74a9641c7818d96b595daaad20",
- "blk.19.post_attention_norm.weight": "ae94aa26f4c411bf9496a6fd4a6df64ee589ee1ae9a04b531d45acc95721e582",
- "blk.19.post_ffw_norm.weight": "9ad210700edeef12133bdcff04bf1c7f62b49f6f4a9ba483c7cdc59857c24a5c",
- "blk.20.attn_k.weight": "e35bce1e9f4a7a09ef34721f57ea38cfca68c272f52d923fe50af8308f66cfaa",
- "blk.20.attn_norm.weight": "644800f6926fd34f233795c4dec1151a295d2138ca8cac33e3e48167d26f8b41",
- "blk.20.attn_output.weight": "8d3758cd236471741e1ad66c0710cb79077dc8c7a3a292d35bc551c0c5abe627",
- "blk.20.attn_q.weight": "c333b1f0f6f956b5d73891df10b1a0321e55fc31c40d623a24e1f52caa6a998b",
- "blk.20.attn_v.weight": "8562b418d0c4868a050fb19fa3fcaf50a8cf1c669f537d666c80c7b3a04714e1",
- "blk.20.ffn_down.weight": "97efb608ac44cc804198faec3ee66eafe56ced6b7ca5359700c6f1df75b7205e",
- "blk.20.ffn_gate.weight": "5c61151d86f28415c73c73d90ec088c646cbe5c1640197caf58eb501ba7db293",
- "blk.20.ffn_norm.weight": "24bbe0a701afd4bbeea65b3edde712b3cbb2281043bbc43dbf250582453116ed",
- "blk.20.ffn_up.weight": "e170cf68e249566aa99eb6f6b265679bf9a5a6b76830ba24e7e130c2515910c4",
- "blk.20.post_attention_norm.weight": "e092d751cfe20dbf2d348358f3b38397bd83e4ed94d6bbaa6bbaddcd902b2ac4",
- "blk.20.post_ffw_norm.weight": "219a18a47dcba76e669e4322223a5a9227bd3db1de3fbd3d3cfb22e54a783c5a",
- "blk.21.attn_k.weight": "c3a095ebddb42c63824f1c98da65263dc88e4d790a26aa1632840b44f5cc7cb1",
- "blk.21.attn_norm.weight": "ef8bbaded5fbc45ad9cf3985ae02174524e7090fe6362811124f942ef643bec7",
- "blk.21.attn_output.weight": "668f018aba72baac6252aa3ad58569ddd55ab751a0dd8d7bcc9fb9b6efb4bf53",
- "blk.21.attn_q.weight": "e759c65663089f3bbbd51847934c185e680c82f1249065d5d487da638e519e6d",
- "blk.21.attn_v.weight": "2ff57762686cf9ba1f5a6be76503454b97556ce67f4ac98254bd0562231197ba",
- "blk.21.ffn_down.weight": "3fd106556fb721b1c28ae3f4026bc83eb1b08ed910f2ba5f466c6b5f327d91cb",
- "blk.21.ffn_gate.weight": "338022d882f4b6619e8054a6fb909696fa3eef3013cf69b65c3cacdfc5b9e42c",
- "blk.21.ffn_norm.weight": "1e77660c23a3f9653ee721a863d1960f773d87437cabc4dc0a6e17ee3d4e5e44",
- "blk.21.ffn_up.weight": "7d31b20fbc2e6eba8f350f170069dc36f0cb12f68fbc4206ec5022a74085ebcb",
- "blk.21.post_attention_norm.weight": "9638bae8d8bdcd7ed68da282979cd84a07c41ff9cabcaea94ebc846a1803db23",
- "blk.21.post_ffw_norm.weight": "d622ef11115fe0cbe04b727d5a3b6371e7f39bf08c8d5eb9bc6da52e3f3cfb9d",
- "blk.22.attn_k.weight": "5c321cb29deffbe57de200dd206a62005f1e80acb86c4fd2349dd44c8d3594fd",
- "blk.22.attn_norm.weight": "198d949705d7170a331d75889d8c7500c3635254dac2cc6aa4dc35d556584536",
- "blk.22.attn_output.weight": "19805cd5d7025b457e5d41d70db8b3fd63c2dd0e4a94d3ef1704d50ef4e749e8",
- "blk.22.attn_q.weight": "177836cd583fc87405975ddc21ebfebdaa090a0363799664c72caa3da851ae2c",
- "blk.22.attn_v.weight": "fea255692483e30d0108f9e4e250eb3ed7dbda8d83f499b06519b8c223ae6096",
- "blk.22.ffn_down.weight": "00cb8939f03e5817d6d412de8cf2c923c9568d5493e382cec7faf5718fb034eb",
- "blk.22.ffn_gate.weight": "b0591065b91281b2fbd8a9567f3568d40479f680e1f0a29e27ae213f37642489",
- "blk.22.ffn_norm.weight": "96b5c5d0737c2ceb8fc869f54adb9e5f46e28cb7b177c40f49fa926b923c00f8",
- "blk.22.ffn_up.weight": "81f472185b24344ab0594ea8246cc6e200e0dc1cab4943e74fbe4ca19d5a9701",
- "blk.22.post_attention_norm.weight": "27fa9aa6260aa3071e0391e1a1d49322dcb6e8072315b8a9b7064087108dbd06",
- "blk.22.post_ffw_norm.weight": "f37e1dcd7f643d9545675ffe9dc527a11eba86eb204989c2f44f636b266d896a",
- "blk.23.attn_k.weight": "5d82f36658a56c3f94d0bb2d61f65509c966fa6568f81812e0d3e338b380ef8c",
- "blk.23.attn_norm.weight": "b7983f88d9cad88bc88a528923e6da592ad20e699965b223ebc10840fe1f4fec",
- "blk.23.attn_output.weight": "59f97f80f430d71606aab0158a195aed29ccd3405e6c0a5c41c809be8eb01898",
- "blk.23.attn_q.weight": "53ac4789fe958919cc02ea4222bcd64c0ea1b4baa54304bff46635bdf42f7490",
- "blk.23.attn_v.weight": "ec8abe09b9e84dbb52c7a068094657c6d3c62fe551ba8d7c3a3f23da622e9756",
- "blk.23.ffn_down.weight": "3cf547eccb1b82aa64f208cee9682d7f558ca84e0aead7d9d3d1420d90f3d992",
- "blk.23.ffn_gate.weight": "366aa2486d911ba81eb519119e13807deacf7e9908bc1975a2a63e00d6b10124",
- "blk.23.ffn_norm.weight": "6d1d4a4af34bb7dc090ac87d6457d398c3e0fb68bd2e2b60b099dc318b6cfac3",
- "blk.23.ffn_up.weight": "53f76692e253f5d2420b3f200c731b9f3b7a83e379920b4a067c729b4674aa4d",
- "blk.23.post_attention_norm.weight": "7c952fa0efa76b3f048c8c4c9e8dcb5e3724d231327eda6423a34d3f3d3367de",
- "blk.23.post_ffw_norm.weight": "7ab188cfe61f0a91b40309a0ab6bfa99f19d0ff2a37b6ac10e5f0c7f44eb5270",
- "blk.24.attn_k.weight": "225798792f9bfdd10eff0505ebe61e0aad0209c17b431f6044ee7968ffe8c198",
- "blk.24.attn_norm.weight": "635e3c1ebf5219bbebfc40ef164bc32d2b726ef595a94da64ac524ae878e2915",
- "blk.24.attn_output.weight": "482f5bb2db8d9ed22b253d9a3296333b239efe698e5992e5d77e7e12dc2a5cf5",
- "blk.24.attn_q.weight": "43805bbccddb65d58fffc4be9b5c374d4e1df1395ec1e1ffb4bcff03e98d5adb",
- "blk.24.attn_v.weight": "fa741af54b4a3b1775d32f59134756090c5df2e7345a12a2d8db94fe289667a7",
- "blk.24.ffn_down.weight": "83c6351e3162626b276f524a57836144625c2556dbe321b57cbd8fd486a68fab",
- "blk.24.ffn_gate.weight": "fbe66be0d84d12cea5176cc7eaef64382ffc7324cd9d6266a3342dc43442f2ac",
- "blk.24.ffn_norm.weight": "77c1445a8639ad24938bdf0280233eea2362d47391421833dfa72ec756dfc1e8",
- "blk.24.ffn_up.weight": "78235ac729ee23c1cf1ae543751e3af32776d8808cee6e529c2a625a1f027654",
- "blk.24.post_attention_norm.weight": "161f71b6d07628d43e4ae51a4c9088ec6ca2db123a17986a14505d83fdd04dad",
- "blk.24.post_ffw_norm.weight": "cf1ba692aa683368b02ac413e69b2521b98c69a5274eacbb54165b53bf38a8b2",
- "blk.25.attn_k.weight": "057a56bd8c8d2b41608d1f71faa3052902152ddf85e47669ad950c1c3e77c33f",
- "blk.25.attn_norm.weight": "b7179fe02c334da556ddcf6c1b502245639a728c4cbba8b552d8e1df4565ee9d",
- "blk.25.attn_output.weight": "4fed8b05b08a0ff75ffd022701bbeb52f17b23d09332a1ddcba737244bd0d3b0",
- "blk.25.attn_q.weight": "c52e99f5d38bf7538d6106a0bbf38ac6dc6296bca9a3f849afa384ea67b4af01",
- "blk.25.attn_v.weight": "c49c23d8e1cfa6a8eb971eb69942204890c6d7d830dc8774c84b108a80598912",
- "blk.25.ffn_down.weight": "c08d4dc8412b19fdc870c164b83c341b236ec6fe7bb4a9bcfe0dc100faa20286",
- "blk.25.ffn_gate.weight": "1a4cb3f36735d59181721471452807903006539e5e1b5ceb4f72d1d7ae134127",
- "blk.25.ffn_norm.weight": "8fd6bd0dcec5198761525a36992a57c9ec5e9da60a22092839a84ae8c4e87f26",
- "blk.25.ffn_up.weight": "3a00f39bdd5f31dc5e3b281d2002e1ac4f2475d49a0ac1d7720a25b377dcd04a",
- "blk.25.post_attention_norm.weight": "e5f31a648612c859b6d21c9ee426e87a86cb1973dfdd86276c767371d9cef5ad",
- "blk.25.post_ffw_norm.weight": "553c3bd774922c99c2384380a142d019881d30dbf0fe3bf9430dabfb3f6cbd33",
- "output_norm.weight": "49445c4585ab0a8135717a0bdb1cda4a062a030177d0119561d91542aec5744b"
-}
|
ollama
|
ollama
|
Go
|
Go
| 131,099
| 10,753
|
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models.
|
ollama_ollama
|
BUG_FIX
|
Obvious bug fix
|
31cccab3ecf10087555e77abc793f0028992d41a
|
2023-10-19 22:27:20
|
Shruti Sen
|
Fix typo in ADOPTERS.md (#20520)
| false
| 1
| 1
| 2
|
--- ADOPTERS.md
@@ -29,7 +29,7 @@ This is a list of adopters using PowerShell in production or in their products (
[AWS Lambda Support For PowerShell](https://github.com/aws/aws-lambda-dotnet/tree/master/PowerShell) and [AWS PowerShell Tools for `CodeBuild`](https://docs.aws.amazon.com/powershell/latest/reference/items/CodeBuild_cmdlets.html)
as well as supporting PowerShell Core in both Windows and Linux EC2 Images.
* [Azure Resource Manager Deployment Scripts](https://learn.microsoft.com/azure/azure-resource-manager/templates/deployment-script-template) Complete the "last mile" of your Azure Resource Manager (ARM) template deployments with a Deployment Script, which enables you to run an arbitrary PowerShell script in the context of a deployment.
- It is designed to let you complete tasks that should be part of a deployment, but are not possible in an ARM template today — for example, creating a Key Vault certificate or querying an external API for a new CIDR block.
+ Designed to let you complete tasks that should be part of a deployment, but are not possible in an ARM template today — for example, creating a Key Vault certificate or querying an external API for a new CIDR block.
* [Azure Pipelines Hosted Agents](https://learn.microsoft.com/azure/devops/pipelines/agents/hosted?view=azure-devops) Windows, Ubuntu, and macOS Agents used by Azure Pipelines customers have PowerShell pre-installed so that customers can make use of it for all their CI/CD needs.
* [GitHub Actions Virtual Environments for Hosted Runners](https://help.github.com/actions/reference/virtual-environments-for-github-hosted-runners) Windows, Ubuntu, and macOS virtual environments used by customers of GitHub Actions include PowerShell out of the box.
* [GitHub Actions Python builds](https://github.com/actions/python-versions) GitHub Actions uses PowerShell to automate building Python from source for its runners.
|
powershell
|
powershell
|
C#
|
C#
| 46,656
| 7,522
|
PowerShell for every system!
|
powershell_powershell
|
DOC_CHANGE
|
Changes in Markdown file
|
06ce2b82085d7776a6c68bd20000461d3fb6908f
|
2023-01-10 20:23:15
|
Zeeshan Lakhani
|
discord link and some readme updates (#705) * fix issue update action
| false
| 8
| 45
| 53
|
--- .github/workflows/lychee.yml
@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/[email protected]
-
+
- name: Link Checker
uses: lycheeverse/lychee-action@master
#env:
@@ -20,8 +20,9 @@ jobs:
args: --verbose --max-retries 4 --no-progress '**/*.md'
- name: Create Issue From File
- uses: peter-evans/create-issue-from-file@v4
+ uses: peter-evans/create-issue-from-file@v2
with:
title: Link Checker Report
content-filepath: ./lychee/out.md
labels: report, automated issue
+
--- README.md
@@ -1,22 +1,58 @@
## 
-
- [](https://discord.gg/Tu2VynkRWV)
+
+ [](https://gitter.im/papers-we-love/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
**Papers We Love** (*PWL*) is a community built around reading, discussing and learning more about academic computer science papers. This repository serves as a directory of some of the best papers the community can find, bringing together documents scattered across the web. You can also visit the [Papers We Love site](http://paperswelove.org/) for more info.
Due to [licenses](https://github.com/papers-we-love/papers-we-love/blob/master/.github/CONTRIBUTING.md#respect-content-licenses) we cannot always host the papers themselves (when we do, you will see a :scroll: emoji next to its title in the directory README) but we can provide links to their locations.
-If you enjoy the papers, perhaps stop by a local chapter meetup and join in on the vibrant discussions around them. You can also discuss *PWL* events, the content in this repository, and/or anything related to *PWL* on our [Discord](https://discord.gg/Tu2VynkRWV) server.
+If you enjoy the papers, perhaps stop by a local chapter meetup and join in on the vibrant discussions around them. You can also discuss *PWL* events, the content in this repository, and/or anything related to *PWL* on our [Slack](https://paperswelove.slack.com/messages/general/), after [signing-up](http://papersweloveslack.herokuapp.com/) to join it, or on our *#paperswelove* IRC channel on freenode.
### Chapters
-Let us know if you are interested in [starting one](https://github.com/papers-we-love/organizers) in your city!
+Here are our official chapters. Let us know if you are interested in [starting one](https://github.com/papers-we-love/organizers) in your city!
+
+* [Athens](https://www.meetup.com/Papers-We-Love-Athens)
+* [Atlanta](https://www.meetup.com/Papers-We-Love-Atlanta)
+* [Bangalore](http://www.meetup.com/Papers-we-love-Bangalore/)
+* [Berlin](http://www.meetup.com/Papers-We-Love-Berlin/)
+* [Bhubaneswar](https://www.facebook.com/groups/pwlbbsr/)
+* [Boston](http://www.meetup.com/Papers-We-Love-Boston-Cambridge/)
+* [Bucharest](http://www.meetup.com/papers-we-love-bucharest/)
+* [Buenos Aires](https://paperswelove.org/buenos-aires/)
+* [Chattanooga](http://www.meetup.com/Papers-We-Love-Chattanooga/)
+* [Chicago](http://www.meetup.com/papers-we-love-chicago/)
+* [Columbus, Ohio](http://www.meetup.com/Papers-We-Love-Columbus/)
+* [Hyderabad](http://www.meetup.com/papers-we-love-hyderabad/)
+* [Iowa City](https://www.meetup.com/techcorridorio)
+* [Kathmandu](https://www.facebook.com/groups/PapersWeLoveKathmandu/)
+* [Kyiv](https://www.facebook.com/groups/PapersWeLoveKyiv)
+* [Lebanon](http://www.paperswelovelb.club)
+* [London](http://www.meetup.com/papers-we-love-london)
+* [Los Angeles](http://www.meetup.com/papers-we-love-la)
+* [Montreal](http://www.meetup.com/Papers-We-Love-Montreal/)
+* [New York City](http://www.meetup.com/papers-we-love/)
+* [Paris](http://www.meetup.com/Papers-We-Love-Paris/)
+* [Portland](https://www.meetup.com/papers-we-love-pdx/)
+* [Pune](http://www.meetup.com/Doo-Things)
+* [Raleigh-Durham](https://www.meetup.com/Papers-We-Love-Raleigh-Durham/)
+* [San Diego](http://www.meetup.com/Papers-We-Love-San-Diego/)
+* [San Francisco](http://www.meetup.com/papers-we-love-too/)
+* [Seattle](http://www.meetup.com/Papers-We-Love-Seattle/)
+* [Seoul, Korea](http://www.meetup.com/seoul-tech-society)
+* [Singapore](https://www.facebook.com/groups/paperswelovesg/)
+* [Teresina](https://www.meetup.com/pt-BR/Papers-We-Love-Teresina/)
+* [Toronto](http://www.meetup.com/Papers-We-Love-Toronto/)
+* [Vienna](http://www.meetup.com/Papers-We-Love-Vienna/)
+* [Washington, DC](http://www.meetup.com/Papers-We-Love-DC-NoVA/)
+* [Winnipeg](http://pwlwpg.ca/)
+* [Zürich](https://www.meetup.com/Papers-we-love-Zurich/)
All of our meetups follow our [Code of Conduct](CODE_OF_CONDUCT.md).
### Past Presentations
-Check out our [YouTube](https://www.youtube.com/user/PapersWeLove) channel for videos and video playlists.
+Check out our [YouTube](https://www.youtube.com/user/PapersWeLove) and [MixCloud](https://www.mixcloud.com/paperswelove/) (audio-only format) channels.
## Info
@@ -58,7 +94,7 @@ Reading a paper is not the same as reading a blogpost or a novel. Here are a few
* [Should I Read Papers?](http://michaelrbernste.in/2014/10/21/should-i-read-papers.html)
* [The Refreshingly Rewarding Realm of Research Papers](https://www.youtube.com/watch?v=8eRx5Wo3xYA)
* [How to read a paper](http://ccr.sigcomm.org/online/files/p83-keshavA.pdf)
-
+
### Applications/Ideas built around Papers We Love
* Love a Paper - [@loveapaper](https://twitter.com/loveapaper)
|
papers-we-love
|
papers-we-love
|
Shell
|
Shell
| 91,347
| 5,859
|
Papers from the computer science community to read and discuss.
|
papers-we-love_papers-we-love
|
CONFIG_CHANGE
|
Changes in YAML workflow file and README
|
510a21946f9b30d766bea9175436b26c3deda6bd
| null |
Stu Grossman
|
Add mh-aix
| false
| 1
| 0
| 1
|
--- .Sanitize
@@ -21,6 +21,7 @@ Do-first:
Things-to-keep:
+mh-aix
mh-apollo68
mh-decstation
mh-delta88
|
bminor_binutils-gdb.json
| null | null | null | null | null | null |
bminor_binutils-gdb.json
|
CONFIG_CHANGE
|
Configuration change in the .Sanitize file
|
d3e22fb28f3afa3d878bf66d4023cc18fd46d52b
|
2023-11-23 17:25:48
|
Jaida Wu
|
Fix
| false
| 1
| 1
| 2
|
--- bypass.php
@@ -329,7 +329,7 @@ logf("Refactoring parameters...");
$data = json_decode(decryptData($args), true);
// V816 is the special identity for HyperOS in MIUI version
-$data["rom_version"] = str_replace("V816", "V14", $data["rom_version"]);
+$data["rom_version"] = str_replace("V816", "V14", decryptData($args));
$data = json_encode($data);
$sign = signData($data);
|
xiaomi-hyperos-bootloader-bypass
|
mlgmxyysd
|
PHP
|
PHP
| 3,496
| 367
|
A PoC that exploits a vulnerability to bypass the Xiaomi HyperOS community restrictions of BootLoader unlocked account bindings.
|
mlgmxyysd_xiaomi-hyperos-bootloader-bypass
|
BUG_FIX
|
Obvious bug fix
|
504a410f02e01a2ec948a92e4579a28295184898
|
2024-09-18 06:41:26
|
Michael Yang
|
llm: add solar pro (preview) (#6846)
| false
| 402
| 0
| 402
|
--- llm/patches/0008-solar-pro.patch
@@ -1,402 +0,0 @@
-From 8313ce5f43f11f3d84f352f97f3802792e90e18c Mon Sep 17 00:00:00 2001
-From: Michael Yang <[email protected]>
-Date: Mon, 16 Sep 2024 15:53:16 -0700
-Subject: [PATCH] add solar-pro support
-
-solar-pro introduces block skip connections where blocks are connected
-to other, non-sequential blocks with a scale multiple
-
-this change adds 4 new keys to store the skip connections and one new
-tensor to store the scalar. the scalar is implemented a 1-dimensional
-tensor with 2 elements dervied from the model's bskcn_tv configuration.
-in general, the values are (bskcn_tv, 1 - bskcn_tv)
----
- src/llama.cpp | 267 +++++++++++++++++++++++++++++++++++++++++++++++---
- 1 file changed, 254 insertions(+), 13 deletions(-)
-
-diff --git a/src/llama.cpp b/src/llama.cpp
-index f79bd782..b7771f53 100644
---- a/src/llama.cpp
-+++ b/src/llama.cpp
-@@ -213,6 +213,7 @@ enum llm_arch {
- LLM_ARCH_NEMOTRON,
- LLM_ARCH_EXAONE,
- LLM_ARCH_RWKV6,
-+ LLM_ARCH_SOLAR,
- LLM_ARCH_UNKNOWN,
- };
-
-@@ -261,6 +262,7 @@ static const std::map<llm_arch, const char *> LLM_ARCH_NAMES = {
- { LLM_ARCH_NEMOTRON, "nemotron" },
- { LLM_ARCH_EXAONE, "exaone" },
- { LLM_ARCH_RWKV6, "rwkv6" },
-+ { LLM_ARCH_SOLAR, "solar" },
- { LLM_ARCH_UNKNOWN, "(unknown)" },
- };
-
-@@ -314,6 +316,7 @@ enum llm_kv {
- LLM_KV_ATTENTION_KV_LORA_RANK,
- LLM_KV_ATTENTION_RELATIVE_BUCKETS_COUNT,
- LLM_KV_ATTENTION_SLIDING_WINDOW,
-+ LLM_KV_ATTENTION_BLOCK_SKIP_CONNECTION,
-
- LLM_KV_ROPE_DIMENSION_COUNT,
- LLM_KV_ROPE_FREQ_BASE,
-@@ -405,19 +408,20 @@ static const std::map<llm_kv, const char *> LLM_KV_NAMES = {
- { LLM_KV_TIME_MIX_EXTRA_DIM, "%s.time_mix_extra_dim" },
- { LLM_KV_TIME_DECAY_EXTRA_DIM, "%s.time_decay_extra_dim" },
-
-- { LLM_KV_ATTENTION_HEAD_COUNT, "%s.attention.head_count" },
-- { LLM_KV_ATTENTION_HEAD_COUNT_KV, "%s.attention.head_count_kv" },
-- { LLM_KV_ATTENTION_MAX_ALIBI_BIAS, "%s.attention.max_alibi_bias" },
-- { LLM_KV_ATTENTION_CLAMP_KQV, "%s.attention.clamp_kqv" },
-- { LLM_KV_ATTENTION_KEY_LENGTH, "%s.attention.key_length" },
-- { LLM_KV_ATTENTION_VALUE_LENGTH, "%s.attention.value_length" },
-- { LLM_KV_ATTENTION_LAYERNORM_EPS, "%s.attention.layer_norm_epsilon" },
-- { LLM_KV_ATTENTION_LAYERNORM_RMS_EPS, "%s.attention.layer_norm_rms_epsilon" },
-- { LLM_KV_ATTENTION_CAUSAL, "%s.attention.causal" },
-- { LLM_KV_ATTENTION_Q_LORA_RANK, "%s.attention.q_lora_rank" },
-- { LLM_KV_ATTENTION_KV_LORA_RANK, "%s.attention.kv_lora_rank" },
-- { LLM_KV_ATTENTION_RELATIVE_BUCKETS_COUNT, "%s.attention.relative_buckets_count" },
-- { LLM_KV_ATTENTION_SLIDING_WINDOW, "%s.attention.sliding_window" },
-+ { LLM_KV_ATTENTION_HEAD_COUNT, "%s.attention.head_count" },
-+ { LLM_KV_ATTENTION_HEAD_COUNT_KV, "%s.attention.head_count_kv" },
-+ { LLM_KV_ATTENTION_MAX_ALIBI_BIAS, "%s.attention.max_alibi_bias" },
-+ { LLM_KV_ATTENTION_CLAMP_KQV, "%s.attention.clamp_kqv" },
-+ { LLM_KV_ATTENTION_KEY_LENGTH, "%s.attention.key_length" },
-+ { LLM_KV_ATTENTION_VALUE_LENGTH, "%s.attention.value_length" },
-+ { LLM_KV_ATTENTION_LAYERNORM_EPS, "%s.attention.layer_norm_epsilon" },
-+ { LLM_KV_ATTENTION_LAYERNORM_RMS_EPS, "%s.attention.layer_norm_rms_epsilon" },
-+ { LLM_KV_ATTENTION_CAUSAL, "%s.attention.causal" },
-+ { LLM_KV_ATTENTION_Q_LORA_RANK, "%s.attention.q_lora_rank" },
-+ { LLM_KV_ATTENTION_KV_LORA_RANK, "%s.attention.kv_lora_rank" },
-+ { LLM_KV_ATTENTION_RELATIVE_BUCKETS_COUNT, "%s.attention.relative_buckets_count" },
-+ { LLM_KV_ATTENTION_SLIDING_WINDOW, "%s.attention.sliding_window" },
-+ { LLM_KV_ATTENTION_BLOCK_SKIP_CONNECTION, "%s.attention.block_skip_connection.%d" },
-
- { LLM_KV_ROPE_DIMENSION_COUNT, "%s.rope.dimension_count" },
- { LLM_KV_ROPE_FREQ_BASE, "%s.rope.freq_base" },
-@@ -589,6 +593,7 @@ enum llm_tensor {
- LLM_TENSOR_ENC_FFN_DOWN,
- LLM_TENSOR_ENC_FFN_UP,
- LLM_TENSOR_ENC_OUTPUT_NORM,
-+ LLM_TENSOR_BSKCN_TV,
- };
-
- static const std::map<llm_arch, std::map<llm_tensor, std::string>> LLM_TENSOR_NAMES = {
-@@ -1408,6 +1413,24 @@ static const std::map<llm_arch, std::map<llm_tensor, std::string>> LLM_TENSOR_NA
- { LLM_TENSOR_CHANNEL_MIX_RECEPTANCE, "blk.%d.channel_mix_receptance" },
- },
- },
-+ {
-+ LLM_ARCH_SOLAR,
-+ {
-+ { LLM_TENSOR_TOKEN_EMBD, "token_embd" },
-+ { LLM_TENSOR_OUTPUT_NORM, "output_norm" },
-+ { LLM_TENSOR_OUTPUT, "output" },
-+ { LLM_TENSOR_ATTN_NORM, "blk.%d.attn_norm" },
-+ { LLM_TENSOR_ATTN_Q, "blk.%d.attn_q" },
-+ { LLM_TENSOR_ATTN_K, "blk.%d.attn_k" },
-+ { LLM_TENSOR_ATTN_V, "blk.%d.attn_v" },
-+ { LLM_TENSOR_ATTN_OUT, "blk.%d.attn_output" },
-+ { LLM_TENSOR_FFN_NORM, "blk.%d.ffn_norm" },
-+ { LLM_TENSOR_FFN_GATE, "blk.%d.ffn_gate" },
-+ { LLM_TENSOR_FFN_DOWN, "blk.%d.ffn_down" },
-+ { LLM_TENSOR_FFN_UP, "blk.%d.ffn_up" },
-+ { LLM_TENSOR_BSKCN_TV, "bskcn_tv" },
-+ },
-+ },
- {
- LLM_ARCH_UNKNOWN,
- {
-@@ -2237,6 +2260,7 @@ enum e_model {
- MODEL_15B,
- MODEL_16B,
- MODEL_20B,
-+ MODEL_22B,
- MODEL_30B,
- MODEL_34B,
- MODEL_35B,
-@@ -2284,6 +2308,8 @@ struct llama_hparams {
- std::array<uint32_t, LLAMA_MAX_LAYERS> n_head_kv_arr;
- std::array<uint32_t, LLAMA_MAX_LAYERS> n_ff_arr;
-
-+ std::array<std::array<uint32_t, LLAMA_MAX_LAYERS>, 4> n_bskcn_arr;
-+
- uint32_t n_layer_dense_lead = 0;
- uint32_t n_lora_q = 0;
- uint32_t n_lora_kv = 0;
-@@ -2349,6 +2375,7 @@ struct llama_hparams {
- if (this->n_head_arr != other.n_head_arr) return true;
- if (this->n_head_kv_arr != other.n_head_kv_arr) return true;
- if (this->n_ff_arr != other.n_ff_arr) return true;
-+ if (this->n_bskcn_arr != other.n_bskcn_arr) return true;
-
- if (this->n_rel_attn_bkts != other.n_rel_attn_bkts) return true;
- if (this->n_layer_dense_lead != other.n_layer_dense_lead) return true;
-@@ -2455,6 +2482,14 @@ struct llama_hparams {
- return ssm_d_state * ssm_d_inner;
- }
- }
-+
-+ bool n_bskcn(uint32_t n, uint32_t il = 0) const {
-+ if (il < n_layer) {
-+ return n_bskcn_arr[n][il] > 0;
-+ }
-+
-+ GGML_ABORT("fatal error");
-+ }
- };
-
- static_assert(std::is_trivially_copyable<llama_hparams>::value, "llama_hparams must be trivially copyable");
-@@ -2635,6 +2670,8 @@ struct llama_layer {
- struct ggml_tensor * ffn_gate_scale;
- struct ggml_tensor * ffn_up_scale;
- struct ggml_tensor * ffn_down_scale;
-+
-+ struct ggml_tensor * bskcn_tv;
- };
-
- // very similar to llama_batch,
-@@ -5937,6 +5974,21 @@ static void llm_load_hparams(
- default: model.type = e_model::MODEL_UNKNOWN;
- }
- } break;
-+ case LLM_ARCH_SOLAR:
-+ {
-+ ml.get_key(LLM_KV_ATTENTION_LAYERNORM_RMS_EPS, hparams.f_norm_rms_eps);
-+
-+ for (int i = 0; i < hparams.n_bskcn_arr.max_size(); ++i) {
-+ auto & bskcn = hparams.n_bskcn_arr.at(i);
-+ bskcn.fill(0);
-+ ml.get_key_or_arr(::format(LLM_KV_NAMES.at(LLM_KV_ATTENTION_BLOCK_SKIP_CONNECTION), LLM_ARCH_NAMES.at(ml.llm_kv.arch), i), bskcn, hparams.n_layer, false);
-+ }
-+
-+ switch (hparams.n_layer) {
-+ case 64: model.type = e_model::MODEL_22B; break;
-+ default: model.type = e_model::MODEL_UNKNOWN;
-+ }
-+ }
- default: (void)0;
- }
-
-@@ -8420,6 +8472,38 @@ static bool llm_load_tensors(
- }
-
- } break;
-+ case LLM_ARCH_SOLAR:
-+ {
-+ model.tok_embd = ml.create_tensor(ctx_input, tn(LLM_TENSOR_TOKEN_EMBD, "weight"), {n_embd, n_vocab});
-+
-+ // output
-+ {
-+ model.output_norm = ml.create_tensor(ctx_output, tn(LLM_TENSOR_OUTPUT_NORM, "weight"), {n_embd});
-+ model.output = ml.create_tensor(ctx_output_split, tn(LLM_TENSOR_OUTPUT, "weight"), {n_embd, n_vocab}, llama_model_loader::TENSOR_NOT_REQUIRED);
-+ }
-+
-+ for (int i = 0; i < n_layer; ++i) {
-+ ggml_context * ctx_layer = ctx_for_layer(i);
-+ ggml_context * ctx_split = ctx_for_layer_split(i);
-+
-+ auto & layer = model.layers[i];
-+
-+ layer.attn_norm = ml.create_tensor(ctx_layer, tn(LLM_TENSOR_ATTN_NORM, "weight", i), {n_embd});
-+
-+ layer.wq = ml.create_tensor(ctx_split, tn(LLM_TENSOR_ATTN_Q, "weight", i), {n_embd, n_embd_head_k * n_head});
-+ layer.wk = ml.create_tensor(ctx_split, tn(LLM_TENSOR_ATTN_K, "weight", i), {n_embd, n_embd_k_gqa});
-+ layer.wv = ml.create_tensor(ctx_split, tn(LLM_TENSOR_ATTN_V, "weight", i), {n_embd, n_embd_v_gqa});
-+ layer.wo = ml.create_tensor(ctx_split, tn(LLM_TENSOR_ATTN_OUT, "weight", i), {n_embd_head_k * n_head, n_embd});
-+
-+ layer.ffn_norm = ml.create_tensor(ctx_layer, tn(LLM_TENSOR_FFN_NORM, "weight", i), {n_embd});
-+
-+ layer.bskcn_tv = ml.create_tensor(ctx_layer, tn(LLM_TENSOR_BSKCN_TV, "weight"), {2}, llama_model_loader::TENSOR_NOT_REQUIRED | (i != 0 ? llama_model_loader::TENSOR_DUPLICATED : 0));
-+
-+ layer.ffn_gate = ml.create_tensor(ctx_split, tn(LLM_TENSOR_FFN_GATE, "weight", i), {n_embd, n_ff});
-+ layer.ffn_down = ml.create_tensor(ctx_split, tn(LLM_TENSOR_FFN_DOWN, "weight", i), { n_ff, n_embd});
-+ layer.ffn_up = ml.create_tensor(ctx_split, tn(LLM_TENSOR_FFN_UP, "weight", i), {n_embd, n_ff});
-+ }
-+ } break;
- default:
- throw std::runtime_error("unknown architecture");
- }
-@@ -15173,6 +15257,158 @@ struct llm_build_context {
-
- return gf;
- }
-+
-+ ggml_cgraph * build_solar() {
-+ struct ggml_cgraph * gf = ggml_new_graph_custom(ctx0, llama_model_max_nodes(model), false);
-+
-+ // mutable variable, needed during the last layer of the computation to skip unused tokens
-+ int32_t n_tokens = this->n_tokens;
-+
-+ const int64_t n_embd_head = hparams.n_embd_head_v;
-+ GGML_ASSERT(n_embd_head == hparams.n_embd_head_k);
-+ GGML_ASSERT(n_embd_head == hparams.n_rot);
-+
-+ struct ggml_tensor * cur;
-+ struct ggml_tensor * inpL;
-+
-+ inpL = llm_build_inp_embd(ctx0, lctx, hparams, batch, model.tok_embd, cb);
-+
-+ // inp_pos - contains the positions
-+ struct ggml_tensor * inp_pos = build_inp_pos();
-+
-+ // KQ_mask (mask for 1 head, it will be broadcasted to all heads)
-+ struct ggml_tensor * KQ_mask = build_inp_KQ_mask();
-+
-+ struct ggml_tensor * bskcn_1;
-+ struct ggml_tensor * bskcn_2;
-+
-+ for (int il = 0; il < n_layer; ++il) {
-+ struct ggml_tensor * inpSA = inpL;
-+
-+ if (hparams.n_bskcn(0, il)) {
-+ bskcn_1 = inpSA;
-+ }
-+
-+ if (hparams.n_bskcn(1, il)) {
-+ bskcn_2 = inpSA;
-+ }
-+
-+ if (hparams.n_bskcn(2, il)) {
-+ inpSA = ggml_add(
-+ ctx0,
-+ ggml_mul(ctx0, bskcn_1, ggml_view_1d(ctx0, model.layers[il].bskcn_tv, 1, 0)),
-+ ggml_mul(ctx0, inpSA, ggml_view_1d(ctx0, model.layers[il].bskcn_tv, 1, ggml_element_size(model.layers[il].bskcn_tv))));
-+ }
-+
-+ if (hparams.n_bskcn(3, il)) {
-+ inpSA = ggml_add(
-+ ctx0,
-+ ggml_mul(ctx0, bskcn_2, ggml_view_1d(ctx0, model.layers[il].bskcn_tv, 1, 0)),
-+ ggml_mul(ctx0, inpSA, ggml_view_1d(ctx0, model.layers[il].bskcn_tv, 1, ggml_element_size(model.layers[il].bskcn_tv))));
-+ }
-+
-+ // norm
-+ cur = llm_build_norm(ctx0, inpL, hparams,
-+ model.layers[il].attn_norm, NULL,
-+ LLM_NORM_RMS, cb, il);
-+ cb(cur, "attn_norm", il);
-+
-+ // self-attention
-+ {
-+ // rope freq factors for llama3; may return nullptr for llama2 and other models
-+ struct ggml_tensor * rope_factors = build_rope_factors(il);
-+
-+ // compute Q and K and RoPE them
-+ struct ggml_tensor * Qcur = llm_build_lora_mm(lctx, ctx0, model.layers[il].wq, cur);
-+ cb(Qcur, "Qcur", il);
-+ if (model.layers[il].bq) {
-+ Qcur = ggml_add(ctx0, Qcur, model.layers[il].bq);
-+ cb(Qcur, "Qcur", il);
-+ }
-+
-+ struct ggml_tensor * Kcur = llm_build_lora_mm(lctx, ctx0, model.layers[il].wk, cur);
-+ cb(Kcur, "Kcur", il);
-+ if (model.layers[il].bk) {
-+ Kcur = ggml_add(ctx0, Kcur, model.layers[il].bk);
-+ cb(Kcur, "Kcur", il);
-+ }
-+
-+ struct ggml_tensor * Vcur = llm_build_lora_mm(lctx, ctx0, model.layers[il].wv, cur);
-+ cb(Vcur, "Vcur", il);
-+ if (model.layers[il].bv) {
-+ Vcur = ggml_add(ctx0, Vcur, model.layers[il].bv);
-+ cb(Vcur, "Vcur", il);
-+ }
-+
-+ Qcur = ggml_rope_ext(
-+ ctx0, ggml_reshape_3d(ctx0, Qcur, n_embd_head, n_head, n_tokens), inp_pos, rope_factors,
-+ n_rot, rope_type, n_ctx_orig, freq_base, freq_scale,
-+ ext_factor, attn_factor, beta_fast, beta_slow
-+ );
-+ cb(Qcur, "Qcur", il);
-+
-+ Kcur = ggml_rope_ext(
-+ ctx0, ggml_reshape_3d(ctx0, Kcur, n_embd_head, n_head_kv, n_tokens), inp_pos, rope_factors,
-+ n_rot, rope_type, n_ctx_orig, freq_base, freq_scale,
-+ ext_factor, attn_factor, beta_fast, beta_slow
-+ );
-+ cb(Kcur, "Kcur", il);
-+
-+ cur = llm_build_kv(ctx0, lctx, kv_self, gf,
-+ model.layers[il].wo, model.layers[il].bo,
-+ Kcur, Vcur, Qcur, KQ_mask, n_tokens, kv_head, n_kv, 1.0f/sqrtf(float(n_embd_head)), cb, il);
-+ }
-+
-+ if (il == n_layer - 1) {
-+ // skip computing output for unused tokens
-+ struct ggml_tensor * inp_out_ids = build_inp_out_ids();
-+ n_tokens = n_outputs;
-+ cur = ggml_get_rows(ctx0, cur, inp_out_ids);
-+ inpSA = ggml_get_rows(ctx0, inpSA, inp_out_ids);
-+ }
-+
-+ struct ggml_tensor * ffn_inp = ggml_add(ctx0, cur, inpSA);
-+ cb(ffn_inp, "ffn_inp", il);
-+
-+ // feed-forward network
-+ cur = llm_build_norm(ctx0, ffn_inp, hparams,
-+ model.layers[il].ffn_norm, NULL,
-+ LLM_NORM_RMS, cb, il);
-+ cb(cur, "ffn_norm", il);
-+
-+ cur = llm_build_ffn(ctx0, lctx, cur,
-+ model.layers[il].ffn_up, model.layers[il].ffn_up_b, NULL,
-+ model.layers[il].ffn_gate, model.layers[il].ffn_gate_b, NULL,
-+ model.layers[il].ffn_down, model.layers[il].ffn_down_b, NULL,
-+ NULL,
-+ LLM_FFN_SILU, LLM_FFN_PAR, cb, il);
-+ cb(cur, "ffn_out", il);
-+
-+ cur = ggml_add(ctx0, cur, ffn_inp);
-+ cb(cur, "ffn_out", il);
-+
-+ cur = lctx.cvec.apply_to(ctx0, cur, il);
-+ cb(cur, "l_out", il);
-+
-+ // input for next layer
-+ inpL = cur;
-+ }
-+
-+ cur = inpL;
-+
-+ cur = llm_build_norm(ctx0, cur, hparams,
-+ model.output_norm, NULL,
-+ LLM_NORM_RMS, cb, -1);
-+ cb(cur, "result_norm", -1);
-+
-+ // lm_head
-+ cur = llm_build_lora_mm(lctx, ctx0, model.output, cur);
-+ cb(cur, "result_output", -1);
-+
-+ ggml_build_forward_expand(gf, cur);
-+
-+ return gf;
-+ }
- };
-
- static struct ggml_cgraph * llama_build_graph_defrag(llama_context & lctx, const std::vector<uint32_t> & ids) {
-@@ -15423,6 +15659,10 @@ static struct ggml_cgraph * llama_build_graph(
- {
- result = llm.build_rwkv6();
- } break;
-+ case LLM_ARCH_SOLAR:
-+ {
-+ result = llm.build_solar();
-+ } break;
- default:
- GGML_ABORT("fatal error");
- }
-@@ -18503,6 +18743,7 @@ enum llama_rope_type llama_rope_type(const struct llama_model * model) {
- case LLM_ARCH_ARCTIC:
- case LLM_ARCH_DEEPSEEK2:
- case LLM_ARCH_CHATGLM:
-+ case LLM_ARCH_SOLAR:
- return LLAMA_ROPE_TYPE_NORM;
-
- // the pairs of head values are offset by n_rot/2
---
-2.46.0
-
|
ollama
|
ollama
|
Go
|
Go
| 131,099
| 10,753
|
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models.
|
ollama_ollama
|
NEW_FEAT
|
Solar Pro patch added, which seems like a new feature
|
77ab8d9135807dfa5f745713603de954b55c102b
|
2023-09-21 05:18:14
|
Will Ceolin
|
New school form
| false
| 303
| 22
| 325
|
--- lib/organizations/organizations_context.ex
@@ -52,20 +52,4 @@ defmodule Uneebee.Organizations do
def update_school(%School{} = school, attrs \\ %{}) do
school |> change_school(attrs) |> Repo.update()
end
-
- @doc """
- Get a school by slug.
-
- ## Examples
-
- iex> get_school_by_slug!("slug")
- %School{}
-
- iex> get_school_by_slug!("invalid_slug")
- ** (Ecto.NoResultsError)
- """
- @spec get_school_by_slug!(String.t()) :: School.t()
- def get_school_by_slug!(slug) do
- Repo.get_by!(School, slug: slug)
- end
end
--- lib/organizations/school_live/new.ex
@@ -1,49 +0,0 @@
-defmodule UneebeeWeb.Live.Organizations.School.New do
- @moduledoc false
- use UneebeeWeb, :live_view
-
- alias Uneebee.Organizations
- alias Uneebee.Organizations.School
-
- @impl Phoenix.LiveView
- def mount(_params, _session, socket) do
- changeset = Organizations.change_school(%School{})
-
- socket =
- socket |> assign(page_title: dgettext("orgs", "Create school")) |> assign(form: to_form(changeset))
-
- {:ok, socket}
- end
-
- @impl Phoenix.LiveView
- def handle_event("validate", %{"school" => school_params}, socket) do
- form =
- %School{}
- |> Organizations.change_school(school_params)
- |> Map.put(:action, :validate)
- |> to_form()
-
- socket = assign(socket, form: form)
-
- {:noreply, socket}
- end
-
- @impl Phoenix.LiveView
- def handle_event("save", %{"school" => school_params}, socket) do
- %{current_user: user} = socket.assigns
-
- attrs = Map.merge(school_params, %{"created_by_id" => user.id})
-
- case Organizations.create_school(attrs) do
- {:ok, _school} ->
- {:noreply,
- socket |> put_flash(:info, dgettext("orgs", "School created successfully")) |> push_navigate(to: ~p"/")}
-
- {:error, %Ecto.Changeset{} = changeset} ->
- {:noreply,
- socket
- |> assign(form: to_form(changeset))
- |> put_flash(:error, dgettext("orgs", "School could not be created"))}
- end
- end
-end
--- lib/organizations/school_live/new.html.heex
@@ -1,18 +0,0 @@
-<.simple_form for={@form} id="school-form" phx-submit="save" phx-change="validate" class="mx-auto w-full max-w-sm">
- <.header icon="tabler-rocket"><%= dgettext("orgs", "Create school") %></.header>
-
- <.input type="text" field={@form[:name]} label={dgettext("orgs", "School name")} required />
- <.input type="email" field={@form[:email]} label={dgettext("orgs", "School email")} required />
-
- <.input
- type="text"
- field={@form[:slug]}
- label={dgettext("orgs", "Nickname")}
- helper={dgettext("orgs", "Choose a nickname to acess your school profile")}
- required
- />
-
- <.button icon="tabler-school" type="submit" phx-disable-with={gettext("Saving...")}>
- <%= dgettext("orgs", "Create school") %>
- </.button>
-</.simple_form>
--- lib/router.ex
@@ -51,8 +51,7 @@ defmodule UneebeeWeb.Router do
end
end
- # Requires authentication
- scope "/", UneebeeWeb.Live do
+ scope "/", UneebeeWeb.Live.Accounts.User do
pipe_through [:browser, :require_authenticated_user]
live_session :require_authenticated_user,
@@ -61,15 +60,13 @@ defmodule UneebeeWeb.Router do
{UneebeeWeb.Plugs.Translate, :set_locale_from_session},
UneebeeWeb.Plugs.ActivePage
] do
- live "/schools/new", Organizations.School.New
-
- live "/users/settings/email", Accounts.User.Settings, :email
- live "/users/settings/language", Accounts.User.Settings, :language
- live "/users/settings/name", Accounts.User.Settings, :name
- live "/users/settings/password", Accounts.User.Settings, :password
- live "/users/settings/username", Accounts.User.Settings, :username
+ live "/users/settings/email", Settings, :email
+ live "/users/settings/language", Settings, :language
+ live "/users/settings/name", Settings, :name
+ live "/users/settings/password", Settings, :password
+ live "/users/settings/username", Settings, :username
- live "/users/settings/confirm_email/:token", Accounts.User.Settings, :confirm_email
+ live "/users/settings/confirm_email/:token", Settings, :confirm_email
end
end
--- priv/gettext/default.pot
@@ -58,7 +58,6 @@ msgstr ""
#: lib/accounts/user_live/registration.html.heex:36
#: lib/accounts/user_live/settings.html.heex:49
-#: lib/organizations/school_live/new.html.heex:15
#, elixir-autogen, elixir-format
msgid "Saving..."
msgstr ""
--- priv/gettext/en/LC_MESSAGES/default.po
@@ -58,7 +58,6 @@ msgstr ""
#: lib/accounts/user_live/registration.html.heex:36
#: lib/accounts/user_live/settings.html.heex:49
-#: lib/organizations/school_live/new.html.heex:15
#, elixir-autogen, elixir-format
msgid "Saving..."
msgstr ""
--- priv/gettext/en/LC_MESSAGES/errors.po
@@ -149,6 +149,11 @@ msgstr ""
msgid "must have the @ sign and no spaces"
msgstr ""
+#: lib/organizations/school_schema.ex:53
+#, elixir-autogen, elixir-format
+msgid "cannot be changed"
+msgstr ""
+
#: lib/organizations/school_schema.ex:35
#, elixir-autogen, elixir-format
msgid "must start with / or https://"
--- priv/gettext/en/LC_MESSAGES/orgs.po
@@ -1,49 +0,0 @@
-## "msgid"s in this file come from POT (.pot) files.
-###
-### Do not add, change, or remove "msgid"s manually here as
-### they're tied to the ones in the corresponding POT file
-### (with the same domain).
-###
-### Use "mix gettext.extract --merge" or "mix gettext.merge"
-### to merge POT files into PO files.
-msgid ""
-msgstr ""
-"Language: en\n"
-"Plural-Forms: nplurals=2; plural=(n != 1);\n"
-
-#: lib/organizations/school_live/new.html.heex:11
-#, elixir-autogen, elixir-format
-msgid "Choose a nickname to acess your school profile"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:13
-#: lib/organizations/school_live/new.html.heex:2
-#: lib/organizations/school_live/new.html.heex:16
-#, elixir-autogen, elixir-format
-msgid "Create school"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:10
-#, elixir-autogen, elixir-format
-msgid "Nickname"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:46
-#, elixir-autogen, elixir-format
-msgid "School could not be created"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:40
-#, elixir-autogen, elixir-format
-msgid "School created successfully"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:5
-#, elixir-autogen, elixir-format
-msgid "School email"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:4
-#, elixir-autogen, elixir-format
-msgid "School name"
-msgstr ""
--- priv/gettext/errors.pot
@@ -146,6 +146,11 @@ msgstr ""
msgid "must have the @ sign and no spaces"
msgstr ""
+#: lib/organizations/school_schema.ex:53
+#, elixir-autogen, elixir-format
+msgid "cannot be changed"
+msgstr ""
+
#: lib/organizations/school_schema.ex:35
#, elixir-autogen, elixir-format
msgid "must start with / or https://"
--- priv/gettext/orgs.pot
@@ -1,49 +0,0 @@
-## This file is a PO Template file.
-##
-## "msgid"s here are often extracted from source code.
-## Add new messages manually only if they're dynamic
-## messages that can't be statically extracted.
-##
-## Run "mix gettext.extract" to bring this file up to
-## date. Leave "msgstr"s empty as changing them here has no
-## effect: edit them in PO (.po) files instead.
-#
-msgid ""
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:11
-#, elixir-autogen, elixir-format
-msgid "Choose a nickname to acess your school profile"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:13
-#: lib/organizations/school_live/new.html.heex:2
-#: lib/organizations/school_live/new.html.heex:16
-#, elixir-autogen, elixir-format
-msgid "Create school"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:10
-#, elixir-autogen, elixir-format
-msgid "Nickname"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:46
-#, elixir-autogen, elixir-format
-msgid "School could not be created"
-msgstr ""
-
-#: lib/organizations/school_live/new.ex:40
-#, elixir-autogen, elixir-format
-msgid "School created successfully"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:5
-#, elixir-autogen, elixir-format
-msgid "School email"
-msgstr ""
-
-#: lib/organizations/school_live/new.html.heex:4
-#, elixir-autogen, elixir-format
-msgid "School name"
-msgstr ""
--- priv/gettext/pt/LC_MESSAGES/default.po
@@ -58,7 +58,6 @@ msgstr "Oops, algo deu errado! Por favor, verifique os erros abaixo."
#: lib/accounts/user_live/registration.html.heex:36
#: lib/accounts/user_live/settings.html.heex:49
-#: lib/organizations/school_live/new.html.heex:15
#, elixir-autogen, elixir-format
msgid "Saving..."
msgstr "Salvando..."
--- priv/gettext/pt/LC_MESSAGES/errors.po
@@ -139,6 +139,11 @@ msgstr "não é igual à senha"
msgid "must have the @ sign and no spaces"
msgstr "deve ter arroba (@) e nenhum espaço"
+#: lib/organizations/school_schema.ex:53
+#, elixir-autogen, elixir-format
+msgid "cannot be changed"
+msgstr "não pode ser alterado"
+
#: lib/organizations/school_schema.ex:35
#, elixir-autogen, elixir-format
msgid "must start with / or https://"
--- priv/gettext/pt/LC_MESSAGES/orgs.po
@@ -1,49 +0,0 @@
-## "msgid"s in this file come from POT (.pot) files.
-###
-### Do not add, change, or remove "msgid"s manually here as
-### they're tied to the ones in the corresponding POT file
-### (with the same domain).
-###
-### Use "mix gettext.extract --merge" or "mix gettext.merge"
-### to merge POT files into PO files.
-msgid ""
-msgstr ""
-"Language: pt\n"
-"Plural-Forms: nplurals=2; plural=(n != 1);\n"
-
-#: lib/organizations/school_live/new.html.heex:11
-#, elixir-autogen, elixir-format
-msgid "Choose a nickname to acess your school profile"
-msgstr "Escolha um apelido para acessar o perfil da sua escola"
-
-#: lib/organizations/school_live/new.ex:13
-#: lib/organizations/school_live/new.html.heex:2
-#: lib/organizations/school_live/new.html.heex:16
-#, elixir-autogen, elixir-format
-msgid "Create school"
-msgstr "Criar escola"
-
-#: lib/organizations/school_live/new.html.heex:10
-#, elixir-autogen, elixir-format
-msgid "Nickname"
-msgstr "Apelido"
-
-#: lib/organizations/school_live/new.ex:46
-#, elixir-autogen, elixir-format
-msgid "School could not be created"
-msgstr "Não foi possível criar a escola"
-
-#: lib/organizations/school_live/new.ex:40
-#, elixir-autogen, elixir-format
-msgid "School created successfully"
-msgstr "Escolda criada com sucesso"
-
-#: lib/organizations/school_live/new.html.heex:5
-#, elixir-autogen, elixir-format
-msgid "School email"
-msgstr "E-mail da escola"
-
-#: lib/organizations/school_live/new.html.heex:4
-#, elixir-autogen, elixir-format
-msgid "School name"
-msgstr "Nome da escola"
--- test/organizations/organizations_context_test.exs
@@ -139,15 +139,4 @@ defmodule Uneebee.OrganizationsTest do
assert {:error, %Ecto.Changeset{}} = Organizations.update_school(school, invalid_attrs)
end
end
-
- describe "get_school_by_slug!/1" do
- test "returns the school with given id" do
- school = school_fixture()
- assert Organizations.get_school_by_slug!(school.slug) == school
- end
-
- test "raises an error if the school doesn't exist" do
- assert_raise Ecto.NoResultsError, fn -> Organizations.get_school_by_slug!("invalid") end
- end
- end
end
--- test/organizations/school_new_live_test.exs
@@ -1,49 +0,0 @@
-defmodule UneebeeWeb.NewSchoolLiveTest do
- use UneebeeWeb.ConnCase, async: true
-
- import Phoenix.LiveViewTest
- import Uneebee.Fixtures.Organizations
-
- alias Uneebee.Organizations
-
- @school_form "#school-form"
-
- describe "New school page (authenticated users)" do
- setup :register_and_log_in_user
-
- test "creates a school", %{conn: conn, user: user} do
- attrs = valid_school_attributes()
-
- {:ok, lv, _html} = live(conn, ~p"/schools/new")
-
- {:ok, _lv, html} =
- lv
- |> form(@school_form, school: %{name: attrs.name, email: attrs.email, slug: attrs.slug})
- |> render_submit()
- |> follow_redirect(conn, ~p"/")
-
- assert html =~ "School created successfully"
-
- school = Organizations.get_school_by_slug!(attrs.slug)
- assert school.created_by_id == user.id
- assert school.name == attrs.name
- end
-
- test "renders error when slug is duplicated", %{conn: conn} do
- existing_school = school_fixture()
- {:ok, lv, _html} = live(conn, ~p"/schools/new")
- assert field_change(lv, %{slug: existing_school.slug}) =~ "has already been taken"
- end
- end
-
- describe "New school page (non-authenticated users)" do
- test "redirects to login page", %{conn: conn} do
- result = get(conn, ~p"/schools/new")
- assert redirected_to(result) =~ "/login"
- end
- end
-
- defp field_change(lv, changes) do
- lv |> element(@school_form) |> render_change(school: changes)
- end
-end
|
uneebee
|
zoonk
|
Elixir
|
Elixir
| 1,339
| 83
|
Platform for creating interactive courses.
|
zoonk_uneebee
|
NEW_FEAT
|
Large additions with few deletions
|
0ff6ae6042a60450d3086d21073f902e5ad3d726
|
2025-02-24 20:41:38
|
Eser DENIZ
|
fix: deprecated error in tests (#501)
| false
| 33
| 23
| 56
|
--- tests/DTOs/QueueWorkerTest.php
@@ -17,10 +17,8 @@ test('the factory method generates an array of config objects for several format
)->queuesToConsume->toBe(['default']
);
- expect(Arr::first(array_filter($configObject,
- fn (QueueConfig $config) => $config->alias === $worker)))->memoryLimit->toBe(128);
- expect(Arr::first(array_filter($configObject,
- fn (QueueConfig $config) => $config->alias === $worker)))->timeout->toBe(60);
+ expect(Arr::first(array_filter($configObject, fn (QueueConfig $config) => $config->alias === $worker)))->memoryLimit->toBe(128);
+ expect(Arr::first(array_filter($configObject, fn (QueueConfig $config) => $config->alias === $worker)))->timeout->toBe(60);
continue;
}
@@ -31,45 +29,37 @@ test('the factory method generates an array of config objects for several format
)->queuesToConsume->toBe($worker['queues'] ?? ['default']
);
- expect(Arr::first(array_filter($configObject,
- fn (QueueConfig $config) => $config->alias === $alias)))->memoryLimit->toBe($worker['memory_limit'] ?? 128);
- expect(Arr::first(array_filter($configObject,
- fn (QueueConfig $config) => $config->alias === $alias)))->timeout->toBe($worker['timeout'] ?? 60);
+ expect(Arr::first(array_filter($configObject, fn (QueueConfig $config) => $config->alias === $alias)))->memoryLimit->toBe($worker['memory_limit'] ?? 128);
+ expect(Arr::first(array_filter($configObject, fn (QueueConfig $config) => $config->alias === $alias)))->timeout->toBe($worker['timeout'] ?? 60);
}
})->with([
[
- [
- 'queue_workers' => [
- 'some_worker' => [
- 'queues' => ['default'],
- 'memory_limit' => 64,
- 'timeout' => 60,
- ],
+ 'queue_workers' => [
+ 'some_worker' => [
+ 'queues' => ['default'],
+ 'memory_limit' => 64,
+ 'timeout' => 60,
],
],
],
[
- [
- 'queue_workers' => [
- 'some_worker' => [],
- 'another_worker' => [],
- ],
+ 'queue_workers' => [
+ 'some_worker' => [],
+ 'another_worker' => [],
],
],
[
- [
- 'queue_workers' => [
- 'some_worker' => [
- ],
- 'another_worker' => [
- 'queues' => ['default', 'another'],
- ],
- 'yet_another_worker' => [
- 'memory_limit' => 256,
- ],
- 'one_more_worker' => [
- 'timeout' => 120,
- ],
+ 'queue_workers' => [
+ 'some_worker' => [
+ ],
+ 'another_worker' => [
+ 'queues' => ['default', 'another'],
+ ],
+ 'yet_another_worker' => [
+ 'memory_limit' => 256,
+ ],
+ 'one_more_worker' => [
+ 'timeout' => 120,
],
],
],
|
laravel
|
nativephp
|
PHP
|
PHP
| 3,498
| 182
|
Laravel wrapper for the NativePHP framework
|
nativephp_laravel
|
CODE_IMPROVEMENT
|
Obvious
|
58d33989365338dd9e3ae1f09a7120be78676c11
|
2023-01-03 23:38:58
|
Robert Hofer
|
Remove abandoned project pyannotate
| false
| 0
| 1
| 1
|
--- README.md
@@ -260,6 +260,7 @@ Inspired by [awesome-php](https://github.com/ziadoz/awesome-php).
* [typeshed](https://github.com/python/typeshed) - Collection of library stubs for Python, with static types.
* Static Type Annotations Generators
* [MonkeyType](https://github.com/Instagram/MonkeyType) - A system for Python that generates static type annotations by collecting runtime types.
+ * [pyannotate](https://github.com/dropbox/pyannotate) - Auto-generate PEP-484 annotations.
* [pytype](https://github.com/google/pytype) - Pytype checks and infers types for Python code - without requiring type annotations.
## Command-line Interface Development
|
awesome-python
|
vinta
|
Python
|
Python
| 236,071
| 25,368
|
An opinionated list of awesome Python frameworks, libraries, software and resources.
|
vinta_awesome-python
|
DOC_CHANGE
|
Obvious
|
c8804bff6d1616a23547eaff3f91b75200a3319a
|
2025-02-08 03:47:10
|
yongruilin
|
fix: flagz endpoint to return parsed flags value
| false
| 9
| 1
| 10
|
--- cmd/kube-apiserver/app/options/completion.go
@@ -57,7 +57,7 @@ func (s *ServerRunOptions) Complete(ctx context.Context) (CompletedOptions, erro
if err != nil {
return CompletedOptions{}, err
}
- controlplane, err := s.Options.Complete(ctx, *s.ParsedFlags, []string{"kubernetes.default.svc", "kubernetes.default", "kubernetes"}, []net.IP{apiServerServiceIP})
+ controlplane, err := s.Options.Complete(ctx, s.Flags(), []string{"kubernetes.default.svc", "kubernetes.default", "kubernetes"}, []net.IP{apiServerServiceIP})
if err != nil {
return CompletedOptions{}, err
}
--- cmd/kube-apiserver/app/options/options.go
@@ -41,8 +41,6 @@ type ServerRunOptions struct {
CloudProvider *kubeoptions.CloudProviderOptions
Extra
- // ParsedFlags hold the parsed CLI flags.
- ParsedFlags *cliflag.NamedFlagSets
}
type Extra struct {
@@ -102,9 +100,6 @@ func NewServerRunOptions() *ServerRunOptions {
// Flags returns flags for a specific APIServer by section name
func (s *ServerRunOptions) Flags() (fss cliflag.NamedFlagSets) {
- if s.ParsedFlags != nil {
- return *s.ParsedFlags
- }
s.Options.AddFlags(&fss)
s.CloudProvider.AddFlags(fss.FlagSet("cloud provider"))
@@ -161,6 +156,5 @@ func (s *ServerRunOptions) Flags() (fss cliflag.NamedFlagSets) {
"The number of apiservers running in the cluster, must be a positive number. (In use when --endpoint-reconciler-type=master-count is enabled.)")
fs.MarkDeprecated("apiserver-count", "apiserver-count is deprecated and will be removed in a future version.")
- s.ParsedFlags = &fss
return fss
}
--- cmd/kube-apiserver/app/options/options_test.go
@@ -335,7 +335,6 @@ func TestAddFlags(t *testing.T) {
CloudConfigFile: "/cloud-config",
CloudProvider: "azure",
},
- ParsedFlags: s.ParsedFlags,
}
expected.Authentication.OIDC.UsernameClaim = "sub"
--- cmd/kube-apiserver/app/server.go
@@ -124,7 +124,6 @@ cluster's shared state through which all other components interact.`,
fs := cmd.Flags()
namedFlagSets := s.Flags()
- s.ParsedFlags = &namedFlagSets
verflag.AddFlags(namedFlagSets.FlagSet("global"))
globalflag.AddGlobalFlags(namedFlagSets.FlagSet("global"), cmd.Name(), logs.SkipLoggingConfigurationFlags())
options.AddCustomGlobalFlags(namedFlagSets.FlagSet("generic"))
|
kubernetes
|
kubernetes
|
Go
|
Go
| 113,460
| 40,344
|
Production-Grade Container Scheduling and Management
|
kubernetes_kubernetes
|
CONFIG_CHANGE
|
Very small changes (a generic flag-parsing sketch follows this record)
|
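To illustrate the pitfall behind this fix — a generic sketch, not the Kubernetes code: it uses only the standard library flag package and a made-up -port flag — a snapshot of flag values taken before parsing only ever shows defaults, which is what a flagz-style debug endpoint would then wrongly report.

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	port := flag.Int("port", 8080, "listen port")

	// Snapshot taken *before* parsing: it only ever reflects defaults,
	// which is the kind of stale value a /flagz-style endpoint would serve.
	before := fmt.Sprintf("port=%d", *port)

	flag.Parse() // e.g. run with: go run main.go -port 9090

	// Snapshot taken *after* parsing reflects what the user actually set.
	after := fmt.Sprintf("port=%d", *port)

	fmt.Println("before parse:", before)
	fmt.Println("after parse: ", after)
}
```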
eed0058eb51222f09c358c2305daa07e14132c48
|
2025-03-20 03:40:12
|
Weiyi Wang
|
Support int64 mapping in litert model <-> tflite fb PiperOrigin-RevId: 738542686
| false
| 4
| 0
| 4
|
--- tensorflow/lite/experimental/litert/core/model/flatbuffer_to_litert.cc
@@ -93,8 +93,6 @@ LiteRtElementType MapElementType(TflElementType type) {
return kLiteRtElementTypeFloat16;
case tflite::TensorType_INT32:
return kLiteRtElementTypeInt32;
- case tflite::TensorType_INT64:
- return kLiteRtElementTypeInt64;
case tflite::TensorType_BOOL:
return kLiteRtElementTypeBool;
case tflite::TensorType_INT16:
--- tensorflow/lite/experimental/litert/core/model/litert_to_flatbuffer.cc
@@ -38,8 +38,6 @@ Expected<TflElementType> MapElementType(LiteRtElementType litert_element_type) {
return tflite::TensorType_FLOAT16;
case kLiteRtElementTypeInt32:
return tflite::TensorType_INT32;
- case kLiteRtElementTypeInt64:
- return tflite::TensorType_INT64;
case kLiteRtElementTypeBool:
return tflite::TensorType_BOOL;
case kLiteRtElementTypeInt16:
|
tensorflow
|
tensorflow
|
C++
|
C++
| 188,388
| 74,565
|
An Open Source Machine Learning Framework for Everyone
|
nan_tensorflow
|
NEW_FEAT
|
Support added for INT64 (a type-mapping sketch follows this record)
|
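The change above follows a common pattern: element types are converted between two enum sets by a pair of switch statements, so supporting a new type means adding a case in both directions. A hypothetical Go sketch of that pattern (the type names here are illustrative, not the LiteRT or TFLite API):

```go
package main

import (
	"errors"
	"fmt"
)

// Two illustrative enums standing in for the flatbuffer and runtime type sets.
type TflType int
type RtType int

const (
	TflInt32 TflType = iota
	TflInt64
	TflBool
)

const (
	RtInt32 RtType = iota
	RtInt64
	RtBool
)

// Supporting a new type (here INT64) means adding a case to *both* switches,
// which is exactly the shape of the commit above.
func tflToRt(t TflType) (RtType, error) {
	switch t {
	case TflInt32:
		return RtInt32, nil
	case TflInt64:
		return RtInt64, nil
	case TflBool:
		return RtBool, nil
	default:
		return 0, errors.New("unsupported tfl type")
	}
}

func rtToTfl(t RtType) (TflType, error) {
	switch t {
	case RtInt32:
		return TflInt32, nil
	case RtInt64:
		return TflInt64, nil
	case RtBool:
		return TflBool, nil
	default:
		return 0, errors.New("unsupported runtime type")
	}
}

func main() {
	rt, _ := tflToRt(TflInt64)
	back, _ := rtToTfl(rt)
	fmt.Println(rt, back) // round-trips INT64 through both mappings
}
```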
8e9f92b64951f71f84225142b68f296edcbe8ef8
|
2025-02-21 18:40:39
|
Jiachi Liu
|
[dev-overlay] pick up build error message (#76290)
| false
| 128
| 52
| 180
|
--- packages/next/src/client/components/react-dev-overlay/_experimental/internal/container/build-error.tsx
@@ -1,5 +1,4 @@
-import React, { useCallback, useMemo } from 'react'
-import stripAnsi from 'next/dist/compiled/strip-ansi'
+import * as React from 'react'
import { Terminal } from '../components/terminal'
import { noop as css } from '../helpers/noop-template'
import { ErrorOverlayLayout } from '../components/errors/error-overlay-layout/error-overlay-layout'
@@ -9,41 +8,22 @@ export interface BuildErrorProps extends ErrorBaseProps {
message: string
}
-const getErrorTextFromBuildErrorMessage = (multiLineMessage: string) => {
- const lines = multiLineMessage.split('\n')
- // The multi-line build error message looks like:
- // <file path>:<line number>:<column number>
- // <error message>
- // <error code frame of compiler or bundler>
- // e.g.
- // ./path/to/file.js:1:1
- // SyntaxError: ...
- // > 1 | con st foo =
- // ...
- return stripAnsi(lines[1] || '')
-}
-
export const BuildError: React.FC<BuildErrorProps> = function BuildError({
message,
...props
}) {
- const noop = useCallback(() => {}, [])
+ const noop = React.useCallback(() => {}, [])
const error = new Error(message)
- const formattedMessage = useMemo(
- () => getErrorTextFromBuildErrorMessage(message) || 'Failed to compile',
- [message]
- )
-
return (
<ErrorOverlayLayout
errorType="Build Error"
- errorMessage={formattedMessage}
+ errorMessage="Failed to compile"
onClose={noop}
error={error}
footerMessage="This error occurred during the build process and can only be dismissed by fixing the error."
{...props}
>
- <Terminal content={message} />
+ <Terminal content={error.message} />
</ErrorOverlayLayout>
)
}
--- packages/next/src/client/components/react-dev-overlay/pages/hot-reloader-client.ts
@@ -195,7 +195,6 @@ function handleErrors(errors: any) {
})
// Only show the first error.
-
onBuildError(formatted.errors[0])
// Also log them to the console.
--- test/development/acceptance-app/error-recovery.test.ts
@@ -4,10 +4,9 @@ import { FileRef, nextTestSetup } from 'e2e-utils'
import { check, describeVariants as describe } from 'next-test-utils'
import path from 'path'
import { outdent } from 'outdent'
-import stripAnsi from 'strip-ansi'
describe.each(['default', 'turbo'])('Error recovery app %s', () => {
- const { next, isTurbopack } = nextTestSetup({
+ const { next } = nextTestSetup({
files: new FileRef(path.join(__dirname, 'fixtures', 'default-template')),
skipStart: true,
})
@@ -471,12 +470,8 @@ describe.each(['default', 'turbo'])('Error recovery app %s', () => {
)
const { session } = sandbox
await session.assertHasRedbox()
-
- const source = stripAnsi(await session.getRedboxSource(true))
- if (isTurbopack) {
- expect(source).toMatch(/Parsing ecmascript source code failed/)
- } else {
- expect(source).toMatch(/x Expected '}', got '<eof>'/)
- }
+ await expect(session.getRedboxSource(true)).resolves.toMatch(
+ /Failed to compile/
+ )
})
})
--- test/development/app-dir/dynamic-io-dev-errors/dynamic-io-dev-errors.test.ts
@@ -142,7 +142,7 @@ describe('Dynamic IO Dev Errors', () => {
await expect(browser).toDisplayRedbox(`
{
"count": 1,
- "description": "Ecmascript file had an error",
+ "description": "Failed to compile",
"environmentLabel": null,
"label": "Build Error",
"source": "./app/page.tsx (1:14)
@@ -156,7 +156,7 @@ describe('Dynamic IO Dev Errors', () => {
await expect(browser).toDisplayRedbox(`
{
"count": 1,
- "description": "Error: x Route segment config "revalidate" is not compatible with \`nextConfig.experimental.dynamicIO\`. Please remove it.",
+ "description": "Failed to compile",
"environmentLabel": null,
"label": "Build Error",
"source": "./app/page.tsx
--- test/development/app-dir/server-component-next-dynamic-ssr-false/server-component-next-dynamic-ssr-false.test.ts
@@ -18,10 +18,9 @@ describe('app-dir - server-component-next-dynamic-ssr-false', () => {
source: await getRedboxSource(browser),
}
+ expect(redbox.description).toBe('Failed to compile')
+
if (process.env.TURBOPACK) {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
expect(redbox.source).toMatchInlineSnapshot(`
"./app/page.js (3:23)
Ecmascript file had an error
@@ -36,9 +35,6 @@ describe('app-dir - server-component-next-dynamic-ssr-false', () => {
\`ssr: false\` is not allowed with \`next/dynamic\` in Server Components. Please move it into a client component."
`)
} else {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Error: x \`ssr: false\` is not allowed with \`next/dynamic\` in Server Components. Please move it into a client component."`
- )
expect(redbox.source).toMatchInlineSnapshot(`
"./app/page.js
Error: x \`ssr: false\` is not allowed with \`next/dynamic\` in Server Components. Please move it into a client component.
--- test/development/app-dir/ssr-in-rsc/ssr-in-rsc.test.ts
@@ -283,7 +283,7 @@ describe('react-dom/server in React Server environment', () => {
if (isTurbopack) {
expect(redbox).toMatchInlineSnapshot(`
{
- "description": "Ecmascript file had an error",
+ "description": "Failed to compile",
"source": "./app/exports/app-code/react-dom-server-edge-implicit/page.js (3:1)
Ecmascript file had an error
1 | import * as ReactDOMServerEdge from 'react-dom/server'
@@ -378,7 +378,7 @@ describe('react-dom/server in React Server environment', () => {
if (isTurbopack) {
expect(redbox).toMatchInlineSnapshot(`
{
- "description": "Ecmascript file had an error",
+ "description": "Failed to compile",
"source": "./app/exports/app-code/react-dom-server-node-implicit/page.js (3:1)
Ecmascript file had an error
1 | import * as ReactDOMServerNode from 'react-dom/server'
@@ -395,29 +395,29 @@ describe('react-dom/server in React Server environment', () => {
`)
} else {
expect(redbox).toMatchInlineSnapshot(`
- {
- "description": "Error: x You're importing a component that imports react-dom/server. To fix it, render or return the content directly as a Server Component instead for perf and security.",
- "source": "./app/exports/app-code/react-dom-server-node-implicit/page.js
- Error: x You're importing a component that imports react-dom/server. To fix it, render or return the content directly as a Server Component instead for perf and security.
- | Learn more: https://nextjs.org/docs/app/building-your-application/rendering
- ,-[1:1]
- 1 | import * as ReactDOMServerNode from 'react-dom/server'
- : ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- 2 | // Fine to drop once React is on ESM
- 3 | import ReactDOMServerNodeDefault from 'react-dom/server'
- \`----
- x You're importing a component that imports react-dom/server. To fix it, render or return the content directly as a Server Component instead for perf and security.
- | Learn more: https://nextjs.org/docs/app/building-your-application/rendering
- ,-[3:1]
- 1 | import * as ReactDOMServerNode from 'react-dom/server'
- 2 | // Fine to drop once React is on ESM
- 3 | import ReactDOMServerNodeDefault from 'react-dom/server'
- : ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- 4 |
- 5 | export const runtime = 'nodejs'
- \`----",
- }
- `)
+ {
+ "description": "Failed to compile",
+ "source": "./app/exports/app-code/react-dom-server-node-implicit/page.js
+ Error: x You're importing a component that imports react-dom/server. To fix it, render or return the content directly as a Server Component instead for perf and security.
+ | Learn more: https://nextjs.org/docs/app/building-your-application/rendering
+ ,-[1:1]
+ 1 | import * as ReactDOMServerNode from 'react-dom/server'
+ : ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ 2 | // Fine to drop once React is on ESM
+ 3 | import ReactDOMServerNodeDefault from 'react-dom/server'
+ \`----
+ x You're importing a component that imports react-dom/server. To fix it, render or return the content directly as a Server Component instead for perf and security.
+ | Learn more: https://nextjs.org/docs/app/building-your-application/rendering
+ ,-[3:1]
+ 1 | import * as ReactDOMServerNode from 'react-dom/server'
+ 2 | // Fine to drop once React is on ESM
+ 3 | import ReactDOMServerNodeDefault from 'react-dom/server'
+ : ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ 4 |
+ 5 | export const runtime = 'nodejs'
+ \`----",
+ }
+ `)
}
})
--- test/development/basic/hmr/error-recovery.test.ts
@@ -519,7 +519,7 @@ describe.each([
)
await assertHasRedbox(browser)
- expect(await getRedboxHeader(browser)).toMatch('Build Error')
+ expect(await getRedboxHeader(browser)).toMatch('Failed to compile')
if (process.env.TURBOPACK) {
expect(await getRedboxSource(browser)).toMatchInlineSnapshot(`
@@ -587,7 +587,7 @@ describe.each([
)
await assertHasRedbox(browser)
- expect(await getRedboxHeader(browser)).toMatch('Build Error')
+ expect(await getRedboxHeader(browser)).toMatch('Failed to compile')
let redboxSource = await getRedboxSource(browser)
redboxSource = redboxSource.replace(`${next.testDir}`, '.')
--- test/development/middleware-errors/index.test.ts
@@ -2,7 +2,6 @@ import {
assertHasRedbox,
assertNoRedbox,
check,
- getRedboxDescription,
getRedboxSource,
retry,
} from 'next-test-utils'
@@ -331,16 +330,9 @@ describe('middleware - development errors', () => {
it('renders the error correctly and recovers', async () => {
const browser = await next.browser('/')
await assertHasRedbox(browser)
- const description = await getRedboxDescription(browser)
- if (isTurbopack) {
- expect(description).toMatchInlineSnapshot(
- `"Parsing ecmascript source code failed"`
- )
- } else {
- expect(description).toMatchInlineSnapshot(
- `"Error: x Expected '{', got '}'"`
- )
- }
+ expect(
+ await browser.elementByCss('#nextjs__container_errors_desc').text()
+ ).toEqual('Failed to compile')
await next.patchFile('middleware.js', `export default function () {}`)
await assertNoRedbox(browser)
expect(await browser.elementByCss('#page-title')).toBeTruthy()
--- test/e2e/app-dir/dynamic-io-segment-configs/dynamic-io-segment-configs.test.ts
@@ -32,15 +32,7 @@ describe('dynamic-io-segment-configs', () => {
source: await getRedboxSource(browser),
}
- if (isTurbopack) {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
- } else {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Error: x Route segment config "revalidate" is not compatible with \`nextConfig.experimental.dynamicIO\`. Please remove it."`
- )
- }
+ expect(redbox.description).toMatchInlineSnapshot(`"Failed to compile"`)
expect(redbox.source).toContain(
'"revalidate" is not compatible with `nextConfig.experimental.dynamicIO`. Please remove it.'
)
@@ -93,15 +85,9 @@ describe('dynamic-io-segment-configs', () => {
source: await getRedboxSource(browser),
}
- if (isTurbopack) {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
- } else {
- expect(redbox.description).toMatchInlineSnapshot(
- `"Error: x Route segment config "runtime" is not compatible with \`nextConfig.experimental.dynamicIO\`. Please remove it."`
- )
- }
+ expect(redbox.description).toMatchInlineSnapshot(
+ `"Failed to compile"`
+ )
expect(redbox.source).toContain(
'"runtime" is not compatible with `nextConfig.experimental.dynamicIO`. Please remove it.'
)
--- test/e2e/app-dir/use-cache-segment-configs/use-cache-segment-configs.test.ts
@@ -26,11 +26,10 @@ describe('use-cache-segment-configs', () => {
await assertHasRedbox(browser)
const description = await getRedboxDescription(browser)
- expect(description).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
const source = await getRedboxSource(browser)
+ expect(description).toBe('Failed to compile')
+
expect(source).toMatchInlineSnapshot(`
"./app/runtime/page.tsx (1:14)
Ecmascript file had an error
--- test/e2e/app-dir/use-cache-unknown-cache-kind/use-cache-unknown-cache-kind.test.ts
@@ -100,15 +100,7 @@ describe('use-cache-unknown-cache-kind', () => {
const errorDescription = await getRedboxDescription(browser)
const errorSource = await getRedboxSource(browser)
- if (isTurbopack) {
- expect(errorDescription).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
- } else {
- expect(errorDescription).toMatchInlineSnapshot(
- `"Error: x Unknown cache kind "custom". Please configure a cache handler for this kind in the "experimental.cacheHandlers" object in your Next.js config."`
- )
- }
+ expect(errorDescription).toBe('Failed to compile')
if (isTurbopack) {
expect(errorSource).toMatchInlineSnapshot(`
--- test/e2e/app-dir/use-cache-without-experimental-flag/use-cache-without-experimental-flag.test.ts
@@ -84,15 +84,7 @@ describe('use-cache-without-experimental-flag', () => {
const errorDescription = await getRedboxDescription(browser)
const errorSource = await getRedboxSource(browser)
- if (isTurbopack) {
- expect(errorDescription).toMatchInlineSnapshot(
- `"Ecmascript file had an error"`
- )
- } else {
- expect(errorDescription).toMatchInlineSnapshot(
- `"Error: x To use "use cache", please enable the experimental feature flag "useCache" in your Next.js config."`
- )
- }
+ expect(errorDescription).toBe('Failed to compile')
if (isTurbopack) {
expect(errorSource).toMatchInlineSnapshot(`
--- test/integration/next-image-new/invalid-image-import/test/index.test.ts
@@ -4,7 +4,7 @@ import { join } from 'path'
import {
assertHasRedbox,
findPort,
- getRedboxDescription,
+ getRedboxHeader,
getRedboxSource,
killApp,
launchApp,
@@ -23,14 +23,7 @@ function runTests({ isDev }) {
if (isDev) {
const browser = await webdriver(appPort, '/')
await assertHasRedbox(browser)
- const description = await getRedboxDescription(browser)
- if (process.env.TURBOPACK) {
- expect(description).toMatchInlineSnapshot(`"Processing image failed"`)
- } else {
- expect(description).toMatchInlineSnapshot(
- `"Error: Image import "../public/invalid.svg" is not a valid image file. The image may be corrupted or an unsupported format."`
- )
- }
+ expect(await getRedboxHeader(browser)).toMatch('Failed to compile')
const source = await getRedboxSource(browser)
if (process.env.TURBOPACK) {
expect(source).toMatchInlineSnapshot(`
|
next.js
|
vercel
|
JavaScript
|
JavaScript
| 129,891
| 27,821
|
The React Framework
|
vercel_next.js
|
BUG_FIX
|
This commit fixes and polishes an earlier feature (a message-extraction sketch follows this record)
|
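The helper removed in this diff summarized a multi-line build error whose first line is `<file>:<line>:<column>`, whose second line is the message, and whose remaining lines are a code frame. A small Go sketch of that extraction idea (an illustration under those assumptions, not the Next.js implementation, and it skips the ANSI-stripping step):

```go
package main

import (
	"fmt"
	"strings"
)

// summaryLine returns the human-readable message line from a build error that
// follows the "<file>:<line>:<col>\n<message>\n<code frame>" layout, falling
// back to a generic label when the message line is missing.
func summaryLine(raw string) string {
	lines := strings.Split(raw, "\n")
	if len(lines) > 1 && strings.TrimSpace(lines[1]) != "" {
		return strings.TrimSpace(lines[1])
	}
	return "Failed to compile"
}

func main() {
	err := "./path/to/file.js:1:1\nSyntaxError: Unexpected token\n> 1 | con st foo ="
	fmt.Println(summaryLine(err)) // SyntaxError: Unexpected token
}
```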
469ef576b1ae12711e72ea736bf1f3efa10bb495
|
2025-03-20 12:52:23
|
Benjamin Pasero
|
Revisit wording for users to setup Copilot (Free) (fix microsoft/vscode-copilot#14339) (#244085)
| false
| 1
| 1
| 2
|
--- src/vs/workbench/contrib/chat/browser/chatSetup.ts
@@ -467,7 +467,7 @@ class ChatSetup {
const markdown = this.instantiationService.createInstance(MarkdownRenderer, {});
// Header
- const header = localize({ key: 'headerDialog', comment: ['{Locked="[Copilot]({0})"}'] }, "[Copilot]({0}) is your AI pair programmer. Write code faster with completions, fix bugs and build new features across multiple files, and learn about your codebase through chat.", defaultChat.documentationUrl);
+ const header = localize({ key: 'headerDialog', comment: ['{Locked="[Copilot]({0})"}'] }, "[Copilot]({0}) is your AI pair programmer. It helps you code faster with Completions, build features with Copilot Edits, and explore your codebase with Chat.", defaultChat.documentationUrl);
element.appendChild($('p.setup-header', undefined, disposables.add(markdown.render(new MarkdownString(header, { isTrusted: true }))).element));
// Terms
|
vscode
|
microsoft
|
TypeScript
|
TypeScript
| 168,072
| 30,802
|
Visual Studio Code
|
microsoft_vscode
|
CODE_IMPROVEMENT
|
Only the wording shown to the user while setting up Copilot was changed
|
7814418429ca0d2830a46945b8ca84dbcc08f0b2
|
2023-07-16 02:00:56
|
Toshiaki Takeuchi
|
Adds GPT-4 support, etc. * adds support for GPT-4 models * adds Connection Timeout settings. You can add proxy via Web Preferences in MATLAB * adds support for stop sequences (max 4). * improves the description of the API error messages
| false
| 26
| 5
| 31
|
--- MatGPT.mlapp
Binary files a/MatGPT.mlapp and b/MatGPT.mlapp differ
--- README.md
@@ -8,13 +8,6 @@ MatGPT is a MATLAB app powered by chatGPT class that allows you to easily access
The app and class simply serve as an interface to the ChatGPT API. You should be familiar with the limitations and risks associated with using this technology as well as with [OpenAI terms and policies](https://openai.com/policies). You are responsible for any fees OpenAI may charge for the use of their API.
-## What's New
-
-* adds support for GPT-4 models [available to all API users who have a history of successful payments](https://openai.com/blog/gpt-4-api-general-availability)
-* adds Connection Timeout settings. You can add proxy via [Web Preferences](https://www.mathworks.com/help/matlab/ref/preferences.html) in MATLAB.
-* adds support for stop sequences (max 4).
-* improves the description of the API error messages
-
## Requirements
* **MathWorks Products (https://www.mathworks.com)**: To use MatGPT, you need to have MATLAB R2021a or later installed on your computer.
@@ -33,7 +26,7 @@ setenv("OPENAI_API_KEY","your key here")
## How to use: MatGPT app
-
+
1. Click on "+ New Chat" in the left nav to add a new chat. This opens the Settings tab.
2. In the Settings tab, either choose a preset to populate the settings or customize on your own. Once you have completed the settings, click "Start New Chat" to initiate a chat. This will take you back to the Main tab.
* Presets are loaded from [Presets.csv](contents/presets.csv) - feel free to customize your prompts.
--- helpers/chatGPT.m
@@ -36,9 +36,8 @@
arguments
options.model string {mustBeTextScalar, ...
mustBeMember(options.model, ...
- ["gpt-3.5-turbo","gpt-3.5-turbo-0613", ...
- "gpt-4","gpt-4-0613", ...
- "gpt-4-32k","gpt-4-32k-0613"])} = "gpt-3.5-turbo";
+ ["gpt-3.5-turbo","gpt-3.5-turbo-0301"])} = ...
+ "gpt-3.5-turbo";
options.role string {mustBeTextScalar} = ...
"You are a helpful assistant.";
options.max_tokens (1,1) double {mustBeNumeric, ...
--- helpers/chatter.m
@@ -10,7 +10,6 @@
arguments
obj
prompt string {mustBeTextScalar}
- options.timeout double {mustBeScalarOrEmpty}
options.stop string {mustBeText,mustBeNonzeroLengthText}
end
@@ -54,16 +53,8 @@
headers(2) = HeaderField('Authorization', "Bearer " + api_key);
% the request message
request = RequestMessage('post',headers,query);
-
- % Create a HTTPOptions object; set proxy in MATLAB Web Preferences if needed
- httpOpts = matlab.net.http.HTTPOptions;
- % Set the ConnectTimeout option to 30 seconds
- if isfield(options,'timeout') && options.timeout > 0
- httpOpts.ConnectTimeout = options.timeout;
- end
% send the request and store the response
- response = send(request, URI(obj.api_endpoint),httpOpts);
-
+ response = send(request, URI(obj.api_endpoint));
% extract the response text
if response.StatusCode == "OK"
% extract text from the response
@@ -82,11 +73,7 @@
responseText = responseText + response.StatusLine.ReasonPhrase;
if string(response.StatusCode) == "401"
responseText = responseText + newline + "Check your API key.";
- responseText = responseText + newline + "You may have an invalid API key.";
- elseif string(response.StatusCode) == "404"
- responseText = responseText + newline + "You may not have access to the model.";
- elseif string(response.StatusCode) == "429"
- responseText = responseText + newline + "You exceeded the API limit. Your free trial for OpenAI API may have expired.";
+ responseText = responseText + newline + "Your free trial for OpenAI API may have expired.";
end
id = "chatter:invalidKey";
ME = MException(id,responseText);
|
matgpt
|
toshiakit
|
MATLAB
|
MATLAB
| 218
| 33
|
MATLAB app to access ChatGPT API from OpenAI
|
toshiakit_matgpt
|
DOC_CHANGE
|
Obvious
|
56efcfa2e7b755afc50327ec933bdeece1c5d7e9
|
2023-04-07 16:03:47
|
Kumar Arnav
|
Update link of Time, Clocks, and the Ordering of Events in a Distributed System (#735)
| false
| 1
| 1
| 2
|
--- concurrency/README.md
@@ -2,7 +2,7 @@
* [Everything You Always Wanted to Know About Synchronization but Were Afraid to Ask](http://sigops.org/sosp/sosp13/papers/p33-david.pdf)
-* [Time, Clocks, and the Ordering of Events in a Distributed System](https://lamport.azurewebsites.net/pubs/time-clocks.pdf)
+* [Time, Clocks, and the Ordering of Events in a Distributed System](http://lamport.azurewebsites.net/pubs/time-clocks.pdf)
* [Heap Architectures For Concurrent Languages Using Message Passing](http://www.fantasi.se/publications/ISMM02.pdf)
|
papers-we-love
|
papers-we-love
|
Shell
|
Shell
| 91,347
| 5,859
|
Papers from the computer science community to read and discuss.
|
papers-we-love_papers-we-love
|
CONFIG_CHANGE
|
Very small changes
|
7320efd0a908a2ee42bc77c4c89922226ccfc507
|
2024-03-13 18:45:56
|
Gregor Vostrak
|
fix tests, remove rate limit in non production environments
| false
| 10
| 9
| 19
|
--- app/Providers/RouteServiceProvider.php
@@ -27,10 +27,6 @@ class RouteServiceProvider extends ServiceProvider
public function boot(): void
{
RateLimiter::for('api', function (Request $request) {
- if (! $this->app->isProduction()) {
- return Limit::none();
- }
-
return Limit::perMinute(60)->by($request->user()?->id ?: $request->ip());
});
--- e2e/timetracker.spec.ts
@@ -364,7 +364,9 @@ test('test that adding a new tag works', async ({ page }) => {
);
});
await expect(page.getByTestId('tag_dropdown_search')).toHaveValue('');
- await expect(page.getByRole('option', { name: newTagName })).toBeVisible();
+ await expect(page.getByTestId('tag_dropdown_entries')).toHaveText(
+ newTagName
+ );
});
test('test that adding a new tag when the timer is running', async ({
@@ -387,8 +389,9 @@ test('test that adding a new tag when the timer is running', async ({
);
});
await expect(page.getByTestId('tag_dropdown_search')).toHaveValue('');
-
- await expect(page.getByRole('option', { name: newTagName })).toBeVisible();
+ await expect(page.getByTestId('tag_dropdown_entries')).toHaveText(
+ newTagName
+ );
await page.waitForResponse(async (response) => {
return (
--- package.json
@@ -7,7 +7,7 @@
"lint": "eslint --ext .js,.vue,.ts --ignore-path .gitignore .",
"lint:fix": "eslint --fix --ext .js,.vue,.ts --ignore-path .gitignore .",
"type-check": "vue-tsc --noEmit",
- "test:e2e": "rm -rf test-results/.auth && npx playwright test",
+ "test:e2e": "npx playwright test",
"generate:zod": "npx openapi-zod-client http://localhost:80/docs/api.json --output openapi.json.client.ts --base-url http://solidtime.test/api"
},
"devDependencies": {
--- playwright.config.ts
@@ -20,14 +20,14 @@ export default defineConfig({
/* Opt out of parallel tests on CI. */
workers: process.env.CI ? 1 : undefined,
/* Reporter to use. See https://playwright.dev/docs/test-reporters */
- reporter: process.env.CI ? 'github' : 'html',
+ reporter: process.env.CI ? 'list' : 'html',
/* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
use: {
/* Base URL to use in actions like `await page.goto('/')`. */
// baseURL: 'http://127.0.0.1:3000',
/* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
- trace: 'on',
+ trace: 'on-first-retry',
},
/* Configure projects for major browsers */
|
solidtime
|
solidtime-io
|
PHP
|
PHP
| 5,267
| 278
|
Modern open-source time-tracking app
|
solidtime-io_solidtime
|
CONFIG_CHANGE
|
Obvious (a rate-limiting sketch follows this record)
|
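The rate limiter touched in this diff allows 60 requests per minute keyed by user ID or client IP. A rough Go sketch of the same idea (an illustration only, not the solidtime code; it assumes the golang.org/x/time/rate module is available):

```go
package main

import (
	"fmt"
	"sync"
	"time"

	"golang.org/x/time/rate"
)

// keyedLimiter hands out one token-bucket limiter per key (user ID or IP),
// mirroring the "60 per minute, keyed by user or IP" rule in the diff above.
type keyedLimiter struct {
	mu       sync.Mutex
	limiters map[string]*rate.Limiter
}

func newKeyedLimiter() *keyedLimiter {
	return &keyedLimiter{limiters: make(map[string]*rate.Limiter)}
}

func (k *keyedLimiter) allow(key string) bool {
	k.mu.Lock()
	defer k.mu.Unlock()
	lim, ok := k.limiters[key]
	if !ok {
		// 60 requests per minute with a small burst allowance.
		lim = rate.NewLimiter(rate.Every(time.Minute/60), 10)
		k.limiters[key] = lim
	}
	return lim.Allow()
}

func main() {
	rl := newKeyedLimiter()
	for i := 0; i < 12; i++ {
		fmt.Println("request", i, "allowed:", rl.allow("203.0.113.7"))
	}
}
```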
cdf09805c042cc760397b08d9e7cf58fbf8f76a4
|
2024-03-01 05:22:34
|
Romain Vimont
|
Add missing initialization
| false
| 1
| 0
| 1
|
--- app/src/display.c
@@ -62,7 +62,6 @@ sc_display_init(struct sc_display *display, SDL_Window *window, bool mipmaps) {
LOGD("Trilinear filtering disabled (not an OpenGL renderer)");
}
- display->texture = NULL;
display->pending.flags = 0;
display->pending.frame = NULL;
|
scrcpy
|
genymobile
|
C
|
C
| 118,486
| 11,201
|
Display and control your Android device
|
genymobile_scrcpy
|
BUG_FIX
|
This commit fixes and polishes an earlier feature
|
c9afcbade7308cf66b67b9ce080f10b621b17c6a
|
2025-01-09 03:24:54
|
Robert Griesemer
|
go/types, types2: require iterator yield to return bool (work-around) The original implementation of the type checkers accepted any boolean result type for yield, but the compiler's front-end had a problem with it (#71131). As a temporary fix (for 1.24), adjust the type checkers to insist on the spec's literal wording and avoid the compiler panic. Fixes #71131. For #71164. Change-Id: Ie25f9a892e58b5e489d399b0bce2d0af55dc3c48 Reviewed-on: https://go-review.googlesource.com/c/go/+/640599 Reviewed-by: Robert Griesemer <[email protected]> Auto-Submit: Robert Griesemer <[email protected]> Reviewed-by: Tim King <[email protected]> LUCI-TryBot-Result: Go LUCI <[email protected]>
| false
| 34
| 5
| 39
|
--- src/cmd/compile/internal/types2/stmt.go
@@ -1057,13 +1057,8 @@ func rangeKeyVal(typ Type, allowVersion func(goVersion) bool) (key, val Type, ca
return bad("func must be func(yield func(...) bool): argument is not func")
case cb.Params().Len() > 2:
return bad("func must be func(yield func(...) bool): yield func has too many parameters")
- case cb.Results().Len() != 1 || !Identical(cb.Results().At(0).Type(), universeBool):
- // see go.dev/issues/71131, go.dev/issues/71164
- if cb.Results().Len() == 1 && isBoolean(cb.Results().At(0).Type()) {
- return bad("func must be func(yield func(...) bool): yield func returns user-defined boolean, not bool")
- } else {
- return bad("func must be func(yield func(...) bool): yield func does not return bool")
- }
+ case cb.Results().Len() != 1 || !isBoolean(cb.Results().At(0).Type()):
+ return bad("func must be func(yield func(...) bool): yield func does not return bool")
}
assert(cb.Recv() == nil)
// determine key and value types, if any
--- src/cmd/compile/internal/types2/universe.go
@@ -21,7 +21,6 @@ var Unsafe *Package
var (
universeIota Object
- universeBool Type
universeByte Type // uint8 alias, but has name "byte"
universeRune Type // int32 alias, but has name "rune"
universeAnyNoAlias *TypeName
@@ -276,7 +275,6 @@ func init() {
defPredeclaredFuncs()
universeIota = Universe.Lookup("iota")
- universeBool = Universe.Lookup("bool").Type()
universeByte = Universe.Lookup("byte").Type()
universeRune = Universe.Lookup("rune").Type()
universeError = Universe.Lookup("error").Type()
--- src/go/types/stmt.go
@@ -1075,13 +1075,8 @@ func rangeKeyVal(typ Type, allowVersion func(goVersion) bool) (key, val Type, ca
return bad("func must be func(yield func(...) bool): argument is not func")
case cb.Params().Len() > 2:
return bad("func must be func(yield func(...) bool): yield func has too many parameters")
- case cb.Results().Len() != 1 || !Identical(cb.Results().At(0).Type(), universeBool):
- // see go.dev/issues/71131, go.dev/issues/71164
- if cb.Results().Len() == 1 && isBoolean(cb.Results().At(0).Type()) {
- return bad("func must be func(yield func(...) bool): yield func returns user-defined boolean, not bool")
- } else {
- return bad("func must be func(yield func(...) bool): yield func does not return bool")
- }
+ case cb.Results().Len() != 1 || !isBoolean(cb.Results().At(0).Type()):
+ return bad("func must be func(yield func(...) bool): yield func does not return bool")
}
assert(cb.Recv() == nil)
// determine key and value types, if any
--- src/go/types/universe.go
@@ -24,7 +24,6 @@ var Unsafe *Package
var (
universeIota Object
- universeBool Type
universeByte Type // uint8 alias, but has name "byte"
universeRune Type // int32 alias, but has name "rune"
universeAnyNoAlias *TypeName
@@ -279,7 +278,6 @@ func init() {
defPredeclaredFuncs()
universeIota = Universe.Lookup("iota")
- universeBool = Universe.Lookup("bool").Type()
universeByte = Universe.Lookup("byte").Type()
universeRune = Universe.Lookup("rune").Type()
universeError = Universe.Lookup("error").Type()
--- src/internal/types/testdata/fixedbugs/issue71131.go
@@ -1,15 +0,0 @@
-// Copyright 2025 The Go Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style
-// license that can be found in the LICENSE file.
-
-package p
-
-func _() {
- type Bool bool
- for range func /* ERROR "yield func returns user-defined boolean, not bool" */ (func() Bool) {} {
- }
- for range func /* ERROR "yield func returns user-defined boolean, not bool" */ (func(int) Bool) {} {
- }
- for range func /* ERROR "yield func returns user-defined boolean, not bool" */ (func(int, string) Bool) {} {
- }
-}
--- src/internal/types/testdata/spec/range.go
@@ -5,7 +5,7 @@
package p
type MyInt int32
-type MyBool = bool // TODO(gri) remove alias declaration - see go.dev/issues/71131, go.dev/issues/71164
+type MyBool bool
type MyString string
type MyFunc1 func(func(int) bool)
type MyFunc2 func(int) bool
|
go
|
golang
|
Go
|
Go
| 126,191
| 17,926
|
The Go programming language
|
golang_go
|
BUG_FIX
|
Obvious (a range-over-func sketch follows this record)
|
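As an illustration of the rule this Go commit enforces (a sketch added alongside the dataset row, not taken from the Go repository): with range-over-func, the yield callback must return the predeclared bool, and a user-defined boolean type is now rejected. A minimal example, assuming Go 1.23 or later:

```go
package main

import "fmt"

// MyBool is a user-defined boolean type. Per the commit above, a yield
// callback returning MyBool is rejected with "yield func does not return bool".
type MyBool bool

// countTo is a valid range-over-func iterator: its yield returns the
// predeclared bool, so "for v := range countTo(3)" type-checks.
func countTo(n int) func(yield func(int) bool) {
	return func(yield func(int) bool) {
		for i := 1; i <= n; i++ {
			if !yield(i) {
				return
			}
		}
	}
}

func main() {
	for v := range countTo(3) { // accepted: yield returns bool
		fmt.Println(v)
	}

	// Rejected by the type checker after this change:
	//
	//   bad := func(yield func(int) MyBool) {}
	//   for range bad {
	//   }
}
```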
dd4dac26025f06e99afa983e68323a004c28622e
|
2025-03-24 11:53:42
|
Haochen Jiang
|
Revert "AVX10.2 ymm rounding: Support vsqrtp{s,d,h} and vsubp{s,d,h} intrins" This reverts commit 7f62e7104ebc11c4570745972a023579922ef265.
| false
| 0
| 447
| 447
|
--- gcc/config/i386/avx10_2roundingintrin.h
@@ -3986,6 +3986,216 @@ _mm256_maskz_scalef_round_ps (__mmask8 __U, __m256 __A, __m256 __B,
(__mmask8) __U,
__R);
}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sqrt_round_pd (__m256d __A, const int __R)
+{
+ return (__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) __A,
+ (__v4df)
+ _mm256_undefined_pd (),
+ (__mmask8) -1,
+ __R);
+}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sqrt_round_pd (__m256d __W, __mmask8 __U, __m256d __A,
+ const int __R)
+{
+ return (__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) __A,
+ (__v4df) __W,
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sqrt_round_pd (__mmask8 __U, __m256d __A, const int __R)
+{
+ return (__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) __A,
+ (__v4df)
+ _mm256_setzero_pd (),
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sqrt_round_ph (__m256h __A, const int __R)
+{
+ return (__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) __A,
+ (__v16hf)
+ _mm256_undefined_ph (),
+ (__mmask16) -1,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sqrt_round_ph (__m256h __W, __mmask16 __U, __m256h __A,
+ const int __R)
+{
+ return (__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) __A,
+ (__v16hf) __W,
+ (__mmask16) __U,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sqrt_round_ph (__mmask16 __U, __m256h __A, const int __R)
+{
+ return (__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) __A,
+ (__v16hf)
+ _mm256_setzero_ph (),
+ (__mmask16) __U,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sqrt_round_ps (__m256 __A, const int __R)
+{
+ return (__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) __A,
+ (__v8sf)
+ _mm256_undefined_ps (),
+ (__mmask8) -1,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sqrt_round_ps (__m256 __W, __mmask8 __U, __m256 __A,
+ const int __R)
+{
+ return (__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) __A,
+ (__v8sf) __W,
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sqrt_round_ps (__mmask8 __U, __m256 __A, const int __R)
+{
+ return (__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) __A,
+ (__v8sf)
+ _mm256_setzero_ps (),
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sub_round_pd (__m256d __A, __m256d __B, const int __R)
+{
+ return (__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) __A,
+ (__v4df) __B,
+ (__v4df)
+ _mm256_undefined_pd (),
+ (__mmask8) -1,
+ __R);
+}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sub_round_pd (__m256d __W, __mmask8 __U, __m256d __A,
+ __m256d __B, const int __R)
+{
+ return (__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) __A,
+ (__v4df) __B,
+ (__v4df) __W,
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256d
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sub_round_pd (__mmask8 __U, __m256d __A, __m256d __B,
+ const int __R)
+{
+ return (__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) __A,
+ (__v4df) __B,
+ (__v4df)
+ _mm256_setzero_pd (),
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sub_round_ph (__m256h __A, __m256h __B, const int __R)
+{
+ return (__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) __A,
+ (__v16hf) __B,
+ (__v16hf)
+ _mm256_undefined_ph (),
+ (__mmask16) -1,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sub_round_ph (__m256h __W, __mmask16 __U, __m256h __A,
+ __m256h __B, const int __R)
+{
+ return (__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) __A,
+ (__v16hf) __B,
+ (__v16hf) __W,
+ (__mmask16) __U,
+ __R);
+}
+
+extern __inline __m256h
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sub_round_ph (__mmask16 __U, __m256h __A, __m256h __B,
+ const int __R)
+{
+ return (__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) __A,
+ (__v16hf) __B,
+ (__v16hf)
+ _mm256_setzero_ph (),
+ (__mmask16) __U,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_sub_round_ps (__m256 __A, __m256 __B, const int __R)
+{
+ return (__m256) __builtin_ia32_subps256_mask_round ((__v8sf) __A,
+ (__v8sf) __B,
+ (__v8sf)
+ _mm256_undefined_ps (),
+ (__mmask8) -1,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_mask_sub_round_ps (__m256 __W, __mmask8 __U, __m256 __A, __m256 __B,
+ const int __R)
+{
+ return (__m256) __builtin_ia32_subps256_mask_round ((__v8sf) __A,
+ (__v8sf) __B,
+ (__v8sf) __W,
+ (__mmask8) __U,
+ __R);
+}
+
+extern __inline __m256
+__attribute__ ((__gnu_inline__, __always_inline__, __artificial__))
+_mm256_maskz_sub_round_ps (__mmask8 __U, __m256 __A, __m256 __B,
+ const int __R)
+{
+ return (__m256) __builtin_ia32_subps256_mask_round ((__v8sf) __A,
+ (__v8sf) __B,
+ (__v8sf)
+ _mm256_setzero_ps (),
+ (__mmask8) __U,
+ __R);
+}
#else
#define _mm256_add_round_pd(A, B, R) \
((__m256d) __builtin_ia32_addpd256_mask_round ((__v4df) (A), \
@@ -6072,6 +6282,135 @@ _mm256_maskz_scalef_round_ps (__mmask8 __U, __m256 __A, __m256 __B,
(_mm256_setzero_ps ()), \
(__mmask8) (U), \
(R)))
+
+#define _mm256_sqrt_round_pd(A, R) \
+ ((__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) (A), \
+ (__v4df) \
+ (_mm256_undefined_pd ()), \
+ (__mmask8) (-1), \
+ (R)))
+
+#define _mm256_mask_sqrt_round_pd(W, U, A, R) \
+ ((__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) (A), \
+ (__v4df) (W), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_maskz_sqrt_round_pd(U, A, R) \
+ ((__m256d) __builtin_ia32_sqrtpd256_mask_round ((__v4df) (A), \
+ (__v4df) \
+ (_mm256_setzero_pd ()), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_sqrt_round_ph(A, R) \
+ ((__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) (A), \
+ (__v16hf) \
+ (_mm256_undefined_ph ()), \
+ (__mmask16) (-1), \
+ (R)))
+
+#define _mm256_mask_sqrt_round_ph(W, U, A, R) \
+ ((__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) (A), \
+ (__v16hf) (W), \
+ (__mmask16) (U), \
+ (R)))
+
+#define _mm256_maskz_sqrt_round_ph(U, A, R) \
+ ((__m256h) __builtin_ia32_sqrtph256_mask_round ((__v16hf) (A), \
+ (__v16hf) \
+ (_mm256_setzero_ph ()), \
+ (__mmask16) (U), \
+ (R)))
+
+#define _mm256_sqrt_round_ps(A, R) \
+ ((__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) (A), \
+ (__v8sf) \
+ (_mm256_undefined_ps ()), \
+ (__mmask8) (-1), \
+ (R)))
+
+#define _mm256_mask_sqrt_round_ps(W, U, A, R) \
+ ((__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) (A), \
+ (__v8sf) (W), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_maskz_sqrt_round_ps(U, A, R) \
+ ((__m256) __builtin_ia32_sqrtps256_mask_round ((__v8sf) (A), \
+ (__v8sf) \
+ (_mm256_setzero_ps ()), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_sub_round_pd(A, B, R) \
+ ((__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) (A), \
+ (__v4df) (B), \
+ (__v4df) \
+ (_mm256_undefined_pd ()), \
+ (__mmask8) (-1), \
+ (R)))
+
+#define _mm256_mask_sub_round_pd(W, U, A, B, R) \
+ ((__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) (A), \
+ (__v4df) (B), \
+ (__v4df) (W), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_maskz_sub_round_pd(U, A, B, R) \
+ ((__m256d) __builtin_ia32_subpd256_mask_round ((__v4df) (A), \
+ (__v4df) (B), \
+ (__v4df) \
+ (_mm256_setzero_pd ()), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_sub_round_ph(A, B, R) \
+ ((__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) (A), \
+ (__v16hf) (B), \
+ (__v16hf) \
+ (_mm256_undefined_ph ()), \
+ (__mmask16) (-1), \
+ (R)))
+
+#define _mm256_mask_sub_round_ph(W, U, A, B, R) \
+ ((__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) (A), \
+ (__v16hf) (B), \
+ (__v16hf) (W), \
+ (__mmask16) (U), \
+ (R)))
+
+#define _mm256_maskz_sub_round_ph(U, A, B, R) \
+ ((__m256h) __builtin_ia32_subph256_mask_round ((__v16hf) (A), \
+ (__v16hf) (B), \
+ (__v16hf) \
+ (_mm256_setzero_ph ()), \
+ (__mmask16) (U), \
+ (R)))
+
+#define _mm256_sub_round_ps(A, B, R) \
+ ((__m256) __builtin_ia32_subps256_mask_round ((__v8sf) (A), \
+ (__v8sf) (B), \
+ (__v8sf) \
+ (_mm256_undefined_ps ()), \
+ (__mmask8) (-1), \
+ (R)))
+
+#define _mm256_mask_sub_round_ps(W, U, A, B, R) \
+ ((__m256) __builtin_ia32_subps256_mask_round ((__v8sf) (A), \
+ (__v8sf) (B), \
+ (__v8sf) (W), \
+ (__mmask8) (U), \
+ (R)))
+
+#define _mm256_maskz_sub_round_ps(U, A, B, R) \
+ ((__m256) __builtin_ia32_subps256_mask_round ((__v8sf) (A), \
+ (__v8sf) (B), \
+ (__v8sf) \
+ (_mm256_setzero_ps ()), \
+ (__mmask8) (U), \
+ (R)))
#endif
#define _mm256_cmul_round_pch(A, B, R) _mm256_fcmul_round_pch ((A), (B), (R))
--- gcc/config/i386/i386-builtin.def
@@ -3812,6 +3812,12 @@ BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx512vl_rndscalev8sf_mask_roun
BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx512vl_scalefv4df_mask_round, "__builtin_ia32_scalefpd256_mask_round", IX86_BUILTIN_VSCALEFPD256_MASK_ROUND, UNKNOWN, (int) V4DF_FTYPE_V4DF_V4DF_V4DF_UQI_INT)
BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx512vl_scalefv16hf_mask_round, "__builtin_ia32_scalefph256_mask_round", IX86_BUILTIN_VSCALEFPH256_MASK_ROUND, UNKNOWN, (int) V16HF_FTYPE_V16HF_V16HF_V16HF_UHI_INT)
BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx512vl_scalefv8sf_mask_round, "__builtin_ia32_scalefps256_mask_round", IX86_BUILTIN_VSCALEFPS256_MASK_ROUND, UNKNOWN, (int) V8SF_FTYPE_V8SF_V8SF_V8SF_UQI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx_sqrtv4df2_mask_round, "__builtin_ia32_sqrtpd256_mask_round", IX86_BUILTIN_VSQRTPD256_MASK_ROUND, UNKNOWN, (int) V4DF_FTYPE_V4DF_V4DF_UQI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx512fp16_sqrtv16hf2_mask_round, "__builtin_ia32_sqrtph256_mask_round", IX86_BUILTIN_VSQRTPH256_MASK_ROUND, UNKNOWN, (int) V16HF_FTYPE_V16HF_V16HF_UHI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_avx_sqrtv8sf2_mask_round, "__builtin_ia32_sqrtps256_mask_round", IX86_BUILTIN_VSQRTPS256_MASK_ROUND, UNKNOWN, (int) V8SF_FTYPE_V8SF_V8SF_UQI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_subv4df3_mask_round, "__builtin_ia32_subpd256_mask_round", IX86_BUILTIN_VSUBPD256_MASK_ROUND, UNKNOWN, (int) V4DF_FTYPE_V4DF_V4DF_V4DF_UQI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_subv16hf3_mask_round, "__builtin_ia32_subph256_mask_round", IX86_BUILTIN_VSUBPH256_MASK_ROUND, UNKNOWN, (int) V16HF_FTYPE_V16HF_V16HF_V16HF_UHI_INT)
+BDESC (0, OPTION_MASK_ISA2_AVX10_2_256, CODE_FOR_subv8sf3_mask_round, "__builtin_ia32_subps256_mask_round", IX86_BUILTIN_VSUBPS256_MASK_ROUND, UNKNOWN, (int) V8SF_FTYPE_V8SF_V8SF_V8SF_UQI_INT)
BDESC (0, OPTION_MASK_ISA2_AVX10_2_512, CODE_FOR_avx10_2_cvt2ps2phx_v32hf_mask_round, "__builtin_ia32_vcvt2ps2phx512_mask_round", IX86_BUILTIN_VCVT2PS2PHX_V32HF_MASK_ROUND, UNKNOWN, (int) V32HF_FTYPE_V16SF_V16SF_V32HF_USI_INT)
BDESC (0, OPTION_MASK_ISA2_AVX10_2_512, CODE_FOR_avx10_2_cvtph2ibsv32hf_mask_round, "__builtin_ia32_cvtph2ibs512_mask_round", IX86_BUILTIN_CVTPH2IBS512_MASK_ROUND, UNKNOWN, (int) V32HI_FTYPE_V32HF_V32HI_USI_INT)
BDESC (0, OPTION_MASK_ISA2_AVX10_2_512, CODE_FOR_avx10_2_cvtph2iubsv32hf_mask_round, "__builtin_ia32_cvtph2iubs512_mask_round", IX86_BUILTIN_CVTPH2IUBS512_MASK_ROUND, UNKNOWN, (int) V32HI_FTYPE_V32HF_V32HI_USI_INT)
--- gcc/testsuite/gcc.target/i386/avx-1.c
@@ -995,6 +995,12 @@
#define __builtin_ia32_scalefpd256_mask_round(A, B, C, D, E) __builtin_ia32_scalefpd256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefph256_mask_round(A, B, C, D, E) __builtin_ia32_scalefph256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefps256_mask_round(A, B, C, D, E) __builtin_ia32_scalefps256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_sqrtpd256_mask_round(A, B, C, D) __builtin_ia32_sqrtpd256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtph256_mask_round(A, B, C, D) __builtin_ia32_sqrtph256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtps256_mask_round(A, B, C, D) __builtin_ia32_sqrtps256_mask_round(A, B, C, 8)
+#define __builtin_ia32_subpd256_mask_round(A, B, C, D, E) __builtin_ia32_subpd256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subph256_mask_round(A, B, C, D, E) __builtin_ia32_subph256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subps256_mask_round(A, B, C, D, E) __builtin_ia32_subps256_mask_round(A, B, C, D, 8)
/* avx10_2-512mediaintrin.h */
#define __builtin_ia32_mpsadbw512(A, B, C) __builtin_ia32_mpsadbw512 (A, B, 1)
--- gcc/testsuite/gcc.target/i386/avx10_2-rounding-3.c
@@ -183,6 +183,24 @@
/* { dg-final { scan-assembler-times "vscalefps\[ \\t\]+\[^\n\]*\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
/* { dg-final { scan-assembler-times "vscalefps\[ \\t\]+\[^\n\]*\{ru-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}(?:\n|\[ \\t\]+#)" 1 } } */
/* { dg-final { scan-assembler-times "vscalefps\[ \\t\]+\[^\n\]*\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}\{z\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtpd\[ \\t\]+\[^\n\]*\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtpd\[ \\t\]+\[^\n\]*\{rd-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtpd\[ \\t\]+\[^\n\]*\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}\{z\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtph\[ \\t\]+\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtph\[ \\t\]+\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\{%k\[0-9\]\}\[^\n\r]*(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtph\[ \\t\]+\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\{%k\[0-9\]\}\{z\}\[^\n\r]*(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtps\[ \\t\]+\[^\n\]*\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtps\[ \\t\]+\[^\n\]*\{ru-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsqrtps\[ \\t\]+\[^\n\]*\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}\{z\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubpd\[ \\t\]+\[^\n\]*\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubpd\[ \\t\]+\[^\n\]*\{rd-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubpd\[ \\t\]+\[^\n\]*\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}\{z\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubph\[ \\t\]+\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubph\[ \\t\]+\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\{%k\[0-9\]\}\[^\n\r]*(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubph\[ \\t\]+\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\[^\n\r]*%ymm\[0-9\]+\{%k\[0-9\]\}\{z\}\[^\n\r]*(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubps\[ \\t\]+\[^\n\]*\{rn-sae\}\[^\{\n\]*%ymm\[0-9\]+(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubps\[ \\t\]+\[^\n\]*\{ru-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}(?:\n|\[ \\t\]+#)" 1 } } */
+/* { dg-final { scan-assembler-times "vsubps\[ \\t\]+\[^\n\]*\{rz-sae\}\[^\{\n\]*%ymm\[0-9\]+\{%k\[1-7\]\}\{z\}(?:\n|\[ \\t\]+#)" 1 } } */
#include <immintrin.h>
@@ -549,3 +567,35 @@ avx10_2_test_24 (void)
x = _mm256_mask_scalef_round_ps (x, m8, x, x, _MM_FROUND_TO_POS_INF | _MM_FROUND_NO_EXC);
x = _mm256_maskz_scalef_round_ps (m8, x, x, _MM_FROUND_TO_ZERO | _MM_FROUND_NO_EXC);
}
+
+void extern
+avx10_2_test_25 (void)
+{
+ xd = _mm256_sqrt_round_pd (xd, _MM_FROUND_TO_NEAREST_INT | _MM_FROUND_NO_EXC);
+ xd = _mm256_mask_sqrt_round_pd (xd, m8, xd, _MM_FROUND_TO_NEG_INF | _MM_FROUND_NO_EXC);
+ xd = _mm256_maskz_sqrt_round_pd (m8, xd, _MM_FROUND_TO_ZERO | _MM_FROUND_NO_EXC);
+
+ xh = _mm256_sqrt_round_ph (xh, 4);
+ xh = _mm256_mask_sqrt_round_ph (xh, m16, xh, 8);
+ xh = _mm256_maskz_sqrt_round_ph (m16, xh, 11);
+
+ x = _mm256_sqrt_round_ps (x, _MM_FROUND_TO_NEAREST_INT | _MM_FROUND_NO_EXC);
+ x = _mm256_mask_sqrt_round_ps (x, m8, x, _MM_FROUND_TO_POS_INF | _MM_FROUND_NO_EXC);
+ x = _mm256_maskz_sqrt_round_ps (m8, x, _MM_FROUND_TO_ZERO | _MM_FROUND_NO_EXC);
+}
+
+void extern
+avx10_2_test_26 (void)
+{
+ xd = _mm256_sub_round_pd (xd, xd, _MM_FROUND_TO_NEAREST_INT | _MM_FROUND_NO_EXC);
+ xd = _mm256_mask_sub_round_pd (xd, m8, xd, xd, _MM_FROUND_TO_NEG_INF | _MM_FROUND_NO_EXC);
+ xd = _mm256_maskz_sub_round_pd (m8, xd, xd, _MM_FROUND_TO_ZERO | _MM_FROUND_NO_EXC);
+
+ xh = _mm256_sub_round_ph (xh, xh, 8);
+ xh = _mm256_mask_sub_round_ph (xh, m16, xh, xh, 8);
+ xh = _mm256_maskz_sub_round_ph (m16, xh, xh, 11);
+
+ x = _mm256_sub_round_ps (x, x, _MM_FROUND_TO_NEAREST_INT | _MM_FROUND_NO_EXC);
+ x = _mm256_mask_sub_round_ps (x, m8, x, x, _MM_FROUND_TO_POS_INF | _MM_FROUND_NO_EXC);
+ x = _mm256_maskz_sub_round_ps (m8, x, x, _MM_FROUND_TO_ZERO | _MM_FROUND_NO_EXC);
+}
--- gcc/testsuite/gcc.target/i386/sse-13.c
@@ -1002,6 +1002,13 @@
#define __builtin_ia32_scalefpd256_mask_round(A, B, C, D, E) __builtin_ia32_scalefpd256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefph256_mask_round(A, B, C, D, E) __builtin_ia32_scalefph256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefps256_mask_round(A, B, C, D, E) __builtin_ia32_scalefps256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_sqrtps256_mask_round(A, B, C, D) __builtin_ia32_sqrtps256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtpd256_mask_round(A, B, C, D) __builtin_ia32_sqrtpd256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtph256_mask_round(A, B, C, D) __builtin_ia32_sqrtph256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtps256_mask_round(A, B, C, D) __builtin_ia32_sqrtps256_mask_round(A, B, C, 8)
+#define __builtin_ia32_subpd256_mask_round(A, B, C, D, E) __builtin_ia32_subpd256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subph256_mask_round(A, B, C, D, E) __builtin_ia32_subph256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subps256_mask_round(A, B, C, D, E) __builtin_ia32_subps256_mask_round(A, B, C, D, 8)
/* avx10_2-512mediaintrin.h */
#define __builtin_ia32_mpsadbw512(A, B, C) __builtin_ia32_mpsadbw512 (A, B, 1)
--- gcc/testsuite/gcc.target/i386/sse-14.c
@@ -1069,6 +1069,9 @@ test_1 (_mm256_cvt_roundepi16_ph, __m256h, __m256i, 8)
test_1 (_mm256_getexp_round_pd, __m256d, __m256d, 8)
test_1 (_mm256_getexp_round_ph, __m256h, __m256h, 8)
test_1 (_mm256_getexp_round_ps, __m256, __m256, 8)
+test_1 (_mm256_sqrt_round_pd, __m256d, __m256d, 9)
+test_1 (_mm256_sqrt_round_ph, __m256h, __m256h, 9)
+test_1 (_mm256_sqrt_round_ps, __m256, __m256, 9)
test_1x (_mm256_reduce_round_ph, __m256h, __m256h, 123, 8)
test_1x (_mm256_reduce_round_ps, __m256, __m256, 123, 8)
test_1x (_mm256_reduce_round_pd, __m256d, __m256d, 123, 8)
@@ -1147,6 +1150,12 @@ test_2 (_mm256_mul_round_ps, __m256, __m256, __m256, 9)
test_2 (_mm256_scalef_round_pd, __m256d, __m256d, __m256d, 9)
test_2 (_mm256_scalef_round_ph, __m256h, __m256h, __m256h, 9)
test_2 (_mm256_scalef_round_ps, __m256, __m256, __m256, 9)
+test_2 (_mm256_maskz_sqrt_round_pd, __m256d, __mmask8, __m256d, 9)
+test_2 (_mm256_maskz_sqrt_round_ph, __m256h, __mmask16, __m256h, 9)
+test_2 (_mm256_maskz_sqrt_round_ps, __m256, __mmask8, __m256, 9)
+test_2 (_mm256_sub_round_pd, __m256d, __m256d, __m256d, 9)
+test_2 (_mm256_sub_round_ph, __m256h, __m256h, __m256h, 9)
+test_2 (_mm256_sub_round_ps, __m256, __m256, __m256, 9)
test_2x (_mm256_cmp_round_pd_mask, __mmask8, __m256d, __m256d, 1, 8)
test_2x (_mm256_cmp_round_ph_mask, __mmask16, __m256h, __m256h, 1, 8)
test_2x (_mm256_cmp_round_ps_mask, __mmask8, __m256, __m256, 1, 8)
@@ -1251,6 +1260,12 @@ test_3 (_mm256_maskz_mul_round_ps, __m256, __mmask8, __m256, __m256, 9)
test_3 (_mm256_maskz_scalef_round_pd, __m256d, __mmask8, __m256d, __m256d, 9)
test_3 (_mm256_maskz_scalef_round_ph, __m256h, __mmask16, __m256h, __m256h, 9)
test_3 (_mm256_maskz_scalef_round_ps, __m256, __mmask8, __m256, __m256, 9)
+test_3 (_mm256_mask_sqrt_round_pd, __m256d, __m256d, __mmask8, __m256d, 9)
+test_3 (_mm256_mask_sqrt_round_ph, __m256h, __m256h, __mmask16, __m256h, 9)
+test_3 (_mm256_mask_sqrt_round_ps, __m256, __m256, __mmask8, __m256, 9)
+test_3 (_mm256_maskz_sub_round_pd, __m256d, __mmask8, __m256d, __m256d, 9)
+test_3 (_mm256_maskz_sub_round_ph, __m256h, __mmask16, __m256h, __m256h, 9)
+test_3 (_mm256_maskz_sub_round_ps, __m256, __mmask8, __m256, __m256, 9)
test_3x (_mm256_mask_cmp_round_pd_mask, __mmask8, __mmask8, __m256d, __m256d, 1, 8)
test_3x (_mm256_mask_cmp_round_ph_mask, __mmask16, __mmask16, __m256h, __m256h, 1, 8)
test_3x (_mm256_mask_cmp_round_ps_mask, __mmask8, __mmask8, __m256, __m256, 1, 8)
@@ -1347,6 +1362,9 @@ test_4 (_mm256_mask_mul_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
test_4 (_mm256_mask_scalef_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256d, 9)
test_4 (_mm256_mask_scalef_round_ph, __m256h, __m256h, __mmask16, __m256h, __m256h, 9)
test_4 (_mm256_mask_scalef_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
+test_4 (_mm256_mask_sub_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256d, 9)
+test_4 (_mm256_mask_sub_round_ph, __m256h, __m256h, __mmask16, __m256h, __m256h, 9)
+test_4 (_mm256_mask_sub_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
test_4x (_mm256_maskz_fixupimm_round_pd, __m256d, __mmask8, __m256d, __m256d, __m256i, 3, 8)
test_4x (_mm256_maskz_fixupimm_round_ps, __m256, __mmask8, __m256, __m256, __m256i, 3, 8)
test_4x (_mm256_mask_fixupimm_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256i, 3, 8)
--- gcc/testsuite/gcc.target/i386/sse-22.c
@@ -1112,6 +1112,9 @@ test_1 (_mm256_cvt_roundepi16_ph, __m256h, __m256i, 8)
test_1 (_mm256_getexp_round_pd, __m256d, __m256d, 8)
test_1 (_mm256_getexp_round_ph, __m256h, __m256h, 8)
test_1 (_mm256_getexp_round_ps, __m256, __m256, 8)
+test_1 (_mm256_sqrt_round_pd, __m256d, __m256d, 9)
+test_1 (_mm256_sqrt_round_ph, __m256h, __m256h, 9)
+test_1 (_mm256_sqrt_round_ps, __m256, __m256, 9)
test_1x (_mm256_reduce_round_ph, __m256h, __m256h, 123, 8)
test_1x (_mm256_reduce_round_ps, __m256, __m256, 123, 8)
test_1x (_mm256_reduce_round_pd, __m256d, __m256d, 123, 8)
@@ -1190,6 +1193,12 @@ test_2 (_mm256_mul_round_ps, __m256, __m256, __m256, 9)
test_2 (_mm256_scalef_round_pd, __m256d, __m256d, __m256d, 9)
test_2 (_mm256_scalef_round_ph, __m256h, __m256h, __m256h, 9)
test_2 (_mm256_scalef_round_ps, __m256, __m256, __m256, 9)
+test_2 (_mm256_maskz_sqrt_round_pd, __m256d, __mmask8, __m256d, 9)
+test_2 (_mm256_maskz_sqrt_round_ph, __m256h, __mmask16, __m256h, 9)
+test_2 (_mm256_maskz_sqrt_round_ps, __m256, __mmask8, __m256, 9)
+test_2 (_mm256_sub_round_pd, __m256d, __m256d, __m256d, 9)
+test_2 (_mm256_sub_round_ph, __m256h, __m256h, __m256h, 9)
+test_2 (_mm256_sub_round_ps, __m256, __m256, __m256, 9)
test_2x (_mm256_cmp_round_pd_mask, __mmask8, __m256d, __m256d, 1, 8)
test_2x (_mm256_cmp_round_ph_mask, __mmask16, __m256h, __m256h, 1, 8)
test_2x (_mm256_cmp_round_ps_mask, __mmask8, __m256, __m256, 1, 8)
@@ -1293,6 +1302,9 @@ test_3 (_mm256_maskz_mul_round_ps, __m256, __mmask8, __m256, __m256, 9)
test_3 (_mm256_maskz_scalef_round_pd, __m256d, __mmask8, __m256d, __m256d, 9)
test_3 (_mm256_maskz_scalef_round_ph, __m256h, __mmask16, __m256h, __m256h, 9)
test_3 (_mm256_maskz_scalef_round_ps, __m256, __mmask8, __m256, __m256, 9)
+test_3 (_mm256_maskz_sub_round_pd, __m256d, __mmask8, __m256d, __m256d, 9)
+test_3 (_mm256_maskz_sub_round_ph, __m256h, __mmask16, __m256h, __m256h, 9)
+test_3 (_mm256_maskz_sub_round_ps, __m256, __mmask8, __m256, __m256, 9)
test_3x (_mm256_mask_cmp_round_pd_mask, __mmask8, __mmask8, __m256d, __m256d, 1, 8)
test_3x (_mm256_mask_cmp_round_ph_mask, __mmask16, __mmask16, __m256h, __m256h, 1, 8)
test_3x (_mm256_mask_cmp_round_ps_mask, __mmask8, __mmask8, __m256, __m256, 1, 8)
@@ -1389,6 +1401,9 @@ test_4 (_mm256_mask_mul_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
test_4 (_mm256_mask_scalef_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256d, 9)
test_4 (_mm256_mask_scalef_round_ph, __m256h, __m256h, __mmask16, __m256h, __m256h, 9)
test_4 (_mm256_mask_scalef_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
+test_4 (_mm256_mask_sub_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256d, 9)
+test_4 (_mm256_mask_sub_round_ph, __m256h, __m256h, __mmask16, __m256h, __m256h, 9)
+test_4 (_mm256_mask_sub_round_ps, __m256, __m256, __mmask8, __m256, __m256, 9)
test_4x (_mm256_maskz_fixupimm_round_pd, __m256d, __mmask8, __m256d, __m256d, __m256i, 3, 8)
test_4x (_mm256_maskz_fixupimm_round_ps, __m256, __mmask8, __m256, __m256, __m256i, 3, 8)
test_4x (_mm256_mask_fixupimm_round_pd, __m256d, __m256d, __mmask8, __m256d, __m256i, 3, 8)
--- gcc/testsuite/gcc.target/i386/sse-23.c
@@ -977,6 +977,12 @@
#define __builtin_ia32_scalefpd256_mask_round(A, B, C, D, E) __builtin_ia32_scalefpd256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefph256_mask_round(A, B, C, D, E) __builtin_ia32_scalefph256_mask_round(A, B, C, D, 8)
#define __builtin_ia32_scalefps256_mask_round(A, B, C, D, E) __builtin_ia32_scalefps256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_sqrtpd256_mask_round(A, B, C, D) __builtin_ia32_sqrtpd256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtph256_mask_round(A, B, C, D) __builtin_ia32_sqrtph256_mask_round(A, B, C, 8)
+#define __builtin_ia32_sqrtps256_mask_round(A, B, C, D) __builtin_ia32_sqrtps256_mask_round(A, B, C, 8)
+#define __builtin_ia32_subpd256_mask_round(A, B, C, D, E) __builtin_ia32_subpd256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subph256_mask_round(A, B, C, D, E) __builtin_ia32_subph256_mask_round(A, B, C, D, 8)
+#define __builtin_ia32_subps256_mask_round(A, B, C, D, E) __builtin_ia32_subps256_mask_round(A, B, C, D, 8)
/* avx10_2-512mediaintrin.h */
#define __builtin_ia32_mpsadbw512(A, B, C) __builtin_ia32_mpsadbw512 (A, B, 1)
|
gcc
|
gcc-mirror
|
C
|
C
| null | null |
Compiler
|
gcc-mirror_gcc
|
CONFIG_CHANGE
|
Obvious
|
163c3017bb356937d876cd9a05905c012f3b0af6
|
2023-10-28 14:11:41
|
allcontributors[bot]
|
docs: add jppf22 as a contributor for translation (#2751) * docs: update README.md [skip ci]
* docs: update .all-contributorsrc [skip ci]
---------
Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com>
| false
| 13
| 1
| 14
|
--- .all-contributorsrc
@@ -2807,15 +2807,6 @@
"contributions": [
"doc"
]
- },
- {
- "login": "jppf22",
- "name": "João Fernandes",
- "avatar_url": "https://avatars.githubusercontent.com/u/104360594?v=4",
- "profile": "https://github.com/jppf22",
- "contributions": [
- "translation"
- ]
}
],
"contributorsPerLine": 7,
--- README.md
@@ -10,7 +10,7 @@
[](https://sonarcloud.io/dashboard?id=iluwatar_java-design-patterns)
[](https://gitter.im/iluwatar/java-design-patterns?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
-[](#contributors-)
+[](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
<br/>
@@ -471,9 +471,6 @@ This project is licensed under the terms of the MIT license.
<td align="center" valign="top" width="14.28%"><a href="https://github.com/dlvideira"><img src="https://avatars.githubusercontent.com/u/53951425?v=4?s=100" width="100px;" alt="Daniel Lisboa"/><br /><sub><b>Daniel Lisboa</b></sub></a><br /><a href="#translation-dlvideira" title="Translation">🌍</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/ComplexOW"><img src="https://avatars.githubusercontent.com/u/105279107?v=4?s=100" width="100px;" alt="Sam Powell"/><br /><sub><b>Sam Powell</b></sub></a><br /><a href="https://github.com/iluwatar/java-design-patterns/commits?author=ComplexOW" title="Documentation">📖</a></td>
</tr>
- <tr>
- <td align="center" valign="top" width="14.28%"><a href="https://github.com/jppf22"><img src="https://avatars.githubusercontent.com/u/104360594?v=4?s=100" width="100px;" alt="João Fernandes"/><br /><sub><b>João Fernandes</b></sub></a><br /><a href="#translation-jppf22" title="Translation">🌍</a></td>
- </tr>
</tbody>
</table>
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
DOC_CHANGE
|
The prefix fix: suggests a bug fix, but the actual change is not fixing code behavior; it’s improving documentation rendering
|
7ec34e934ea36f9c4d4dfc31dbcfbfc63f57ccb6
|
2024-08-13 12:41:53
|
2dust
|
Bug fix
| false
| 2
| 4
| 6
|
--- V2rayNG/app/src/main/kotlin/com/v2ray/ang/util/MmkvManager.kt
@@ -138,7 +138,8 @@ object MmkvManager {
subscriptions.add(Pair(key, Gson().fromJson(json, SubscriptionItem::class.java)))
}
}
- return subscriptions.sortedBy { (_, value) -> value.addedTime }
+ subscriptions.sortedBy { (_, value) -> value.addedTime }
+ return subscriptions
}
fun removeSubscription(subid: String) {
@@ -154,7 +155,8 @@ object MmkvManager {
assetUrlItems.add(Pair(key, Gson().fromJson(json, AssetUrlItem::class.java)))
}
}
- return assetUrlItems.sortedBy { (_, value) -> value.addedTime }
+ assetUrlItems.sortedBy { (_, value) -> value.addedTime }
+ return assetUrlItems
}
fun removeAssetUrl(assetid: String) {
|
v2rayng
|
2dust
|
Kotlin
|
Kotlin
| 38,863
| 5,828
|
A V2Ray client for Android, support Xray core and v2fly core
|
2dust_v2rayng
|
BUG_FIX
|
Matched \bfix(e[ds]|ing)?\b in message
|
c35a384db41695f7e7cb939561d483830bc468c5
|
2024-11-13 20:14:27
|
Ben Pasquariello
|
Added R2024b AI Cheat sheet
| false
| 0
| 0
| 0
|
--- CheatSheets/R2024b-ai-cheat-sheet.pdf
Binary files a/CheatSheets/R2024b-ai-cheat-sheet.pdf and /dev/null differ
|
awesome-matlab-students
|
mathworks
|
MATLAB
|
MATLAB
| 393
| 42
|
An awesome list of helpful resources for students learning MATLAB & Simulink. List includes tips & tricks, tutorials, videos, cheat sheets, and opportunities to learn MATLAB & Simulink.
|
mathworks_awesome-matlab-students
|
CONFIG_CHANGE
|
pdf file added
|
c6b1a43c489d2c242f9c6c6d12966c95d27d9dfa
|
2022-10-26 16:14:17
|
Mustafa UZUN
|
[guide] rename `qux` to `quux` Fixes #2661
| false
| 2
| 2
| 4
|
--- README.md
@@ -2702,7 +2702,7 @@ Other Style Guides
// bad
if (baz) {
- console.log(quux);
+ console.log(qux);
} else {
console.log(foo);
@@ -2723,7 +2723,7 @@ Other Style Guides
// good
if (baz) {
- console.log(quux);
+ console.log(qux);
} else {
console.log(foo);
}
|
javascript
|
airbnb
|
JavaScript
|
JavaScript
| 146,197
| 26,671
|
JavaScript Style Guide
|
airbnb_javascript
|
DOC_CHANGE
|
changes in readme
|
5891f291d8a2373a86ddde8f4b8e7b2115b5ee46
|
2023-10-08 14:29:12
|
dependabot[bot]
|
build(deps): bump org.sonarsource.scanner.maven:sonar-maven-plugin (#2587) Bumps [org.sonarsource.scanner.maven:sonar-maven-plugin](https://github.com/SonarSource/sonar-scanner-maven) from 3.9.1.2184 to 3.10.0.2594.
- [Release notes](https://github.com/SonarSource/sonar-scanner-maven/releases)
- [Commits](https://github.com/SonarSource/sonar-scanner-maven/compare/3.9.1.2184...3.10.0.2594)
---
updated-dependencies:
- dependency-name: org.sonarsource.scanner.maven:sonar-maven-plugin
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
| false
| 1
| 1
| 2
|
--- pom.xml
@@ -36,7 +36,7 @@
<description>Java Design Patterns</description>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
- <sonar-maven-plugin.version>3.10.0.2594</sonar-maven-plugin.version>
+ <sonar-maven-plugin.version>3.9.1.2184</sonar-maven-plugin.version>
<spring-boot.version>2.7.5</spring-boot.version>
<jacoco.version>0.8.10</jacoco.version>
<commons-dbcp.version>1.4</commons-dbcp.version>
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
CONFIG_CHANGE
|
dependency updates
|
f0f8ed12f8709aaf243649d93ab157a7f78205a8
|
2025-02-26 08:09:49
|
Jordan Harband
|
[Dev Deps] update `semver`
| false
| 1
| 1
| 2
|
--- package.json
@@ -47,7 +47,7 @@
"eclint": "^2.8.1",
"markdown-link-check": "^3.13.6",
"replace": "^1.2.2",
- "semver": "^7.7.1",
+ "semver": "^7.6.3",
"urchin": "^0.0.5"
}
}
|
nvm
|
nvm-sh
|
Shell
|
Shell
| 82,623
| 8,249
|
Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions
|
nvm-sh_nvm
|
CONFIG_CHANGE
|
Only config file changes have been made.
|
dd42a64c431f6c9bf39ae3ac8783551e6b59a18a
|
2024-12-11 06:36:04
|
Bartek Iwańczuk
|
refactor(lint): manage schema files for linter in Deno repo (#27324) This commit provides schema files for lint rules and lint tags
in this repo instead of pulling them from `deno_lint` repository.
A unit test was added to ensure all available rules are listed
in the schema file. A unit test for tags can be done once
https://github.com/denoland/deno/pull/27162 lands.
| false
| 184
| 3
| 187
|
--- cli/schemas/config-file.v1.json
@@ -291,7 +291,7 @@
"type": "array",
"description": "List of tag names that will be run. Empty list disables all tags and will only use rules from `include`.",
"items": {
- "$ref": "lint-tags.v1.json"
+ "$ref": "https://raw.githubusercontent.com/denoland/deno_lint/main/schemas/tags.v1.json"
},
"minItems": 0,
"uniqueItems": true
@@ -300,7 +300,7 @@
"type": "array",
"description": "List of rule names that will be excluded from configured tag sets. If the same rule is in `include` it will be run.",
"items": {
- "$ref": "lint-rules.v1.json"
+ "$ref": "https://raw.githubusercontent.com/denoland/deno_lint/main/schemas/rules.v1.json"
},
"minItems": 0,
"uniqueItems": true
@@ -309,7 +309,7 @@
"type": "array",
"description": "List of rule names that will be run. Even if the same rule is in `exclude` it will be run.",
"items": {
- "$ref": "lint-rules.v1.json"
+ "$ref": "https://raw.githubusercontent.com/denoland/deno_lint/main/schemas/rules.v1.json"
},
"minItems": 0,
"uniqueItems": true
--- cli/schemas/lint-rules.v1.json
@@ -1,112 +0,0 @@
-{
- "$schema": "http://json-schema.org/draft-07/schema#",
- "enum": [
- "adjacent-overload-signatures",
- "ban-ts-comment",
- "ban-types",
- "ban-unknown-rule-code",
- "ban-untagged-ignore",
- "ban-untagged-todo",
- "ban-unused-ignore",
- "camelcase",
- "constructor-super",
- "default-param-last",
- "eqeqeq",
- "explicit-function-return-type",
- "explicit-module-boundary-types",
- "for-direction",
- "fresh-handler-export",
- "fresh-server-event-handlers",
- "getter-return",
- "guard-for-in",
- "no-array-constructor",
- "no-async-promise-executor",
- "no-await-in-loop",
- "no-await-in-sync-fn",
- "no-boolean-literal-for-arguments",
- "no-case-declarations",
- "no-class-assign",
- "no-compare-neg-zero",
- "no-cond-assign",
- "no-console",
- "no-const-assign",
- "no-constant-condition",
- "no-control-regex",
- "no-debugger",
- "no-delete-var",
- "no-deprecated-deno-api",
- "no-dupe-args",
- "no-dupe-class-members",
- "no-dupe-else-if",
- "no-dupe-keys",
- "no-duplicate-case",
- "no-empty",
- "no-empty-character-class",
- "no-empty-enum",
- "no-empty-interface",
- "no-empty-pattern",
- "no-eval",
- "no-ex-assign",
- "no-explicit-any",
- "no-external-import",
- "no-extra-boolean-cast",
- "no-extra-non-null-assertion",
- "no-fallthrough",
- "no-func-assign",
- "no-global-assign",
- "no-implicit-declare-namespace-export",
- "no-import-assertions",
- "no-import-assign",
- "no-inferrable-types",
- "no-inner-declarations",
- "no-invalid-regexp",
- "no-invalid-triple-slash-reference",
- "no-irregular-whitespace",
- "no-misused-new",
- "no-namespace",
- "no-new-symbol",
- "no-node-globals",
- "no-non-null-asserted-optional-chain",
- "no-non-null-assertion",
- "no-obj-calls",
- "no-octal",
- "no-process-globals",
- "no-prototype-builtins",
- "no-redeclare",
- "no-regex-spaces",
- "no-self-assign",
- "no-self-compare",
- "no-setter-return",
- "no-shadow-restricted-names",
- "no-sloppy-imports",
- "no-slow-types",
- "no-sparse-arrays",
- "no-sync-fn-in-async-fn",
- "no-this-alias",
- "no-this-before-super",
- "no-throw-literal",
- "no-top-level-await",
- "no-undef",
- "no-unreachable",
- "no-unsafe-finally",
- "no-unsafe-negation",
- "no-unused-labels",
- "no-unused-vars",
- "no-var",
- "no-window",
- "no-window-prefix",
- "no-with",
- "prefer-as-const",
- "prefer-ascii",
- "prefer-const",
- "prefer-namespace-keyword",
- "prefer-primordials",
- "require-await",
- "require-yield",
- "single-var-declarator",
- "triple-slash-reference",
- "use-isnan",
- "valid-typeof",
- "verbatim-module-syntax"
- ]
-}
--- cli/schemas/lint-tags.v1.json
@@ -1,4 +0,0 @@
-{
- "$schema": "http://json-schema.org/draft-07/schema#",
- "enum": ["fresh", "jsr", "jsx", "react", "recommended"]
-}
--- cli/tools/lint/mod.rs
@@ -556,68 +556,3 @@ struct LintError {
file_path: String,
message: String,
}
-
-#[cfg(test)]
-mod tests {
- use super::*;
- use pretty_assertions::assert_eq;
- use serde::Deserialize;
- use test_util as util;
-
- #[derive(Serialize, Deserialize)]
- struct RulesSchema {
- #[serde(rename = "$schema")]
- schema: String,
-
- #[serde(rename = "enum")]
- rules: Vec<String>,
- }
-
- fn get_all_rules() -> Vec<String> {
- let rule_provider = LintRuleProvider::new(None, None);
- let configured_rules =
- rule_provider.resolve_lint_rules(Default::default(), None);
- let mut all_rules = configured_rules
- .all_rule_codes
- .into_iter()
- .map(|s| s.to_string())
- .collect::<Vec<String>>();
- all_rules.sort();
-
- all_rules
- }
-
- // TODO(bartlomieju): do the same for tags, once https://github.com/denoland/deno/pull/27162 lands
- #[test]
- fn all_lint_rules_are_listed_in_schema_file() {
- let all_rules = get_all_rules();
-
- let rules_schema_path =
- util::root_path().join("cli/schemas/lint-rules.v1.json");
- let rules_schema_file =
- std::fs::read_to_string(&rules_schema_path).unwrap();
-
- let schema: RulesSchema = serde_json::from_str(&rules_schema_file).unwrap();
-
- const UPDATE_ENV_VAR_NAME: &str = "UPDATE_EXPECTED";
-
- if std::env::var(UPDATE_ENV_VAR_NAME).ok().is_none() {
- assert_eq!(
- schema.rules, all_rules,
- "Lint rules schema file not up to date. Run again with {}=1 to update the expected output",
- UPDATE_ENV_VAR_NAME
- );
- return;
- }
-
- std::fs::write(
- &rules_schema_path,
- serde_json::to_string_pretty(&RulesSchema {
- schema: schema.schema,
- rules: all_rules,
- })
- .unwrap(),
- )
- .unwrap();
- }
-}
|
deno
|
denoland
|
Rust
|
Rust
| 102,021
| 5,502
|
A modern runtime for JavaScript and TypeScript.
|
denoland_deno
|
CODE_IMPROVEMENT
|
refactor written in the commit msg
|
a2b3bb17504d9957394afb861c2dc405597a182c
|
2023-05-09 12:56:17
|
macro
|
Update MemberProductCollectionRepository.java
| false
| 1
| 1
| 2
|
--- mall-portal/src/main/java/com/macro/mall/portal/repository/MemberProductCollectionRepository.java
@@ -26,7 +26,7 @@ public interface MemberProductCollectionRepository extends MongoRepository<Membe
Page<MemberProductCollection> findByMemberId(Long memberId, Pageable pageable);
/**
- * 根据会员ID删除记录
+ * 根据会员ID删除录
*/
void deleteAllByMemberId(Long memberId);
}
|
mall
|
macrozheng
|
Java
|
Java
| 79,319
| 29,052
|
The mall project is a complete e-commerce system, consisting of a storefront shopping system and a back-office management system, built on Spring Boot + MyBatis and deployed with Docker containers. The storefront system includes modules such as the home portal, product recommendations, product search, product display, shopping cart, order workflow, member center, customer service, and help center. The back-office management system includes modules such as product management, order management, member management, promotion management, operations management, content management, statistical reports, financial management, permission management, and settings.
|
macrozheng_mall
|
CONFIG_CHANGE
|
Very small changes
|
7ff243f9bbe582e9a785838854dae7187a47d4e7
|
2023-07-30 10:28:29
|
longpanda
|
Fix the media missing issue when boot UOS server ISO file.
| false
| 8
| 0
| 8
|
--- IMG/cpio/ventoy/ventoy_chain.sh
@@ -376,14 +376,6 @@ ventoy_get_os_type() {
if $GREP -q 'chimera' /proc/version; then
echo 'chimera'; return
fi
-
-
- if $GREP -q '4.19.' /proc/version; then
- if [ -d /lib/dracut/hooks ]; then
- echo 'openEuler'; return
- fi
- fi
-
echo "default"
}
|
ventoy
|
ventoy
|
C
|
C
| 65,265
| 4,197
|
A new bootable USB solution.
|
ventoy_ventoy
|
BUG_FIX
|
correcting display behavior under Wayland
|
b9e46ba2789aa6b4fcd0084e1806a8fc45cd12fe
|
2024-05-27 03:03:03
|
github-actions[bot]
|
chore(main): release 2.0.0 (#341) Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
| false
| 13
| 0
| 13
|
--- CHANGELOG.md
@@ -1,18 +1,5 @@
# Changelog
-## [2.0.0](https://github.com/ellite/Wallos/compare/v1.29.1...v2.0.0) (2024-05-26)
-
-
-### ⚠ BREAKING CHANGES
-
-* allow registration of multiple users ([#340](https://github.com/ellite/Wallos/issues/340))
-
-### Features
-
-* add reset password functionality ([e1006e5](https://github.com/ellite/Wallos/commit/e1006e582388a7fab204f25c100347607b863e4e))
-* administration area ([e1006e5](https://github.com/ellite/Wallos/commit/e1006e582388a7fab204f25c100347607b863e4e))
-* allow registration of multiple users ([#340](https://github.com/ellite/Wallos/issues/340)) ([e1006e5](https://github.com/ellite/Wallos/commit/e1006e582388a7fab204f25c100347607b863e4e))
-
## [1.29.1](https://github.com/ellite/Wallos/compare/v1.29.0...v1.29.1) (2024-05-20)
|
wallos
|
ellite
|
PHP
|
PHP
| 4,155
| 178
|
Wallos: Open-Source Personal Subscription Tracker
|
ellite_wallos
|
DOC_CHANGE
|
The prefix fix: suggests a bug fix, but the actual change is not fixing code behavior; it’s improving documentation rendering
|
603fec3467a98bfe04c25a4f286323df85ca0020
|
2024-11-12 03:51:44
|
dependabot[bot]
|
build(deps): bump the production-dependencies group in /autogpt_platform/backend with 2 updates (#8610) Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <[email protected]>
| false
| 27
| 28
| 55
|
--- autogpt_platform/backend/poetry.lock
@@ -1079,17 +1079,17 @@ requests = ">=2.20.0,<3.0"
[[package]]
name = "gotrue"
-version = "2.10.0"
+version = "2.9.0"
description = "Python Client Library for Supabase Auth"
optional = false
-python-versions = "<4.0,>=3.9"
+python-versions = "<4.0,>=3.8"
files = [
- {file = "gotrue-2.10.0-py3-none-any.whl", hash = "sha256:768e58207488e5184ffbdc4351b7280d913daf97962f4e9f2cca05c80004b042"},
- {file = "gotrue-2.10.0.tar.gz", hash = "sha256:4edf4c251da3535f2b044e23deba221e848ca1210c17d0c7a9b19f79a1e3f3c0"},
+ {file = "gotrue-2.9.0-py3-none-any.whl", hash = "sha256:9a6448479329771752cb93be65bc95f06f17d9262e814a95d03b218cf5dce87a"},
+ {file = "gotrue-2.9.0.tar.gz", hash = "sha256:c50e75bd01b82a388eed6a921a1c373a7157fd405df2221a8532193a39df4159"},
]
[package.dependencies]
-httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
+httpx = {version = ">=0.24,<0.28", extras = ["http2"]}
pydantic = ">=1.10,<3"
[[package]]
@@ -1796,13 +1796,13 @@ httpx = ">=0.27.0,<0.28.0"
[[package]]
name = "openai"
-version = "1.54.3"
+version = "1.54.1"
description = "The official Python library for the openai API"
optional = false
python-versions = ">=3.8"
files = [
- {file = "openai-1.54.3-py3-none-any.whl", hash = "sha256:f18dbaf09c50d70c4185b892a2a553f80681d1d866323a2da7f7be2f688615d5"},
- {file = "openai-1.54.3.tar.gz", hash = "sha256:7511b74eeb894ac0b0253dc71f087a15d2e4d71d22d0088767205143d880cca6"},
+ {file = "openai-1.54.1-py3-none-any.whl", hash = "sha256:3cb49ccb6bfdc724ad01cc397d323ef8314fc7d45e19e9de2afdd6484a533324"},
+ {file = "openai-1.54.1.tar.gz", hash = "sha256:5b832bf82002ba8c4f6e5e25c1c0f5d468c22f043711544c716eaffdb30dd6f1"},
]
[package.dependencies]
@@ -1984,13 +1984,13 @@ poetry-plugin = ["poetry (>=1.0,<2.0)"]
[[package]]
name = "postgrest"
-version = "0.18.0"
+version = "0.17.2"
description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
optional = false
python-versions = "<4.0,>=3.9"
files = [
- {file = "postgrest-0.18.0-py3-none-any.whl", hash = "sha256:200baad0d23fee986b3a0ffd3e07bfe0cdd40e09760f11e8e13a6c0c2376d5fa"},
- {file = "postgrest-0.18.0.tar.gz", hash = "sha256:29c1a94801a17eb9ad590189993fe5a7a6d8c1bfc11a3c9d0ce7ba146454ebb3"},
+ {file = "postgrest-0.17.2-py3-none-any.whl", hash = "sha256:f7c4f448e5a5e2d4c1dcf192edae9d1007c4261e9a6fb5116783a0046846ece2"},
+ {file = "postgrest-0.17.2.tar.gz", hash = "sha256:445cd4e4a191e279492549df0c4e827d32f9d01d0852599bb8a6efb0f07fcf78"},
]
[package.dependencies]
@@ -2933,18 +2933,19 @@ full = ["httpx (>=0.22.0)", "itsdangerous", "jinja2", "python-multipart (>=0.0.7
[[package]]
name = "storage3"
-version = "0.9.0"
+version = "0.8.2"
description = "Supabase Storage client for Python."
optional = false
python-versions = "<4.0,>=3.9"
files = [
- {file = "storage3-0.9.0-py3-none-any.whl", hash = "sha256:8b2fb91f0c61583a2f4eac74a8bae67e00d41ff38095c8a6cd3f2ce5e0ab76e7"},
- {file = "storage3-0.9.0.tar.gz", hash = "sha256:e16697f60894c94e1d9df0d2e4af783c1b3f7dd08c9013d61978825c624188c4"},
+ {file = "storage3-0.8.2-py3-none-any.whl", hash = "sha256:f2e995b18c77a2a9265d1a33047d43e4d6abb11eb3ca5067959f68281c305de3"},
+ {file = "storage3-0.8.2.tar.gz", hash = "sha256:db05d3fe8fb73bd30c814c4c4749664f37a5dfc78b629e8c058ef558c2b89f5a"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
python-dateutil = ">=2.8.2,<3.0.0"
+typing-extensions = ">=4.2.0,<5.0.0"
[[package]]
name = "strenum"
@@ -2964,32 +2965,32 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
[[package]]
name = "supabase"
-version = "2.10.0"
+version = "2.9.1"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
files = [
- {file = "supabase-2.10.0-py3-none-any.whl", hash = "sha256:183fb23c04528593f8f81c24ceb8178f3a56bff40fec7ed873b6c55ebc2e420a"},
- {file = "supabase-2.10.0.tar.gz", hash = "sha256:9ac095f8947bf60780e67c0edcbab53e2db3f6f3f022329397b093500bf2607c"},
+ {file = "supabase-2.9.1-py3-none-any.whl", hash = "sha256:a96f857a465712cb551679c1df66ba772c834f861756ce4aa2aa4cb703f6aeb7"},
+ {file = "supabase-2.9.1.tar.gz", hash = "sha256:51fce39c9eb50573126dabb342541ec5e1f13e7476938768f4b0ccfdb8c522cd"},
]
[package.dependencies]
-gotrue = ">=2.10.0,<3.0.0"
+gotrue = ">=2.9.0,<3.0.0"
httpx = ">=0.26,<0.28"
-postgrest = ">=0.18,<0.19"
+postgrest = ">=0.17.0,<0.18.0"
realtime = ">=2.0.0,<3.0.0"
-storage3 = ">=0.9.0,<0.10.0"
-supafunc = ">=0.7.0,<0.8.0"
+storage3 = ">=0.8.0,<0.9.0"
+supafunc = ">=0.6.0,<0.7.0"
[[package]]
name = "supafunc"
-version = "0.7.0"
+version = "0.6.2"
description = "Library for Supabase Functions"
optional = false
python-versions = "<4.0,>=3.9"
files = [
- {file = "supafunc-0.7.0-py3-none-any.whl", hash = "sha256:4160260dc02bdd906be1e2ffd7cb3ae8b74ae437c892bb475352b6a99d9ff8eb"},
- {file = "supafunc-0.7.0.tar.gz", hash = "sha256:5b1c415fba1395740b2b4eedd1d786384bd58b98f6333a11ba7889820a48b6a7"},
+ {file = "supafunc-0.6.2-py3-none-any.whl", hash = "sha256:101b30616b0a1ce8cf938eca1df362fa4cf1deacb0271f53ebbd674190fb0da5"},
+ {file = "supafunc-0.6.2.tar.gz", hash = "sha256:c7dfa20db7182f7fe4ae436e94e05c06cd7ed98d697fed75d68c7b9792822adc"},
]
[package.dependencies]
@@ -3676,4 +3677,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "e0e67fde376688bee3ef5e31712eb42a8a3f9a361dd21d0890db61492df6f4c3"
+content-hash = "b761f200e8ad7560321fca7bbefbe79377740952ace5dcbecf0371bb8aa16df1"
--- autogpt_platform/backend/pyproject.toml
@@ -26,7 +26,7 @@ jinja2 = "^3.1.4"
jsonref = "^1.1.0"
jsonschema = "^4.22.0"
ollama = "^0.3.0"
-openai = "^1.54.3"
+openai = "^1.54.1"
praw = "~7.8.1"
prisma = "^0.15.0"
psutil = "^6.1.0"
@@ -38,7 +38,7 @@ pytest-asyncio = "^0.24.0"
python-dotenv = "^1.0.1"
redis = "^5.2.0"
sentry-sdk = "2.18.0"
-supabase = "^2.10.0"
+supabase = "^2.7.2"
tenacity = "^9.0.0"
uvicorn = { extras = ["standard"], version = "^0.32.0" }
websockets = "^13.1"
|
autogpt
|
significant-gravitas
|
Python
|
Python
| 172,255
| 45,197
|
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
|
significant-gravitas_autogpt
|
CONFIG_CHANGE
|
version updates are done
|
8007a54e02cd47425901560772b7fb46dacefa46
|
2023-05-18 18:25:25
|
Evgenii Matsiuk
|
small fixes
| false
| 1
| 1
| 2
|
--- README.md
@@ -1901,7 +1901,7 @@ Most of these are paid services, some have free tiers.
- [UITestHelper](https://github.com/evermeer/UITestHelper) - UITest helper library for creating readable and maintainable tests.
- [ViewInspector](https://github.com/nalexn/ViewInspector) - Runtime inspection and unit testing of SwiftUI views
- [AutoMate](https://github.com/PGSSoft/AutoMate) - XCTest extensions for writing UI automation tests.
-- [Marathon Runner](https://github.com/MarathonLabs/marathon) - Fast, platform-independent test runner focused on performance and stability execute tests.
+- [Marathon runner](https://github.com/MarathonLabs/marathon) - Fast, platform-independent test runner focused on performance and stability execute tests.
- [Marathon Cloud](https://marathonlabs.io) - Cloud version of Marathon runner that promises to execute any number of tests in 15 minutes.
### Other Testing
|
awesome-ios
|
vsouza
|
Swift
|
Swift
| 48,363
| 6,877
|
A curated list of awesome iOS ecosystem, including Objective-C and Swift Projects
|
vsouza_awesome-ios
|
DOC_CHANGE
|
changes in readme
|
23fbd750dec59df54cdded1fd542f090f1ca9be8
| null |
Angelos Chalaris
|
Resolves Codacy quality test issue
| false
| 1
| 1
| 0
|
--- noneBy.test.js
@@ -6,7 +6,7 @@ test('Testing noneBy', (t) => {
//Please go to https://github.com/substack/tape
t.true(typeof noneBy === 'function', 'noneBy is a Function');
t.true(noneBy([4,1,0,3], x => x < 0), 'Returns true with a predicate function');
- t.false(noneBy([0,1,2], x => x == 1), 'Returns false with predicate function');
+ t.false(noneBy([0,1,2], x => x === 1), 'Returns false with predicate function');
//t.deepEqual(noneBy(args..), 'Expected');
//t.equal(noneBy(args..), 'Expected');
//t.false(noneBy(args..), 'Expected');
|
Chalarangelo_30-seconds-of-code.json
| null | null | null | null | null | null |
Chalarangelo_30-seconds-of-code.json
|
BUG_FIX
|
4, Commit message clearly depicts it resolves an existing issue
|
3884b7f8c375920b7920fffc2a36e3f34eb97414
|
2022-04-27 08:50:58
|
Jesse Wilson
|
Prepare next development version.
| false
| 1
| 1
| 2
|
--- build.gradle.kts
@@ -32,7 +32,7 @@ buildscript {
allprojects {
group = "com.squareup.okhttp3"
- version = "5.0.0-SNAPSHOT"
+ version = "5.0.0-alpha.7"
repositories {
mavenCentral()
|
okhttp
|
square
|
Kotlin
|
Kotlin
| 46,179
| 9,194
|
Square’s meticulous HTTP client for the JVM, Android, and GraalVM.
|
square_okhttp
|
BUG_FIX
|
Comment: this commit fixes/polishes an earlier feature
|
4e2504c4bc6821bd96ba23ad6534b3a64ecd096e
|
2024-02-06 08:35:56
|
Suoqin Jin
|
Re-order `object@DBinteraction$annotation`
| false
| 4
| 4
| 8
|
--- R/utilities.R
@@ -245,17 +245,17 @@ subsetData <- function(object, features = NULL) {
interaction_input <- object@DB$interaction
if (object@options$datatype != "RNA") {
if ("annotation" %in% colnames(interaction_input) == FALSE) {
- warning("A column named `annotation` is required in `object@DB$interaction` when running CellChat on spatial transcriptomics! The `annotation` column is now automatically added and all L-R pairs are assigned as `Secreted Signaling`, which means that these L-R pairs are assumed to mediate diffusion-based cellular communication.")
- interaction_input$annotation <- "Secreted Signaling"
+ warning("The column named `annotation` is required in `object@DB$interaction` when running CellChat on spatial transcriptomics!")
+ warning("The `annotation` column is now automatically added and all L-R pairs are considered as `Secreted Signaling`, which means that these L-R pairs are assumed to mediate long-range communication (~250um).")
}
}
if ("annotation" %in% colnames(interaction_input) == TRUE) {
if (length(unique(interaction_input$annotation)) > 1) {
- interaction_input$annotation <- factor(interaction_input$annotation, levels = c("Secreted Signaling", "ECM-Receptor", "Non-protein Signaling", "Cell-Cell Contact"))
+ interaction_input$annotation <- factor(interaction_input$annotation, levels = c("Secreted Signaling", "ECM-Receptor", "Cell-Cell Contact", "Non-protein Signaling"))
interaction_input <- interaction_input[order(interaction_input$annotation), , drop = FALSE]
interaction_input$annotation <- as.character(interaction_input$annotation)
+ object@DB$interaction <- interaction_input
}
- object@DB$interaction <- interaction_input
}
if (is.null(features)) {
|
cellchat
|
jinworks
|
R
|
R
| 367
| 61
|
R toolkit for inference, visualization and analysis of cell-cell communication from single-cell and spatially resolved transcriptomics
|
jinworks_cellchat
|
CONFIG_CHANGE
|
Very small changes
|
0aa44520accbb53850e157df6c36447ff6768a1c
|
2024-12-16 11:13:41
|
Kingkor Roy Tirtho
|
website: redirect from /other-downloads/stable-downloads/ to /downloads
| false
| 5
| 0
| 5
|
--- website/src/routes/other-downloads/stable-downloads/+page.ts
@@ -1,5 +0,0 @@
-import { redirect } from "@sveltejs/kit";
-
-export function load(){
- redirect(301, "/downloads");
-}
\ No newline at end of file
|
spotube
|
krtirtho
|
Dart
|
Dart
| 35,895
| 1,491
|
🎧 Open source Spotify client that doesn't require Premium nor uses Electron! Available for both desktop & mobile!
|
krtirtho_spotube
|
CONFIG_CHANGE
|
Very small changes
|
d9ded0727a7a209bfcbf9bd81c5c75183cfd026f
|
2024-07-23 14:10:10
|
pre-commit-ci[bot]
|
[pre-commit.ci] pre-commit autoupdate (#11495) * [pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.5.2 → v0.5.4](https://github.com/astral-sh/ruff-pre-commit/compare/v0.5.2...v0.5.4)
- [github.com/pre-commit/mirrors-mypy: v1.10.1 → v1.11.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.10.1...v1.11.0)
* ruff rule PLR1714 Consider merging multiple comparisons
* ruff rule RUF005 Consider `[*self.urls, "", "#"]` instead of concatenation
* Update emails_from_url.py
* Update emails_from_url.py
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Christian Clauss <[email protected]>
| false
| 3
| 8
| 11
|
--- .pre-commit-config.yaml
@@ -16,7 +16,7 @@ repos:
- id: auto-walrus
- repo: https://github.com/astral-sh/ruff-pre-commit
- rev: v0.5.4
+ rev: v0.5.2
hooks:
- id: ruff
- id: ruff-format
@@ -47,7 +47,7 @@ repos:
- id: validate-pyproject
- repo: https://github.com/pre-commit/mirrors-mypy
- rev: v1.11.0
+ rev: v1.10.1
hooks:
- id: mypy
args:
--- web_programming/emails_from_url.py
@@ -31,7 +31,12 @@ class Parser(HTMLParser):
# Check the list of defined attributes.
for name, value in attrs:
# If href is defined, not empty nor # print it and not already in urls.
- if name == "href" and value not in (*self.urls, "", "#"):
+ if (
+ name == "href"
+ and value != "#"
+ and value != ""
+ and value not in self.urls
+ ):
url = parse.urljoin(self.domain, value)
self.urls.append(url)
|
python
|
thealgorithms
|
Python
|
Python
| 197,891
| 46,346
|
All Algorithms implemented in Python
|
thealgorithms_python
|
CODE_IMPROVEMENT
|
Code change: type annotation added
|
64c88c1ba31b83d678056e6bac561e42ed753a10
| null |
afc163
|
fix disabled demo
| false
| 1
| 1
| 0
|
--- disable.md
@@ -16,7 +16,7 @@ function toggleDisabled() {
var App = React.createClass({
getInitialState() {
return {
- disabled: false
+ disabled: true
};
},
toggleDisabled() {
|
ant-design_ant-design.json
| null | null | null | null | null | null |
ant-design_ant-design.json
|
BUG_FIX
|
5, fix written in commit msg
|
daef162ce0f0321a0a3de2ac52aae62bf0d13669
|
2024-07-03 13:35:13
|
dependabot[bot]
|
chore(deps-dev): bump the lint group in /libraries/javascript with 4 updates (#166) Bumps the lint group in /libraries/javascript with 4 updates:
[@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin),
[@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser),
[@typescript-eslint/typescript-estree](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/typescript-estree)
and [eslint](https://github.com/eslint/eslint).
Updates `@typescript-eslint/eslint-plugin` from 7.14.1 to 7.15.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/releases"><code>@typescript-eslint/eslint-plugin</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v7.15.0</h2>
<h2>7.15.0 (2024-07-01)</h2>
<h3>🚀 Features</h3>
<ul>
<li><strong>eslint-plugin:</strong> [array-type] detect
<code>Readonly<string[]></code> case (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/8752">#8752</a>)</li>
<li><strong>eslint-plugin:</strong> back-port new rules around empty
object types from v8 (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9443">#9443</a>)</li>
</ul>
<h3>🩹 Fixes</h3>
<ul>
<li>disable <code>EXPERIMENTAL_useProjectService</code> in
<code>disabled-type-checked</code> shared config (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9459">#9459</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-return] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9254">#9254</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-call] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9304">#9304</a>)</li>
<li><strong>utils:</strong> clean outdated <code>RuleTester</code>
export (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9322">#9322</a>)</li>
</ul>
<h3>❤️ Thank You</h3>
<ul>
<li>auvred <a
href="https://github.com/auvred"><code>@auvred</code></a></li>
<li>Kim Sang Du <a
href="https://github.com/developer-bandi"><code>@developer-bandi</code></a></li>
<li>rgehbt <a
href="https://github.com/Gehbt"><code>@Gehbt</code></a></li>
<li>Vinccool96</li>
</ul>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md"><code>@typescript-eslint/eslint-plugin</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>7.15.0 (2024-07-01)</h2>
<h3>🚀 Features</h3>
<ul>
<li>
<p><strong>eslint-plugin:</strong> [array-type] detect
<code>Readonly<string[]></code> case</p>
</li>
<li>
<p><strong>eslint-plugin:</strong> back-port new rules around empty
object types from v8</p>
</li>
</ul>
<h3>🩹 Fixes</h3>
<ul>
<li>
<p>disable <code>EXPERIMENTAL_useProjectService</code> in
<code>disabled-type-checked</code> shared config</p>
</li>
<li>
<p><strong>eslint-plugin:</strong> [no-unsafe-return] differentiate a
types-error any from a true any</p>
</li>
<li>
<p><strong>eslint-plugin:</strong> [no-unsafe-call] differentiate a
types-error any from a true any</p>
</li>
</ul>
<h3>❤️ Thank You</h3>
<ul>
<li>auvred</li>
<li>Kim Sang Du</li>
<li>rgehbt</li>
<li>Vinccool96</li>
</ul>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/2865d31ec6048e42a4a8f05bc800420ade544faf"><code>2865d31</code></a>
chore(release): publish 7.15.0</li>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/2671de5ab49d520888ed17d14958be31d0a8464a"><code>2671de5</code></a>
feat(eslint-plugin): back-port new rules around empty object types from
v8 (#...</li>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/05142c55539ab230f3aff41be4b75e353a492891"><code>05142c5</code></a>
fix(eslint-plugin): [no-unsafe-call] differentiate a types-error any
from a t...</li>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/a466e072bef223a49433c1c629cbcbba705284ef"><code>a466e07</code></a>
fix: disable <code>EXPERIMENTAL_useProjectService</code> in
<code>disabled-type-checked</code> shar...</li>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/3694d8fd784a871445ff8828fb6a7c6b570b95d0"><code>3694d8f</code></a>
fix(eslint-plugin): [no-unsafe-return] differentiate a types-error any
from a...</li>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/9dba02152ffdeb08c35c2d68418d1748159b9f3d"><code>9dba021</code></a>
feat(eslint-plugin): [array-type] detect
<code>Readonly\<string[]></code> case (<a
href="https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin/issues/8752">#8752</a>)</li>
<li>See full diff in <a
href="https://github.com/typescript-eslint/typescript-eslint/commits/v7.15.0/packages/eslint-plugin">compare
view</a></li>
</ul>
</details>
<br />
Updates `@typescript-eslint/parser` from 7.14.1 to 7.15.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/releases"><code>@typescript-eslint/parser</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v7.15.0</h2>
<h2>7.15.0 (2024-07-01)</h2>
<h3>🚀 Features</h3>
<ul>
<li><strong>eslint-plugin:</strong> [array-type] detect
<code>Readonly<string[]></code> case (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/8752">#8752</a>)</li>
<li><strong>eslint-plugin:</strong> back-port new rules around empty
object types from v8 (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9443">#9443</a>)</li>
</ul>
<h3>🩹 Fixes</h3>
<ul>
<li>disable <code>EXPERIMENTAL_useProjectService</code> in
<code>disabled-type-checked</code> shared config (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9459">#9459</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-return] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9254">#9254</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-call] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9304">#9304</a>)</li>
<li><strong>utils:</strong> clean outdated <code>RuleTester</code>
export (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9322">#9322</a>)</li>
</ul>
<h3>❤️ Thank You</h3>
<ul>
<li>auvred <a
href="https://github.com/auvred"><code>@auvred</code></a></li>
<li>Kim Sang Du <a
href="https://github.com/developer-bandi"><code>@developer-bandi</code></a></li>
<li>rgehbt <a
href="https://github.com/Gehbt"><code>@Gehbt</code></a></li>
<li>Vinccool96</li>
</ul>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md"><code>@typescript-eslint/parser</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>7.15.0 (2024-07-01)</h2>
<p>This was a version bump only for parser to align it with other
projects, there were no code changes.</p>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/2865d31ec6048e42a4a8f05bc800420ade544faf"><code>2865d31</code></a>
chore(release): publish 7.15.0</li>
<li>See full diff in <a
href="https://github.com/typescript-eslint/typescript-eslint/commits/v7.15.0/packages/parser">compare
view</a></li>
</ul>
</details>
<br />
Updates `@typescript-eslint/typescript-estree` from 7.14.1 to 7.15.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/releases"><code>@typescript-eslint/typescript-estree</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v7.15.0</h2>
<h2>7.15.0 (2024-07-01)</h2>
<h3>🚀 Features</h3>
<ul>
<li><strong>eslint-plugin:</strong> [array-type] detect
<code>Readonly<string[]></code> case (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/8752">#8752</a>)</li>
<li><strong>eslint-plugin:</strong> back-port new rules around empty
object types from v8 (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9443">#9443</a>)</li>
</ul>
<h3>🩹 Fixes</h3>
<ul>
<li>disable <code>EXPERIMENTAL_useProjectService</code> in
<code>disabled-type-checked</code> shared config (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9459">#9459</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-return] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9254">#9254</a>)</li>
<li><strong>eslint-plugin:</strong> [no-unsafe-call] differentiate a
types-error any from a true any (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9304">#9304</a>)</li>
<li><strong>utils:</strong> clean outdated <code>RuleTester</code>
export (<a
href="https://redirect.github.com/typescript-eslint/typescript-eslint/pull/9322">#9322</a>)</li>
</ul>
<h3>❤️ Thank You</h3>
<ul>
<li>auvred <a
href="https://github.com/auvred"><code>@auvred</code></a></li>
<li>Kim Sang Du <a
href="https://github.com/developer-bandi"><code>@developer-bandi</code></a></li>
<li>rgehbt <a
href="https://github.com/Gehbt"><code>@Gehbt</code></a></li>
<li>Vinccool96</li>
</ul>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/typescript-estree/CHANGELOG.md"><code>@typescript-eslint/typescript-estree</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>7.15.0 (2024-07-01)</h2>
<p>This was a version bump only for typescript-estree to align it with
other projects, there were no code changes.</p>
<p>You can read about our <a
href="https://main--typescript-eslint.netlify.app/users/versioning">versioning
strategy</a> and <a
href="https://main--typescript-eslint.netlify.app/users/releases">releases</a>
on our website.</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/typescript-eslint/typescript-eslint/commit/2865d31ec6048e42a4a8f05bc800420ade544faf"><code>2865d31</code></a>
chore(release): publish 7.15.0</li>
<li>See full diff in <a
href="https://github.com/typescript-eslint/typescript-eslint/commits/v7.15.0/packages/typescript-estree">compare
view</a></li>
</ul>
</details>
<br />
Updates `eslint` from 9.5.0 to 9.6.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/eslint/eslint/releases">eslint's
releases</a>.</em></p>
<blockquote>
<h2>v9.6.0</h2>
<h2>Features</h2>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/e2b16e2b72606162dce3d804bc80186b6c5ec0f9"><code>e2b16e2</code></a>
feat: Implement feature flags (<a
href="https://redirect.github.com/eslint/eslint/issues/18516">#18516</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/8824aa1469ffc572c5e5c1765d1b6da113dfba19"><code>8824aa1</code></a>
feat: add <code>ecmaVersion: 2025</code>, parsing duplicate named
capturing groups (<a
href="https://redirect.github.com/eslint/eslint/issues/18596">#18596</a>)
(Milos Djermanovic)</li>
</ul>
<h2>Bug Fixes</h2>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/1613e2e586423ec7871617aec4dce5c433f0e9f0"><code>1613e2e</code></a>
fix: Allow escaping characters in config patterns on Windows (<a
href="https://redirect.github.com/eslint/eslint/issues/18628">#18628</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/21d3766c3f4efd981d3cc294c2c82c8014815e6e"><code>21d3766</code></a>
fix: <code>no-unused-vars</code> include caught errors pattern in report
message (<a
href="https://redirect.github.com/eslint/eslint/issues/18609">#18609</a>)
(Kirk Waiblinger)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d7a7736937981befc5dfd68ce512f1a6ebf93e68"><code>d7a7736</code></a>
fix: improve <code>no-unused-vars</code> message on unused caught errors
(<a
href="https://redirect.github.com/eslint/eslint/issues/18608">#18608</a>)
(Kirk Waiblinger)</li>
<li><a
href="https://github.com/eslint/eslint/commit/f9e95d2d06c0a7017417a3de4929b14d1008c63c"><code>f9e95d2</code></a>
fix: correct locations of invalid <code>/* eslint */</code> comments (<a
href="https://redirect.github.com/eslint/eslint/issues/18593">#18593</a>)
(Milos Djermanovic)</li>
</ul>
<h2>Documentation</h2>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/13dbecdea749abf51951ce61662eec2621a4b9af"><code>13dbecd</code></a>
docs: Limit search to just docs (<a
href="https://redirect.github.com/eslint/eslint/issues/18627">#18627</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/375227f94da3c1c4ff6c61a29b272889fa48ca79"><code>375227f</code></a>
docs: Update getting-started.md - add pnpm to init eslint config (<a
href="https://redirect.github.com/eslint/eslint/issues/18599">#18599</a>)
(Kostiantyn Ochenash)</li>
<li><a
href="https://github.com/eslint/eslint/commit/44915bb95dfa21f946021d77b3b361e7e9b140e0"><code>44915bb</code></a>
docs: Update README (GitHub Actions Bot)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d50db7bcb4c19c0631ab80b120249ecf155824ce"><code>d50db7b</code></a>
docs: Update vscode-eslint info (<a
href="https://redirect.github.com/eslint/eslint/issues/18595">#18595</a>)
(Nicholas C. Zakas)</li>
</ul>
<h2>Chores</h2>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/b15ee302742e280e8cd019b49e7b50a4f3b88bc0"><code>b15ee30</code></a>
chore: upgrade <code>@eslint/js</code><a
href="https://github.com/9"><code>@9</code></a>.6.0 (<a
href="https://redirect.github.com/eslint/eslint/issues/18632">#18632</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d655503b1fc97acfb4e7c61b3d9b557733c189b7"><code>d655503</code></a>
chore: package.json update for <code>@eslint/js</code> release
(Jenkins)</li>
<li><a
href="https://github.com/eslint/eslint/commit/7c78ad9d9f896354d557f24e2d37710cf79a27bf"><code>7c78ad9</code></a>
refactor: Use language.visitorKeys and check for non-JS SourceCode (<a
href="https://redirect.github.com/eslint/eslint/issues/18625">#18625</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/69ff64e638c0a56628afbc271dda5c963724aca4"><code>69ff64e</code></a>
refactor: Return value of applyInlineConfig() (<a
href="https://redirect.github.com/eslint/eslint/issues/18623">#18623</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d2d06f7a70d9b96b125ecf2de8951bea549db4da"><code>d2d06f7</code></a>
refactor: use <code>/</code> separator when adjusting
<code>ignorePatterns</code> on Windows (<a
href="https://redirect.github.com/eslint/eslint/issues/18613">#18613</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/642197346bf02d277c2014144537aa21ab57dc59"><code>6421973</code></a>
refactor: fix disable directives for languages with 0-based lines (<a
href="https://redirect.github.com/eslint/eslint/issues/18605">#18605</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/0a135395aca72461eb8b4c6f0866290bcf59916e"><code>0a13539</code></a>
refactor: Allow optional methods for languages (<a
href="https://redirect.github.com/eslint/eslint/issues/18604">#18604</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/c7ddee0d089e4db7be3f1a09f1a5731dd90b81b1"><code>c7ddee0</code></a>
chore: make internal-rules not being a package (<a
href="https://redirect.github.com/eslint/eslint/issues/18601">#18601</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/3379164e8b0cee57caf7da34226982075ebef51a"><code>3379164</code></a>
chore: remove <code>.eslintrc.js</code> (<a
href="https://redirect.github.com/eslint/eslint/issues/18011">#18011</a>)
(唯然)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d0c3a322fbcc2f70cfcd9d5010efef721245c382"><code>d0c3a32</code></a>
chore: update knip (with webdriver-io plugin) (<a
href="https://redirect.github.com/eslint/eslint/issues/18594">#18594</a>)
(Lars Kappert)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/eslint/eslint/blob/main/CHANGELOG.md">eslint's
changelog</a>.</em></p>
<blockquote>
<p>v9.6.0 - June 28, 2024</p>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/b15ee302742e280e8cd019b49e7b50a4f3b88bc0"><code>b15ee30</code></a>
chore: upgrade <code>@eslint/js</code><a
href="https://github.com/9"><code>@9</code></a>.6.0 (<a
href="https://redirect.github.com/eslint/eslint/issues/18632">#18632</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d655503b1fc97acfb4e7c61b3d9b557733c189b7"><code>d655503</code></a>
chore: package.json update for <code>@eslint/js</code> release
(Jenkins)</li>
<li><a
href="https://github.com/eslint/eslint/commit/1613e2e586423ec7871617aec4dce5c433f0e9f0"><code>1613e2e</code></a>
fix: Allow escaping characters in config patterns on Windows (<a
href="https://redirect.github.com/eslint/eslint/issues/18628">#18628</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/13dbecdea749abf51951ce61662eec2621a4b9af"><code>13dbecd</code></a>
docs: Limit search to just docs (<a
href="https://redirect.github.com/eslint/eslint/issues/18627">#18627</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/7c78ad9d9f896354d557f24e2d37710cf79a27bf"><code>7c78ad9</code></a>
refactor: Use language.visitorKeys and check for non-JS SourceCode (<a
href="https://redirect.github.com/eslint/eslint/issues/18625">#18625</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/e2b16e2b72606162dce3d804bc80186b6c5ec0f9"><code>e2b16e2</code></a>
feat: Implement feature flags (<a
href="https://redirect.github.com/eslint/eslint/issues/18516">#18516</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/69ff64e638c0a56628afbc271dda5c963724aca4"><code>69ff64e</code></a>
refactor: Return value of applyInlineConfig() (<a
href="https://redirect.github.com/eslint/eslint/issues/18623">#18623</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/375227f94da3c1c4ff6c61a29b272889fa48ca79"><code>375227f</code></a>
docs: Update getting-started.md - add pnpm to init eslint config (<a
href="https://redirect.github.com/eslint/eslint/issues/18599">#18599</a>)
(Kostiantyn Ochenash)</li>
<li><a
href="https://github.com/eslint/eslint/commit/44915bb95dfa21f946021d77b3b361e7e9b140e0"><code>44915bb</code></a>
docs: Update README (GitHub Actions Bot)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d2d06f7a70d9b96b125ecf2de8951bea549db4da"><code>d2d06f7</code></a>
refactor: use <code>/</code> separator when adjusting
<code>ignorePatterns</code> on Windows (<a
href="https://redirect.github.com/eslint/eslint/issues/18613">#18613</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/21d3766c3f4efd981d3cc294c2c82c8014815e6e"><code>21d3766</code></a>
fix: <code>no-unused-vars</code> include caught errors pattern in report
message (<a
href="https://redirect.github.com/eslint/eslint/issues/18609">#18609</a>)
(Kirk Waiblinger)</li>
<li><a
href="https://github.com/eslint/eslint/commit/642197346bf02d277c2014144537aa21ab57dc59"><code>6421973</code></a>
refactor: fix disable directives for languages with 0-based lines (<a
href="https://redirect.github.com/eslint/eslint/issues/18605">#18605</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d7a7736937981befc5dfd68ce512f1a6ebf93e68"><code>d7a7736</code></a>
fix: improve <code>no-unused-vars</code> message on unused caught errors
(<a
href="https://redirect.github.com/eslint/eslint/issues/18608">#18608</a>)
(Kirk Waiblinger)</li>
<li><a
href="https://github.com/eslint/eslint/commit/0a135395aca72461eb8b4c6f0866290bcf59916e"><code>0a13539</code></a>
refactor: Allow optional methods for languages (<a
href="https://redirect.github.com/eslint/eslint/issues/18604">#18604</a>)
(Nicholas C. Zakas)</li>
<li><a
href="https://github.com/eslint/eslint/commit/f9e95d2d06c0a7017417a3de4929b14d1008c63c"><code>f9e95d2</code></a>
fix: correct locations of invalid <code>/* eslint */</code> comments (<a
href="https://redirect.github.com/eslint/eslint/issues/18593">#18593</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/8824aa1469ffc572c5e5c1765d1b6da113dfba19"><code>8824aa1</code></a>
feat: add <code>ecmaVersion: 2025</code>, parsing duplicate named
capturing groups (<a
href="https://redirect.github.com/eslint/eslint/issues/18596">#18596</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/c7ddee0d089e4db7be3f1a09f1a5731dd90b81b1"><code>c7ddee0</code></a>
chore: make internal-rules not being a package (<a
href="https://redirect.github.com/eslint/eslint/issues/18601">#18601</a>)
(Milos Djermanovic)</li>
<li><a
href="https://github.com/eslint/eslint/commit/3379164e8b0cee57caf7da34226982075ebef51a"><code>3379164</code></a>
chore: remove <code>.eslintrc.js</code> (<a
href="https://redirect.github.com/eslint/eslint/issues/18011">#18011</a>)
(唯然)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d0c3a322fbcc2f70cfcd9d5010efef721245c382"><code>d0c3a32</code></a>
chore: update knip (with webdriver-io plugin) (<a
href="https://redirect.github.com/eslint/eslint/issues/18594">#18594</a>)
(Lars Kappert)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d50db7bcb4c19c0631ab80b120249ecf155824ce"><code>d50db7b</code></a>
docs: Update vscode-eslint info (<a
href="https://redirect.github.com/eslint/eslint/issues/18595">#18595</a>)
(Nicholas C. Zakas)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/eslint/eslint/commit/473d1bb7c3dfcf629ac048ca811f4b5eef04a692"><code>473d1bb</code></a>
9.6.0</li>
<li><a
href="https://github.com/eslint/eslint/commit/f435566baf7b3eaddb7955cb1aff7648dd308a7e"><code>f435566</code></a>
Build: changelog update for 9.6.0</li>
<li><a
href="https://github.com/eslint/eslint/commit/b15ee302742e280e8cd019b49e7b50a4f3b88bc0"><code>b15ee30</code></a>
chore: upgrade <code>@eslint/js</code>@9.6.0 (<a
href="https://redirect.github.com/eslint/eslint/issues/18632">#18632</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/d655503b1fc97acfb4e7c61b3d9b557733c189b7"><code>d655503</code></a>
chore: package.json update for <code>@eslint/js</code> release</li>
<li><a
href="https://github.com/eslint/eslint/commit/1613e2e586423ec7871617aec4dce5c433f0e9f0"><code>1613e2e</code></a>
fix: Allow escaping characters in config patterns on Windows (<a
href="https://redirect.github.com/eslint/eslint/issues/18628">#18628</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/13dbecdea749abf51951ce61662eec2621a4b9af"><code>13dbecd</code></a>
docs: Limit search to just docs (<a
href="https://redirect.github.com/eslint/eslint/issues/18627">#18627</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/7c78ad9d9f896354d557f24e2d37710cf79a27bf"><code>7c78ad9</code></a>
refactor: Use language.visitorKeys and check for non-JS SourceCode (<a
href="https://redirect.github.com/eslint/eslint/issues/18625">#18625</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/e2b16e2b72606162dce3d804bc80186b6c5ec0f9"><code>e2b16e2</code></a>
feat: Implement feature flags (<a
href="https://redirect.github.com/eslint/eslint/issues/18516">#18516</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/69ff64e638c0a56628afbc271dda5c963724aca4"><code>69ff64e</code></a>
refactor: Return value of applyInlineConfig() (<a
href="https://redirect.github.com/eslint/eslint/issues/18623">#18623</a>)</li>
<li><a
href="https://github.com/eslint/eslint/commit/375227f94da3c1c4ff6c61a29b272889fa48ca79"><code>375227f</code></a>
docs: Update getting-started.md - add pnpm to init eslint config (<a
href="https://redirect.github.com/eslint/eslint/issues/18599">#18599</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/eslint/eslint/compare/v9.5.0...v9.6.0">compare
view</a></li>
</ul>
</details>
<br />
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
<details>
<summary>Dependabot commands and options</summary>
<br />
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions
</details>
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
| false
| 74
| 74
| 148
|
--- libraries/javascript/yarn.lock
@@ -432,14 +432,14 @@
resolved "https://registry.yarnpkg.com/@eslint-community/regexpp/-/regexpp-4.10.0.tgz#548f6de556857c8bb73bbee70c35dc82a2e74d63"
integrity sha512-Cu96Sd2By9mCNTx2iyKOmq10v22jUVQv0lQnlGNy16oE9589yE+QADPbrMGCkA51cKZSg3Pu/aTJVTGfL/qjUA==
-"@eslint/config-array@^0.17.0":
- version "0.17.0"
- resolved "https://registry.yarnpkg.com/@eslint/config-array/-/config-array-0.17.0.tgz#ff305e1ee618a00e6e5d0485454c8d92d94a860d"
- integrity sha512-A68TBu6/1mHHuc5YJL0U0VVeGNiklLAL6rRmhTCP2B5XjWLMnrX+HkO+IAXyHvks5cyyY1jjK5ITPQ1HGS2EVA==
+"@eslint/config-array@^0.16.0":
+ version "0.16.0"
+ resolved "https://registry.yarnpkg.com/@eslint/config-array/-/config-array-0.16.0.tgz#bb3364fc39ee84ec3a62abdc4b8d988d99dfd706"
+ integrity sha512-/jmuSd74i4Czf1XXn7wGRWZCuyaUZ330NH1Bek0Pplatt4Sy1S5haN21SCLLdbeKslQ+S0wEJ+++v5YibSi+Lg==
dependencies:
"@eslint/object-schema" "^2.1.4"
debug "^4.3.1"
- minimatch "^3.1.2"
+ minimatch "^3.0.5"
"@eslint/eslintrc@^3.1.0":
version "3.1.0"
@@ -456,10 +456,10 @@
minimatch "^3.1.2"
strip-json-comments "^3.1.1"
-"@eslint/[email protected]":
- version "9.6.0"
- resolved "https://registry.yarnpkg.com/@eslint/js/-/js-9.6.0.tgz#5b0cb058cc13d9c92d4e561d3538807fa5127c95"
- integrity sha512-D9B0/3vNg44ZeWbYMpBoXqNP4j6eQD5vNwIlGAuFRRzK/WtT/jvDQW3Bi9kkf3PMDMlM7Yi+73VLUsn5bJcl8A==
+"@eslint/[email protected]":
+ version "9.5.0"
+ resolved "https://registry.yarnpkg.com/@eslint/js/-/js-9.5.0.tgz#0e9c24a670b8a5c86bff97b40be13d8d8f238045"
+ integrity sha512-A7+AOT2ICkodvtsWnxZP4Xxk3NbZ3VMHd8oihydLRGrJgqqdEz1qSeEgXYyT/Cu8h1TWWsQRejIx48mtjZ5y1w==
"@eslint/object-schema@^2.1.4":
version "2.1.4"
@@ -864,61 +864,61 @@
"@types/yargs-parser" "*"
"@typescript-eslint/eslint-plugin@^7.1.0":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-7.15.0.tgz#8eaf396ac2992d2b8f874b68eb3fcd6b179cb7f3"
- integrity sha512-uiNHpyjZtFrLwLDpHnzaDlP3Tt6sGMqTCiqmxaN4n4RP0EfYZDODJyddiFDF44Hjwxr5xAcaYxVKm9QKQFJFLA==
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-7.14.1.tgz#90e2f76a5930d553ede124e1f541a39b4417465e"
+ integrity sha512-aAJd6bIf2vvQRjUG3ZkNXkmBpN+J7Wd0mfQiiVCJMu9Z5GcZZdcc0j8XwN/BM97Fl7e3SkTXODSk4VehUv7CGw==
dependencies:
"@eslint-community/regexpp" "^4.10.0"
- "@typescript-eslint/scope-manager" "7.15.0"
- "@typescript-eslint/type-utils" "7.15.0"
- "@typescript-eslint/utils" "7.15.0"
- "@typescript-eslint/visitor-keys" "7.15.0"
+ "@typescript-eslint/scope-manager" "7.14.1"
+ "@typescript-eslint/type-utils" "7.14.1"
+ "@typescript-eslint/utils" "7.14.1"
+ "@typescript-eslint/visitor-keys" "7.14.1"
graphemer "^1.4.0"
ignore "^5.3.1"
natural-compare "^1.4.0"
ts-api-utils "^1.3.0"
"@typescript-eslint/parser@^7.1.0":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-7.15.0.tgz#f4a536e5fc6a1c05c82c4d263a2bfad2da235c80"
- integrity sha512-k9fYuQNnypLFcqORNClRykkGOMOj+pV6V91R4GO/l1FDGwpqmSwoOQrOHo3cGaH63e+D3ZiCAOsuS/D2c99j/A==
- dependencies:
- "@typescript-eslint/scope-manager" "7.15.0"
- "@typescript-eslint/types" "7.15.0"
- "@typescript-eslint/typescript-estree" "7.15.0"
- "@typescript-eslint/visitor-keys" "7.15.0"
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-7.14.1.tgz#13d97f357aed3c5719f259a6cc3d1a1f065d3692"
+ integrity sha512-8lKUOebNLcR0D7RvlcloOacTOWzOqemWEWkKSVpMZVF/XVcwjPR+3MD08QzbW9TCGJ+DwIc6zUSGZ9vd8cO1IA==
+ dependencies:
+ "@typescript-eslint/scope-manager" "7.14.1"
+ "@typescript-eslint/types" "7.14.1"
+ "@typescript-eslint/typescript-estree" "7.14.1"
+ "@typescript-eslint/visitor-keys" "7.14.1"
debug "^4.3.4"
-"@typescript-eslint/[email protected]":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-7.15.0.tgz#201b34b0720be8b1447df17b963941bf044999b2"
- integrity sha512-Q/1yrF/XbxOTvttNVPihxh1b9fxamjEoz2Os/Pe38OHwxC24CyCqXxGTOdpb4lt6HYtqw9HetA/Rf6gDGaMPlw==
+"@typescript-eslint/[email protected]":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-7.14.1.tgz#63de7a577bc6fe8ee6e412a5b85499f654b93ee5"
+ integrity sha512-gPrFSsoYcsffYXTOZ+hT7fyJr95rdVe4kGVX1ps/dJ+DfmlnjFN/GcMxXcVkeHDKqsq6uAcVaQaIi3cFffmAbA==
dependencies:
- "@typescript-eslint/types" "7.15.0"
- "@typescript-eslint/visitor-keys" "7.15.0"
+ "@typescript-eslint/types" "7.14.1"
+ "@typescript-eslint/visitor-keys" "7.14.1"
-"@typescript-eslint/[email protected]":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-7.15.0.tgz#5b83c904c6de91802fb399305a50a56d10472c39"
- integrity sha512-SkgriaeV6PDvpA6253PDVep0qCqgbO1IOBiycjnXsszNTVQe5flN5wR5jiczoEoDEnAqYFSFFc9al9BSGVltkg==
+"@typescript-eslint/[email protected]":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-7.14.1.tgz#c183f2f28c4c8578eb80aebc4ac9ace400160af6"
+ integrity sha512-/MzmgNd3nnbDbOi3LfasXWWe292+iuo+umJ0bCCMCPc1jLO/z2BQmWUUUXvXLbrQey/JgzdF/OV+I5bzEGwJkQ==
dependencies:
- "@typescript-eslint/typescript-estree" "7.15.0"
- "@typescript-eslint/utils" "7.15.0"
+ "@typescript-eslint/typescript-estree" "7.14.1"
+ "@typescript-eslint/utils" "7.14.1"
debug "^4.3.4"
ts-api-utils "^1.3.0"
-"@typescript-eslint/[email protected]":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-7.15.0.tgz#fb894373a6e3882cbb37671ffddce44f934f62fc"
- integrity sha512-aV1+B1+ySXbQH0pLK0rx66I3IkiZNidYobyfn0WFsdGhSXw+P3YOqeTq5GED458SfB24tg+ux3S+9g118hjlTw==
+"@typescript-eslint/[email protected]":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-7.14.1.tgz#a43a540dbe5df7f2a11269683d777fc50b4350aa"
+ integrity sha512-mL7zNEOQybo5R3AavY+Am7KLv8BorIv7HCYS5rKoNZKQD9tsfGUpO4KdAn3sSUvTiS4PQkr2+K0KJbxj8H9NDg==
-"@typescript-eslint/[email protected]", "@typescript-eslint/typescript-estree@^7.0.0":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-7.15.0.tgz#e323bfa3966e1485b638ce751f219fc1f31eba37"
- integrity sha512-gjyB/rHAopL/XxfmYThQbXbzRMGhZzGw6KpcMbfe8Q3nNQKStpxnUKeXb0KiN/fFDR42Z43szs6rY7eHk0zdGQ==
+"@typescript-eslint/[email protected]", "@typescript-eslint/typescript-estree@^7.0.0":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-7.14.1.tgz#ba7c9bac8744487749d19569e254d057754a1575"
+ integrity sha512-k5d0VuxViE2ulIO6FbxxSZaxqDVUyMbXcidC8rHvii0I56XZPv8cq+EhMns+d/EVIL41sMXqRbK3D10Oza1bbA==
dependencies:
- "@typescript-eslint/types" "7.15.0"
- "@typescript-eslint/visitor-keys" "7.15.0"
+ "@typescript-eslint/types" "7.14.1"
+ "@typescript-eslint/visitor-keys" "7.14.1"
debug "^4.3.4"
globby "^11.1.0"
is-glob "^4.0.3"
@@ -926,22 +926,22 @@
semver "^7.6.0"
ts-api-utils "^1.3.0"
-"@typescript-eslint/[email protected]":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-7.15.0.tgz#9e6253c4599b6e7da2fb64ba3f549c73eb8c1960"
- integrity sha512-hfDMDqaqOqsUVGiEPSMLR/AjTSCsmJwjpKkYQRo1FNbmW4tBwBspYDwO9eh7sKSTwMQgBw9/T4DHudPaqshRWA==
+"@typescript-eslint/[email protected]":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-7.14.1.tgz#3307b8226f99103dca2133d0ebcae38419d82c9d"
+ integrity sha512-CMmVVELns3nak3cpJhZosDkm63n+DwBlDX8g0k4QUa9BMnF+lH2lr3d130M1Zt1xxmB3LLk3NV7KQCq86ZBBhQ==
dependencies:
"@eslint-community/eslint-utils" "^4.4.0"
- "@typescript-eslint/scope-manager" "7.15.0"
- "@typescript-eslint/types" "7.15.0"
- "@typescript-eslint/typescript-estree" "7.15.0"
+ "@typescript-eslint/scope-manager" "7.14.1"
+ "@typescript-eslint/types" "7.14.1"
+ "@typescript-eslint/typescript-estree" "7.14.1"
-"@typescript-eslint/[email protected]":
- version "7.15.0"
- resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-7.15.0.tgz#1da0726201a859343fe6a05742a7c1792fff5b66"
- integrity sha512-Hqgy/ETgpt2L5xueA/zHHIl4fJI2O4XUE9l4+OIfbJIRSnTJb/QscncdqqZzofQegIJugRIF57OJea1khw2SDw==
+"@typescript-eslint/[email protected]":
+ version "7.14.1"
+ resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-7.14.1.tgz#cc79b5ea154aea734b2a13b983670749f5742274"
+ integrity sha512-Crb+F75U1JAEtBeQGxSKwI60hZmmzaqA3z9sYsVm8X7W5cwLEm5bRe0/uXS6+MR/y8CVpKSR/ontIAIEPFcEkA==
dependencies:
- "@typescript-eslint/types" "7.15.0"
+ "@typescript-eslint/types" "7.14.1"
eslint-visitor-keys "^3.4.3"
acorn-jsx@^5.3.2:
@@ -949,10 +949,10 @@ acorn-jsx@^5.3.2:
resolved "https://registry.yarnpkg.com/acorn-jsx/-/acorn-jsx-5.3.2.tgz#7ed5bb55908b3b2f1bc55c6af1653bada7f07937"
integrity sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==
-acorn@^8.12.0:
- version "8.12.0"
- resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.12.0.tgz#1627bfa2e058148036133b8d9b51a700663c294c"
- integrity sha512-RTvkC4w+KNXrM39/lWCUaG0IbRkWdCv7W/IOW9oU6SawyxulvkQy5HQPVTKxEjczcUvapcrw3cFx/60VN/NRNw==
+acorn@^8.11.3:
+ version "8.11.3"
+ resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.11.3.tgz#71e0b14e13a4ec160724b38fb7b0f233b1b81d7a"
+ integrity sha512-Y9rRfJG5jcKOE0CLisYbojUjIrIEE7AGMzA/Sm4BslANhbS+cDMpgBdcPT91oJ7OuJ9hYJBx59RjbhxVnrF8Xg==
ajv@^6.12.4:
version "6.12.6"
@@ -1388,15 +1388,15 @@ eslint-visitor-keys@^4.0.0:
integrity sha512-OtIRv/2GyiF6o/d8K7MYKKbXrOUBIK6SfkIRM4Z0dY3w+LiQ0vy3F57m0Z71bjbyeiWFiHJ8brqnmE6H6/jEuw==
eslint@^9.0.0:
- version "9.6.0"
- resolved "https://registry.yarnpkg.com/eslint/-/eslint-9.6.0.tgz#9f54373afa15e1ba356656a8d96233182027fb49"
- integrity sha512-ElQkdLMEEqQNM9Njff+2Y4q2afHk7JpkPvrd7Xh7xefwgQynqPxwf55J7di9+MEibWUGdNjFF9ITG9Pck5M84w==
+ version "9.5.0"
+ resolved "https://registry.yarnpkg.com/eslint/-/eslint-9.5.0.tgz#11856034b94a9e1a02cfcc7e96a9f0956963cd2f"
+ integrity sha512-+NAOZFrW/jFTS3dASCGBxX1pkFD0/fsO+hfAkJ4TyYKwgsXZbqzrw+seCYFCcPCYXvnD67tAnglU7GQTz6kcVw==
dependencies:
"@eslint-community/eslint-utils" "^4.2.0"
"@eslint-community/regexpp" "^4.6.1"
- "@eslint/config-array" "^0.17.0"
+ "@eslint/config-array" "^0.16.0"
"@eslint/eslintrc" "^3.1.0"
- "@eslint/js" "9.6.0"
+ "@eslint/js" "9.5.0"
"@humanwhocodes/module-importer" "^1.0.1"
"@humanwhocodes/retry" "^0.3.0"
"@nodelib/fs.walk" "^1.2.8"
@@ -1407,7 +1407,7 @@ eslint@^9.0.0:
escape-string-regexp "^4.0.0"
eslint-scope "^8.0.1"
eslint-visitor-keys "^4.0.0"
- espree "^10.1.0"
+ espree "^10.0.1"
esquery "^1.5.0"
esutils "^2.0.2"
fast-deep-equal "^3.1.3"
@@ -1427,12 +1427,12 @@ eslint@^9.0.0:
strip-ansi "^6.0.1"
text-table "^0.2.0"
-espree@^10.0.1, espree@^10.1.0:
- version "10.1.0"
- resolved "https://registry.yarnpkg.com/espree/-/espree-10.1.0.tgz#8788dae611574c0f070691f522e4116c5a11fc56"
- integrity sha512-M1M6CpiE6ffoigIOWYO9UDP8TMUw9kqb21tf+08IgDYjCsOvCuDt4jQcZmoYxx+w7zlKw9/N0KXfto+I8/FrXA==
+espree@^10.0.1:
+ version "10.0.1"
+ resolved "https://registry.yarnpkg.com/espree/-/espree-10.0.1.tgz#600e60404157412751ba4a6f3a2ee1a42433139f"
+ integrity sha512-MWkrWZbJsL2UwnjxTX3gG8FneachS/Mwg7tdGXce011sJd5b0JG54vat5KHnfSBODZ3Wvzd2WnjxyzsRoVv+ww==
dependencies:
- acorn "^8.12.0"
+ acorn "^8.11.3"
acorn-jsx "^5.3.2"
eslint-visitor-keys "^4.0.0"
@@ -2380,7 +2380,7 @@ mimic-fn@^2.1.0:
resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-2.1.0.tgz#7ed2c2ccccaf84d3ffcb7a69b57711fc2083401b"
integrity sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg==
-minimatch@^3.0.4, minimatch@^3.1.1, minimatch@^3.1.2:
+minimatch@^3.0.4, minimatch@^3.0.5, minimatch@^3.1.1, minimatch@^3.1.2:
version "3.1.2"
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b"
integrity sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==
|
standard-webhooks
|
standard-webhooks
|
Elixir
|
Elixir
| 1,390
| 37
|
The Standard Webhooks specification
|
standard-webhooks_standard-webhooks
|
CONFIG_CHANGE
|
just some version changes
|
e99309641994aebec6b0199012350486f2fe63af
|
2022-03-07 14:47:18
|
Qimiao Chen
|
Added a detailed explanation to avoid ambiguity: the free operation is not itself executed by multiple threads; it is only because network I/O reads/writes and execution of the free command run on different threads that the free command does not block network I/O, which is what improves performance.
| false
| 1
| 1
| 2
|
--- docs/high-concurrency/redis-single-thread-model.md
@@ -70,4 +70,4 @@ Redis 内部使用文件事件处理器 `file event handler` ,这个文件事
### 总结
-Redis 选择使用单线程模型处理客户端的请求主要还是因为 CPU 不是 Redis 服务器的瓶颈,所以使用多线程模型带来的性能提升并不能抵消它带来的开发成本和维护成本,系统的性能瓶颈也主要在网络 I/O 操作上;而 Redis 引入多线程操作也是出于性能上的考虑,对于一些大键值对的删除操作,通过多线程非阻塞地释放内存空间(释放操作不会阻塞网络IO读写,因为网络IO读写与释放的命令执行不是同一个线程)也能减少对 Redis 主线程阻塞的时间,提高执行的效率。
+Redis 选择使用单线程模型处理客户端的请求主要还是因为 CPU 不是 Redis 服务器的瓶颈,所以使用多线程模型带来的性能提升并不能抵消它带来的开发成本和维护成本,系统的性能瓶颈也主要在网络 I/O 操作上;而 Redis 引入多线程操作也是出于性能上的考虑,对于一些大键值对的删除操作,通过多线程非阻塞地释放内存空间也能减少对 Redis 主线程阻塞的时间,提高执行的效率。
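For context, a minimal sketch of the lazy-free behaviour the summary above describes, assuming the redis-py client and a Redis server new enough to support UNLINK (4.0+); this is illustrative only and not part of the patch. UNLINK returns as soon as the key is removed from the keyspace and leaves memory reclamation to a background thread, whereas DEL frees the memory synchronously on the command-processing thread.

```python
# Requires a running Redis server on localhost:6379 and the redis-py package
# (both are assumptions for this sketch, not part of the original commit).
import redis

r = redis.Redis(host="localhost", port=6379)

# A value large enough that freeing it synchronously could stall the server.
r.set("big:key", "x" * 10_000_000)

# Lazy free: the command returns once the key is unlinked from the keyspace;
# the memory itself is reclaimed by a background thread.
r.unlink("big:key")

# Synchronous free, for comparison: reclaims the memory on the main thread.
# r.delete("big:key")
```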
|
advanced-java
|
doocs
|
Java
|
Java
| 77,149
| 19,158
|
😮 Core Interview Questions & Answers For Experienced Java(Backend) Developers | A complete primer on advanced knowledge for Internet Java engineers, covering high concurrency, distributed systems, high availability, microservices, massive-scale data processing, and related domains
|
doocs_advanced-java
|
CODE_IMPROVEMENT
|
Refactoring Java code for better maintainability
|
7bf89340971c38f4757cf58671d8ae83ca15574c
| null |
Will Ayd
|
Suppressed FutureWarning in create_pet_tf_record
| false
| 1
| 1
| 0
|
--- dataset_util.py
@@ -72,7 +72,7 @@ def recursive_parse_xml_to_dict(xml):
Returns:
Python dictionary holding XML contents.
"""
- if not xml:
+ if not len(xml):
return {xml.tag: xml.text}
result = {}
for child in xml:
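A minimal sketch of the warning this one-line change suppresses, assuming the element comes from the standard xml.etree.ElementTree module (illustrative only, not part of the patch): the truth value of an Element tracks its child count, so `if not xml:` is ambiguous and triggers a FutureWarning on the Python versions that emit it, while `if not len(xml):` states the intended test (no child elements) explicitly.

```python
import xml.etree.ElementTree as ET

leaf = ET.fromstring("<name>my_pet</name>")   # has text, but zero child elements

print(len(leaf))    # 0: no child elements
print(leaf.text)    # 'my_pet'

# Ambiguous / deprecated: bool(Element) is False whenever the element has no
# children, even though the element itself exists, and some Python versions
# emit a FutureWarning for this truth test.
# if not leaf:
#     ...

# The committed form tests the child count explicitly instead:
if not len(leaf):
    print({leaf.tag: leaf.text})    # {'name': 'my_pet'}
```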
|
tensorflow_models.json
| null | null | null | null | null | null |
tensorflow_models.json
|
BUG_FIX
|
4, fixed a FutureWarning in create_pet_tf_record
|
7522f77cc5d04437a2c9612eebbd4793bbb9c1ef
|
2024-03-27 17:58:37
|
Easy
|
Update README
| false
| 2
| 2
| 4
|
--- README.md
@@ -2,8 +2,8 @@
## 对第一版的改进
-1. 从长文到一本近6万字的小书,从有感而发的分享到两年迭代而得的**完整方法论**
-1. 不再局限在独立开发,发展为**更为通用的方法论**,非技术读者也可基于NoCode/开源项目+AI辅助构建副业
+1. 从长文到一本近6万字的小书,从有感而发的分享到两年迭代而得的完整方法论
+1. 不再局限在独立开发,发展为更为通用的方法论,即使不懂技术的读者也可用于经营副业

|
one-person-businesses-methodology-v2.0
|
easychen
|
PHP
|
PHP
| 5,272
| 464
|
The second edition of 《一人企业方法论》 (One-Person Business Methodology); also suitable for non-technical readers running other side businesses such as self-media, e-commerce, or digital products.
|
easychen_one-person-businesses-methodology-v2.0
|
CONFIG_CHANGE
|
Very small changes
|
3cceccfa14bf1bfd351751c7c21090af5ede2e29
|
2024-11-18 05:11:52
|
github-actions[bot]
|
chore(main): release 2.38.1 (#635) Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
| false
| 7
| 0
| 7
|
--- CHANGELOG.md
@@ -1,12 +1,5 @@
# Changelog
-## [2.38.1](https://github.com/ellite/Wallos/compare/v2.38.0...v2.38.1) (2024-11-17)
-
-
-### Bug Fixes
-
-* bug introduced on 2.38.0 on the subscriptions dashboard ([#634](https://github.com/ellite/Wallos/issues/634)) ([f63c543](https://github.com/ellite/Wallos/commit/f63c543cdd7512b216004db3b279884dbda87ce4))
-
## [2.38.0](https://github.com/ellite/Wallos/compare/v2.37.1...v2.38.0) (2024-11-17)
|
wallos
|
ellite
|
PHP
|
PHP
| 4,155
| 178
|
Wallos: Open-Source Personal Subscription Tracker
|
ellite_wallos
|
DOC_CHANGE
|
Obvious
|
26390bee56be2ed85d7c35b62394def077851615
|
2025-02-06 01:17:29
|
Dario-DC
|
chore(curriculum): add async js transcripts (#58573) Co-authored-by: Naomi <[email protected]>
| false
| 310
| 0
| 310
|
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/6733b072bd8f5b06ccdbd9e2.md
@@ -10,46 +10,6 @@ dashedName: what-is-asynchronous-javascript-and-how-does-it-differ-from-synchron
Watch the video lecture and answer the questions below.
-# --transcript--
-
-What is asynchronous JavaScript and how does it differ from synchronous JavaScript?
-
-Let’s learn about asynchronous JavaScript. Asynchronous events occur at different times, independently from each other.
-
-In the context of software development, "asynchronous" refers to tasks that run in the background, independently from the main flow of the program. The main advantage of asynchronous processes is that they don’t block the execution of the main program.
-
-That’s particularly helpful for tasks that may take a long time to run, such as fetching data from a remote server, processing large files, handling user input, and performing complex calculations. This is what we know as asynchronous programming.
-
-This approach contrasts with the traditional synchronous programming technique that you have been working with so far. When you write multiple lines of code, you can usually predict what will happen and when it will happen. The first line will be executed, then the second line, and so on in the order that you write them.
-
-In this example, the first line where you define the variable will be executed first, then the second line, and finally the third line where the string is logged to the console:
-
-```js
-const topic = "JavaScript";
-const learning = `I'm learning ${topic}!`;
-console.log(learning);
-```
-
-Everything is sequential and predictable. Each line is completed before the next one starts. This type of JavaScript program is known as single-threaded.
-
-The concept of threads is very important. A thread is a unit of execution within a process. It’s like a separate flow of control within the program.
-
-In synchronous programming, threads execute sequentially, one after the other. If a thread is blocked, like waiting for user input, the entire process is blocked until the thread is completed.
-
-In asynchronous programming, threads can be executed concurrently, running multiple threads at the same time. This way, the program can continue running multiple tasks simultaneously without making the main program unresponsive, even if one of the threads is blocked.
-
-Asynchronous programming often involves callbacks, promises, or async/await to handle non-blocking operations.
-
-As you learned in earlier lecture videos, a callback is a function that you pass to another function to be called later. Event handlers are a particular type of callback that you’ve worked with before. They used to be the most common way to implement asynchronous functions in JavaScript. However, if the callback function also takes other callback functions, the code and logic can get quite complicated very quickly.
-
-Currently, the most commonly used technique for implementing asynchronous programming in JavaScript is the promise. A `Promise` is an object that represents the eventual completion (or failure) of an asynchronous process and its value.
-
-The value of a promise is not known when the promise is created. It’s only known when the asynchronous process is completed. For example, the process of fetching data from a remote API for your web application may take some time. Meanwhile, you want your users to have a nice user experience, right?
-
-To implement this, you could create a promise to keep the user interface active and interactive while the asynchronous process is running. When the process is completed, the promise will contain the data that was sent back by the API, so the application can handle it or render it appropriately when it’s available.
-
-Asynchronous programming is a powerful tool for building efficient JavaScript applications. By understanding the differences between synchronous and asynchronous programming, you can choose the right approach for your application and write more efficient code.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/67340798c2c1776709d8a5fe.md
@@ -10,38 +10,6 @@ dashedName: how-does-the-async-attribute-work-inside-script-elements-and-how-doe
Watch the lecture video and answer the questions below.
-# --transcript--
-
-How does the `async` attribute work inside `script` elements and how does it differ from the `defer` attribute?
-
-The `async` and `defer` attributes in HTML `script` elements play a crucial role in how JavaScript files are loaded and executed in web pages. Understanding them can improve your website's performance and user experience. When you include a `script` in your HTML file, it looks like this:
-
-```js
-<script src="example.js"></script>
-```
-
-When the browser finds this `script` tag, it stops parsing the HTML, downloads the script, executes it, and then continues parsing the HTML. This can slow down the loading of your web page, especially if you have large scripts.
-
-This is where the `async` and `defer` attributes come in. They provide ways to load scripts more efficiently.
-
-Let's start with the `async` attribute:
-
-```js
-<script src="example.js" async></script>
-```
-
-By adding the `async` attribute to a `script` tag, the browser will continue parsing the HTML while the script is being downloaded. Once the script is fully downloaded, the browser will pause HTML parsing, execute the script, and then resume parsing the HTML. This can significantly speed up page loading.
-
-It's important to note that async scripts are executed as soon as they're downloaded, which means they might not run in the correct order which we desire. This is where the `defer` attribute comes in for the rescue. Let's look at how the `defer` attribute looks like:
-
-```js
-<script src="example.js" defer></script>
-```
-
-The `defer` attribute is similar to `async` attribute. However, defer scripts are not executed immediately after they're downloaded. Instead, they wait until the HTML parsing is complete. Furthermore, defer scripts execute in the order they appear in the HTML code.
-
-In short, use `async` for scripts where the order of execution doesn't matter, and use `defer` when you need to ensure that scripts run in the correct order. Both attributes can significantly improve page load times by allowing the browser to continue parsing HTML while scripts are being downloaded in the background.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407a223891b6734563c89.md
@@ -10,30 +10,6 @@ dashedName: what-is-the-fetch-api-and-what-are-common-types-of-resources-that-ar
Watch the lecture video and answer the questions below.
-# --transcript--
-
-What is the Fetch API and what are common types of resources that are fetched from the network?
-
-The Fetch API allows web apps to make network requests, typically to retrieve or send data to the server. This API provides a `fetch()` method that you can use to make these requests. Let's look at a basic example of how to use `fetch`:
-
-```js
-fetch('https://api.example.com/data')
-```
-
-In this example, we're making a `GET` request to `https://api.example.com/data`. This will then return us some data that we need to convert to JSON format and can use anywhere we want to.
-
-By default, the Fetch API uses the `GET` method to retrieve data. This will be covered in the next lecture, along with other common HTTP request methods.
-
-Now, let's discuss some common types of resources that are fetched from the network.
-
-In our web apps, we need some common data like weather data, professions list data, country names list, country code or country flag icons/images. Using these data we can make our app more informative and interactive. Thanks to Fetch API, we can get these resources from the network.
-
-Images are some frequently fetched resources. You might use fetch to load images statically or dynamically based on user actions, and display them on your web app.
-
-Text files are another type of resource often fetched. This could include configuration files, log files, or even entire documents that you want to display on your webpage.
-
-In some cases, you might fetch audio or video files. The Fetch API can handle these types of resources as well, allowing you to work with a wide variety of data types.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407be6af21d6766ed4b96.md
@@ -10,74 +10,6 @@ dashedName: how-does-the-fetch-api-work-with-common-http-methods-and-res-json
Watch the lecture video and answer the questions below.
-# --transcript--
-
-How does the Fetch API work with common HTTP methods?
-
-In the previous lecture, we saw what the Fetch API is and how to use it. In this lecture, we will discuss about the `GET`, `POST`, `PUT` and `DELETE` HTTP methods of Fetch API.
-
-Let's start with the most common HTTP method which is the `GET` method. This is used to retrieve data from a server. When you use `fetch()` without specifying a method, it defaults to `GET`.
-
-```js
-fetch('https://api.example.com/data')
-```
-
-In this code, we're making a `GET` request to `https://api.example.com/data`. Now, please note that you cannot use this data directly, you have to convert the response to a JSON format. Only then you can use it anywhere you want in your project. Here’s an example of how to do it:
-
-```js
-fetch('https://api.example.com/data')
- .then(response => response.json())
- .then(data => console.log(data))
-```
-
-In this code, the response coming from the Fetch API is a promise, and we are using a `.then` handler to convert the response to a JSON format. In a prior lecture video, you learned that a `Promise` is an object that represents the eventual completion (or failure) of an asynchronous process and its value.
-
-The value of a promise is not known when the promise is created. It’s only known when the asynchronous process is completed. When we chain the two `.then` handlers to the fetch call, this is something called promise chaining which will be taught in the next lecture.
-
-So far we have been retrieving resources from a server. But, did you know that we can also send data to the server? The `POST` method is used to send data to a server to create a resource. Here’s an example of a `POST` method which is used to create data into the server:
-
-```js
-fetch('https://api.example.com/users', {
- method: '`POST`',
- headers: {
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- name: 'John Doe',
- email: '[email protected]'
- })
-})
-```
-
-In this example, we're sending a `POST` request to create a new user. We specify the method as `POST`, set the appropriate headers, and include a `body` with the data we want to send. The body needs to be a string, so we use `JSON.stringify()` to convert our object to a JSON string.
-
-The `PUT` method is used to update existing resources of a server. Here’s an example:
-
-```js
-fetch('https://api.example.com/users/45', {
- method: '`PUT`',
- headers: {
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- name: 'John Smith',
- email: '[email protected]'
- })
-})
-```
-
-In this example, look carefully at the URL, you can see a `45` at the end. This is typically used as a unique ID to identify the data we are trying to update. We used the `PUT` method on the code and also specified the data as the `body` which will be used to update the identified data.
-
-The `DELETE` method is used to delete a resource from the server. Here’s an example:
-
-```js
-fetch('https://api.example.com/users/45', {
- method: '`DELETE`',
-})
-```
-
-In this code, we are including the `DELETE` method and an ID at the end of the url to identify the data which needs to be deleted.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407ca21117a67cf9521ca.md
@@ -10,54 +10,6 @@ dashedName: what-are-promises-and-how-does-promise-chaining-work
Watch the lecture video and answer the questions below.
-# --transcript--
-
-What are promises and how does promise chaining work?
-
-A `Promise` is an object that represents the eventual completion or failure of an asynchronous operation. It's initially in a pending state. It can then transition to either a fulfilled state when the operation is successful, or a rejected state when the operation fails. Here's an example of creating a `Promise`:
-
-```js
-const aPromise = new Promise((resolve, reject) => {
- setTimeout(() => {
- resolve("Operation successful!");
- }, 1000);
-});
-```
-
-In this example, we create a promise that simulates an asynchronous operation using `setTimeout`. After one second, the promise is resolved with the message `Operation successful!`.
-
-Another way to work with promises is to use the `.then` and `.catch` methods.
-
-The `.then()` method is used in a promise to specify what should happen when the promise is fulfilled, while `.catch()` is used to handle any errors that occur. Let's see how we can use `.then` and `.catch` with our promise:
-
-```js
-aPromise.then((result) => {
- console.log(result); // Outputs: "Operation successful!"
-}).catch((error) => {
- console.error(error);
-});
-```
-
-In this code, or instructions of what to do when the promise is fulfilled, the function passed to `.then()` will be called with the resolved value of the promise. If an error occurs, the function passed to `.catch()` will be called instead.
-
-Now, let's talk about promise chaining. One of the powerful features of promises is that we can chain multiple asynchronous operations together. Each `.then()` can return a new promise, allowing you to perform a sequence of asynchronous operations one after the other. Here’s an example:
-
-```js
-fetch('https://api.example.com/data')
- .then(response => response.json())
- .then(data => {
- console.log(data);
- return fetch('https://api.example.com/data2');
- })
- .then(response => response.json())
- .then(data2 => console.log(data2))
- .catch(error => console.error('Error:', error));
-```
-
-In this example, we're making two API calls in sequence. The first `.then()` parses the response as JSON. The second `.then()` logs the data and makes another API call. The third `.then()` parses the response of the second API call, and the fourth `.then()` logs that data. If an error occurs at any point in this chain, it will be caught by the `.catch()` at the end.
-
-It's important to note that `.catch()` will catch errors from any of the previous steps in the chain. This means you don't need to add error handling to each individual step, which can greatly simplify your code.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407d56c3dce67fa97969b.md
@@ -10,47 +10,6 @@ dashedName: what-is-async-await-and-how-does-it-work
Watch the lecture video and answer the questions below.
-# --transcript--
-
-What is `async`/`await` and how does it work?
-
-In the previous lecture videos, you learned about asynchronous programming which allows other code to run while we wait for some time-consuming tasks to complete, like fetching data from a server, reading data from a file, and so on.
-
-`async`/`await`, built on top of promises, makes writing and reading asynchronous code easier. When you put the `async` keyword before a function, it means that function will always return a `Promise`. Only inside an `async` function, you can use the `await` keyword, which allows you to wait for a `Promise` to resolve before moving on to the next line of code. Here's an example to illustrate how `async`/`await` works:
-
-```js
-async function delayedGreeting(name) {
- console.log("A Messenger entered the chat...");
- await new Promise(resolve => setTimeout(resolve, 2000));
- console.log(`Hello, ${name}!`);
-}
-
-delayedGreeting("Alice");
-console.log("First Printed Message!");
-```
-
-In this code, we define an `async` function called `delayedGreeting`. Inside this function, we use `await` to pause the execution for 2 seconds. After the delay, it prints a greeting.
-
-When we call this function, you'll see `First Printed Message!` appear before the greeting. This is because the function is asynchronous - it doesn't block the rest of the code from running.
-
-One of the biggest advantages of `async`/`await` is error handling. With promises, we often had to use `.catch()` method to handle errors. With `async`/`await`, we can use `try`/`catch` blocks. Here's an example:
-
-```js
-async function fetchUserData() {
- try {
- let response = await fetch(`https://api.example.com/users`);
- let userData = await response.json();
- console.log(userData);
- } catch (error) {
- console.log("Error fetching user data:", error);
- }
-}
-
-fetchUserData();
-```
-
-In this example, we're using `async`/`await` to fetch user data from an API. The `await` keyword is used twice: once to wait for the fetch operation to complete, and again to wait for the JSON parsing to finish. If any error occurs during this process, it will be caught in the `catch` block.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407e02bcf0d682b9a49a9.md
@@ -10,33 +10,6 @@ dashedName: how-does-the-javascript-engine-work-and-what-is-a-javascript-runtime
Watch the lecture video and answer the questions below.
-# --transcript--
-
-How does the JavaScript engine work and what is a JavaScript runtime?
-
-The JavaScript engine has the ability to read, understand, and execute your code. It works like a converter that takes your code, turns it into instructions that the computer can understand and work accordingly.
-
-One of the most well-known JavaScript engines is V8, developed by Google, used in Chrome and Node.js. The JavaScript engine works in a few steps. First, it parses your code, reading it line by line to make sure there’s no mistake in the JavaScript code. Then, it converts this code into bytecode, which is a simpler, intermediate version of your code that’s easier for the computer to understand and execute. Finally, it runs this bytecode to execute your program's instructions. Here's an example of JavaScript code:
-
-```js
-const greeting = "Hello, World!";
-console.log(greeting);
-```
-
-When you run this code, the JavaScript engine first parses it to check for any syntax errors. Parsing means the engine reads the code and breaks it down into a structure it can understand, checking for mistakes along the way.
-
-Then, it compiles the code into an intermediate format (often bytecode or machine code, depending on the engine). Compiling is the process of converting the human-readable code into a more efficient format that the computer can execute faster.
-
-Finally, the engine executes the code, printing `Hello, World!` to the console.
-
-Now, let's talk about the JavaScript runtime. The JavaScript runtime is the environment in which your JavaScript code is executed. It includes the JavaScript engine (like V8 in Chrome or SpiderMonkey in Firefox), which processes and executes the code, as well as additional features provided by the environment (such as a web browser or Node.js, which you will learn more about in future lectures).
-
-While the core JavaScript language handles things like variables, loops, and functions, the runtime provides extra tools that allow JavaScript to interact with things outside of the language itself, like the DOM (for web pages) or the Fetch API (for making network requests).
-
-In short, the runtime is what allows JavaScript to do more than just basic programming tasks – like interacting with web pages or handling time-based actions – by providing these extra features beyond the language itself.
-
-While you don't need to know every detail of engines and runtimes to write JavaScript, having a basic understanding can help you write more efficient code and debug problems more effectively.
-
# --questions--
## --text--
--- curriculum/challenges/english/25-front-end-development/lecture-understanding-asynchronous-programming/673407eb10ca9d68634e81d9.md
@@ -10,36 +10,6 @@ dashedName: what-is-the-geolocation-api-and-how-does-the-getcurrentposition-work
Watch the lecture video and answer the questions below.
-# --transcript--
-
-What is the Geolocation API and how does the `getCurrentPosition` work?
-
-The Geolocation API provides a way for websites to request the user's location. It's important to note that for privacy reasons, the user has to give permission before their location can be accessed via the website.
-
-The main method we'll be focusing on today is `getCurrentPosition`. This method is used to collect the geographic location of the device. Here's an example of how you might use `getCurrentPosition`:
-
-```js
-navigator.geolocation.getCurrentPosition(
- (position) => {
- console.log("Latitude: " + position.coords.latitude);
- console.log("Longitude: " + position.coords.longitude);
- },
- (error) => {
- console.log("Error: " + error.message);
- }
-);
-```
-
-In this code, we're calling `getCurrentPosition` and passing it a function which will be called when the position is successfully obtained. This position object contains various pieces of information, but we're focusing on `latitude` and `longitude` only.
-
-If there is an issue with getting the position, then the error will be logged to the console.
-
-The `getCurrentPosition` method uses GPS, Wi-Fi networks, or IP address geolocation, depending on the device and its settings. Once the location is found, the success callback function is called with a position object.
-
-The position object contains a various properties, where the most commonly used are `latitude` and `longitude`, but it can also include `altitude`, `accuracy`, `speed`, and `heading`, and so on.
-
-One important consideration when using geolocation is user privacy. Explain to your users why you need their location data and how you'll use it.
-
# --questions--
## --text--
|
freecodecamp
|
freecodecamp
|
TypeScript
|
TypeScript
| 410,748
| 39,092
|
freeCodeCamp.org's open-source codebase and curriculum. Learn to code for free.
|
freecodecamp_freecodecamp
|
DOC_CHANGE
|
changes in md file
|
618f204dddd7b6309d10efa8b9227c71e11c99e7
| null |
David Luzar
|
feat: allow zooming up to 3000% (#4358)
| false
| 1
| 1
| 0
|
--- zoom.ts
@@ -25,6 +25,6 @@ export const getNewZoom = (
export const getNormalizedZoom = (zoom: number): NormalizedZoomValue => {
const normalizedZoom = parseFloat(zoom.toFixed(2));
- const clampedZoom = Math.max(0.1, Math.min(normalizedZoom, 10));
+ const clampedZoom = Math.max(0.1, Math.min(normalizedZoom, 30));
return clampedZoom as NormalizedZoomValue;
};
|
excalidraw_excalidraw.json
| null | null | null | null | null | null |
excalidraw_excalidraw.json
|
NEW_FEAT
|
5, Allowing zoom up to 3000%
|
436d0580b434191b046c94c28ef40193f78eebdd
|
2024-11-26 08:12:59
|
Gustaf
|
Add AI Engineer introduction video (#7788) * Added Introduction Video
* Changed formatting
* Update src/data/roadmaps/ai-engineer/content/introduction@_hYN0gEi9BL24nptEtXWU.md
---------
Co-authored-by: Kamran Ahmed <[email protected]>
| false
| 5
| 1
| 6
|
--- src/data/roadmaps/ai-engineer/content/introduction@_hYN0gEi9BL24nptEtXWU.md
@@ -1,7 +1,3 @@
# Introduction
-AI Engineering is the process of designing and implementing AI systems using pre-trained models and existing AI tools to solve practical problems. AI Engineers focus on applying AI in real-world scenarios, improving user experiences, and automating tasks, without developing new models from scratch. They work to ensure AI systems are efficient, scalable, and can be seamlessly integrated into business applications, distinguishing their role from AI Researchers and ML Engineers, who concentrate more on creating new models or advancing AI theory.
-
-Learn more from the following resources:
-
-- [@video@AI vs Machine Learning](https://www.youtube.com/watch?v=4RixMPF4xis)
+AI Engineering is the process of designing and implementing AI systems using pre-trained models and existing AI tools to solve practical problems. AI Engineers focus on applying AI in real-world scenarios, improving user experiences, and automating tasks, without developing new models from scratch. They work to ensure AI systems are efficient, scalable, and can be seamlessly integrated into business applications, distinguishing their role from AI Researchers and ML Engineers, who concentrate more on creating new models or advancing AI theory.
\ No newline at end of file
|
developer-roadmap
|
kamranahmedse
|
TypeScript
|
TypeScript
| 309,677
| 40,429
|
Interactive roadmaps, guides and other educational content to help developers grow in their careers.
|
kamranahmedse_developer-roadmap
|
DOC_CHANGE
|
changes in md file
|
7bcac37d987c84b77daaf2eb89bcb6d54ba0b189
|
2024-02-10 09:24:51
|
César D. Rodas
|
Improve cli_examples macro (#5589) ## Description
Add macro to test argument parsing rather than testing the external
command through building and spawning a separated process. This is an
improved version of #5519
## Checklist
- [x] I have linked to any relevant issues.
- [x] I have commented my code, particularly in hard-to-understand
areas.
- [x] I have updated the documentation where relevant (API docs, the
reference, and the Sway book).
- [x] I have added tests that prove my fix is effective or that my
feature works.
- [x] I have added (or requested a maintainer to add) the necessary
`Breaking*` or `New Feature` labels where relevant.
- [x] I have done my best to ensure that my PR adheres to [the Fuel Labs
Code Review
Standards](https://github.com/FuelLabs/rfcs/blob/master/text/code-standards/external-contributors.md).
- [x] I have requested a review from the relevant team or maintainers.
| false
| 237
| 292
| 529
|
--- forc-plugins/forc-client/src/cmd/deploy.rs
@@ -8,11 +8,9 @@ pub use forc_util::tx_utils::Salt;
use crate::NodeTarget;
forc_util::cli_examples! {
- super::Command {
- [ Deploy a single contract => "forc deploy bc09bfa7a11a04ce42b0a5abf04fd437387ee49bf4561d575177e2946468b408" ]
- [ Deploy a single contract from a different path => "forc deploy bc09bfa7a11a04ce42b0a5abf04fd437387ee49bf4561d575177e2946468b408 --path {path}" ]
- [ Deploy to a custom network => "forc deploy --node-url https://beta-5.fuel.network/graphql" ]
- }
+ [ Deploy a single contract => deploy "bc09bfa7a11a04ce42b0a5abf04fd437387ee49bf4561d575177e2946468b408" => r#".*Error making HTTP request.*"# ]
+ [ Deploy a single contract from a different path => deploy "bc09bfa7a11a04ce42b0a5abf04fd437387ee49bf4561d575177e2946468b408 --path ../tests/" => r#".*Error making HTTP request.*"# ]
+ [ Deploy to a custom network => deploy "--node-url https://beta-5.fuel.network/graphql" => ".*Refused to create a new wallet.*" ]
}
#[derive(Debug, Default, Parser)]
--- forc-plugins/forc-client/src/cmd/submit.rs
@@ -3,13 +3,11 @@ use devault::Devault;
use std::path::PathBuf;
forc_util::cli_examples! {
- super::Command {
- [ Submit a transaction from a json file => "forc submit {path}/mint.json" ]
- [ Submit a transaction from a json file and wait for confirmation => "forc submit {path}/mint.json --await true" ]
- [ Submit a transaction from a json file and get output in json => "forc submit {path}/mint.json --tx-status-json true" ]
- [ Submit a transaction from a json file to testnet => "forc submit {path}/mint.json --testnet" ]
- [ Submit a transaction from a json file to a local net => "forc submit {path}/mint.json --target local" ]
- }
+ [ Submit a transaction from a json file => submit "./mint.json" => "Submission of tx or awaiting commit failed" ]
+ [ Submit a transaction from a json file and wait for confirmation => submit "./mint.json --await true" => "Submission of tx or awaiting commit failed" ]
+ [ Submit a transaction from a json file and get output in json => submit "./mint.json --tx-status-json true" => "Submission of tx or awaiting commit failed" ]
+ [ Submit a transaction from a json file to testnet => submit "./mint.json --testnet" => "Submission of tx or awaiting commit failed" ]
+ [ Submit a transaction from a json file to a local net => submit "./mint.json --target local" => "Submission of tx or awaiting commit failed" ]
}
/// Submit a transaction to the specified fuel node.
--- forc-plugins/forc-client/tests/.gitignore
@@ -0,0 +1,2 @@
+out
+target
--- forc-plugins/forc-client/tests/Forc.lock
@@ -0,0 +1,13 @@
+[[package]]
+name = "core"
+source = "path+from-root-F252333F9C4A5D78"
+
+[[package]]
+name = "std"
+source = "path+from-root-F252333F9C4A5D78"
+dependencies = ["core"]
+
+[[package]]
+name = "tests"
+source = "member"
+dependencies = ["std"]
--- forc-plugins/forc-client/tests/Forc.toml
@@ -0,0 +1,8 @@
+[project]
+authors = ["Fuel Labs <[email protected]>"]
+entry = "main.sw"
+license = "Apache-2.0"
+name = "tests"
+
+[dependencies]
+std = { path = "../../../sway-lib-std/" }
--- forc-plugins/forc-client/tests/mint.json
@@ -0,0 +1,29 @@
+{
+ "Mint": {
+ "tx_pointer": {
+ "block_height": 0,
+ "tx_index": 0
+ },
+ "input_contract": {
+ "utxo_id": {
+ "tx_id": "c49d65de61cf04588a764b557d25cc6c6b4bc0d7429227e2a21e61c213b3a3e2",
+ "output_index": 0
+ },
+ "balance_root": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "state_root": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "contract_id": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "tx_pointer": {
+ "block_height": 0,
+ "tx_index": 0
+ }
+ },
+ "output_contract": {
+ "balance_root": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "state_root": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "contract_id": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31",
+ "input_index": 0
+ },
+ "mint_amount": 2123123121,
+ "mint_asset_id": "2cafad611543e0265d89f1c2b60d9ebf5d56ad7e23d9827d6b522fd4d6e44bc31"
+ }
+}
--- forc-plugins/forc-client/tests/src/main.sw
@@ -0,0 +1,11 @@
+contract;
+
+abi MyContract {
+ fn test_function() -> bool;
+}
+
+impl MyContract for Contract {
+ fn test_function() -> bool {
+ true
+ }
+}
--- forc-plugins/forc-crypto/src/address.rs
@@ -5,9 +5,7 @@ use serde_json::json;
use std::str::{from_utf8, FromStr};
forc_util::cli_examples! {
- crate::Command {
- [ Convert an address to another format => "forc crypto address fuel12e0xwx34nfp7jrzvn9mp5qkac3yvp7h8fx37ghl7klf82vv2wkys6wd523" ]
- }
+ [ Convert an address to another format => crypto "address fuel12e0xwx34nfp7jrzvn9mp5qkac3yvp7h8fx37ghl7klf82vv2wkys6wd523" ]
}
#[derive(Debug, clap::Args)]
--- forc-plugins/forc-crypto/src/args.rs
@@ -7,12 +7,10 @@ use std::{
};
forc_util::cli_examples! {
- crate::Command {
- [ Hashes an argument with SHA256 => "forc crypto sha256 test" ]
- [ Hashes an argument with Keccak256 => "forc crypto keccak256 test" ]
- [ Hashes a file path with SHA256 => "forc crypto sha256 {file}" ]
- [ Hashes a file path with Keccak256 => "forc crypto keccak256 {file}" ]
- }
+ [ Hashes an argument with SHA256 => crypto "sha256 test" ]
+ [ Hashes an argument with Keccak256 => crypto "keccak256 test" ]
+ [ Hashes a file path with SHA256 => crypto "sha256 src/args.rs" ]
+ [ Hashes a file path with Keccak256 => crypto "keccak256 src/args.rs" ]
}
#[derive(Debug, Clone, clap::Args)]
--- forc-plugins/forc-crypto/src/keys/get_public_key.rs
@@ -5,11 +5,9 @@ use fuels_core::types::bech32::Bech32Address;
use serde_json::json;
forc_util::cli_examples! {
- crate::Command {
- [ Get the public key from a message and its signature => r#"forc crypto get-public-key \
- 0x1eff08081394b72239a0cf7ff6b499213dcb7a338bedbd75d072d504588ef27a1f74d5ceb2f111ec02ede097fb09ed00aa9867922ed39299dae0b1afc0fa8661 \
- "This is a message that is signed""# ]
- }
+ [ Get the public key from a message and its signature => crypto r#"get-public-key \
+ 0x1eff08081394b72239a0cf7ff6b499213dcb7a338bedbd75d072d504588ef27a1f74d5ceb2f111ec02ede097fb09ed00aa9867922ed39299dae0b1afc0fa8661 \
+ "This is a message that is signed""# ]
}
/// Parse a secret key to view the associated public key
--- forc-plugins/forc-crypto/src/keys/new_key.rs
@@ -14,11 +14,9 @@ use std::ops::Deref;
const ABOUT: &str = "Creates a new key for use with fuel-core";
forc_util::cli_examples! {
- crate::Command {
- [ Creates a new key default for block production => "forc crypto new-key" ]
- [ Creates a new key for peering => "forc crypto new-key -k peering" ]
- [ Creates a new key for block production => "forc crypto new-key -k block-production" ]
- }
+ [ Creates a new key default for block production => crypto "new-key" ]
+ [ Creates a new key for peering => crypto "new-key -k peering" ]
+ [ Creates a new key for block production => crypto "new-key -k block-production" ]
}
/// Generate a random new secret & public key in the format expected by fuel-core
--- forc-plugins/forc-crypto/src/keys/parse_secret.rs
@@ -8,10 +8,8 @@ use std::{ops::Deref, str::FromStr};
const ABOUT: &str = "Parses a private key to view the associated public key";
forc_util::cli_examples! {
- crate::Command {
- [ Parses the secret of a block production => "forc crypto parse-secret \"f5204427d0ab9a311266c96a377f7c329cb8a41b9088225b6fcf40eefb423e28\"" ]
- [ Parses the secret of a peering => "forc crypto parse-secret -k peering \"f5204427d0ab9a311266c96a377f7c329cb8a41b9088225b6fcf40eefb423e28\"" ]
- }
+ [ Parses the secret of a block production => crypto "parse-secret \"f5204427d0ab9a311266c96a377f7c329cb8a41b9088225b6fcf40eefb423e28\"" ]
+ [ Parses the secret of a peering => crypto "parse-secret -k peering \"f5204427d0ab9a311266c96a377f7c329cb8a41b9088225b6fcf40eefb423e28\"" ]
}
/// Parse a secret key to view the associated public key
--- forc-plugins/forc-doc/src/cli.rs
@@ -5,13 +5,11 @@ use forc_pkg::source::IPFSNode;
const ABOUT: &str = "Forc plugin for building a Sway package's documentation";
forc_util::cli_examples! {
- crate::Command {
- [ Build the docs for a project in the current path => "forc doc"]
- [ Build the docs for a project in the current path and open it in the browser => "forc doc --open" ]
- [ Build the docs for a project located in another path => "forc doc --manifest-path {path}" ]
- [ Build the docs for the current project exporting private types => "forc doc --document-private-items" ]
- [ Build the docs offline without downloading any dependency from the network => "forc doc --offline" ]
- }
+ [ Build the docs for a project in the current path => doc ""]
+ [ Build the docs for a project in the current path and open it in the browser => doc "--open" ]
+ [ Build the docs for a project located in another path => doc "--manifest-path ../tests_project2" ]
+ [ Build the docs for the current project exporting private types => doc "--document-private-items" ]
+ [ Build the docs offline without downloading any dependency from the network => doc "--offline" ]
}
#[derive(Debug, Parser, Default)]
--- forc-plugins/forc-doc/tests/.gitignore
@@ -0,0 +1,2 @@
+out
+target
--- forc-plugins/forc-doc/tests/Forc.lock
@@ -0,0 +1,13 @@
+[[package]]
+name = "core"
+source = "path+from-root-F252333F9C4A5D78"
+
+[[package]]
+name = "std"
+source = "path+from-root-F252333F9C4A5D78"
+dependencies = ["core"]
+
+[[package]]
+name = "tests"
+source = "member"
+dependencies = ["std"]
--- forc-plugins/forc-doc/tests/Forc.toml
@@ -0,0 +1,8 @@
+[project]
+authors = ["Fuel Labs <[email protected]>"]
+entry = "main.sw"
+license = "Apache-2.0"
+name = "tests"
+
+[dependencies]
+std = { path = "../../../sway-lib-std/" }
--- forc-plugins/forc-doc/tests/src/main.sw
@@ -0,0 +1,11 @@
+contract;
+
+abi MyContract {
+ fn test_function() -> bool;
+}
+
+impl MyContract for Contract {
+ fn test_function() -> bool {
+ true
+ }
+}
--- forc-plugins/forc-fmt/src/main.rs
@@ -19,14 +19,12 @@ use sway_utils::{constants, find_parent_manifest_dir, get_sway_files, is_sway_fi
use swayfmt::Formatter;
forc_util::cli_examples! {
- crate::App {
- [ Run the formatter in check mode on the current directory => "forc fmt --check"]
- [ Run the formatter in check mode on the current directory with short format => "forc fmt -c"]
- [ Run formatter against a given file => "forc fmt --file {path}/src/main.sw"]
- [ Run formatter against a given file with short format => "forc fmt -f {path}/src/main.sw"]
- [ Run formatter against a given dir => "forc fmt --path {path}"]
- [ Run formatter against a given dir with short format => "forc fmt -p {path}"]
- }
+ [ Run the formatter in check mode on the current directory => fmt "--check"]
+ [ Run the formatter in check mode on the current directory with short format => fmt "-c"]
+ [ Run formatter against a given file => fmt "--file src/main.sw"]
+ [ Run formatter against a given file with short format => fmt "-f src/main.sw"]
+ [ Run formatter against a given dir => fmt "--path ../tests/"]
+ [ Run formatter against a given dir with short format => fmt "-p ../tests"]
}
#[derive(Debug, Parser)]
--- forc-plugins/forc-fmt/tests/Forc.lock
@@ -0,0 +1,13 @@
+[[package]]
+name = "core"
+source = "path+from-root-F252333F9C4A5D78"
+
+[[package]]
+name = "std"
+source = "path+from-root-F252333F9C4A5D78"
+dependencies = ["core"]
+
+[[package]]
+name = "tests"
+source = "member"
+dependencies = ["std"]
--- forc-plugins/forc-fmt/tests/Forc.toml
@@ -0,0 +1,8 @@
+[project]
+authors = ["Fuel Labs <[email protected]>"]
+entry = "main.sw"
+license = "Apache-2.0"
+name = "tests"
+
+[dependencies]
+std = { path = "../../../sway-lib-std/" }
--- forc-plugins/forc-fmt/tests/src/main.sw
@@ -0,0 +1,11 @@
+contract;
+
+abi MyContract {
+ fn test_function() -> bool;
+}
+
+impl MyContract for Contract {
+ fn test_function() -> bool {
+ true
+ }
+}
--- forc-plugins/forc-tx/src/lib.rs
@@ -13,14 +13,10 @@ use std::path::PathBuf;
use thiserror::Error;
forc_util::cli_examples! {
- {
- // This parser has a custom parser
- super::Command::try_parse_from_args
- } {
- [ Script example => r#"forc tx script --bytecode "{path}/out/debug/name.bin" --data "{path}/data.bin" \
+ [ Script example => tx r#"script --bytecode "out/debug/tests.bin" --data "data.bin" \
--receipts-root 0x2222222222222222222222222222222222222222222222222222222222222222"# ]
- [ Multiple inputs => r#"forc tx create --bytecode "{name}/out/debug/name.bin"
- --storage-slots "{path}/out/debug/name-storage_slots.json"
+ [ Multiple inputs => tx r#"create --bytecode "out/debug/tests.bin"
+ --storage-slots out/debug/tests-storage_slots.json
--script-gas-limit 100 \
--gas-price 0 \
--maturity 0 \
@@ -63,9 +59,9 @@ forc_util::cli_examples! {
--state-root 0x0000000000000000000000000000000000000000000000000000000000000000
"#
]
- [ An example constructing a create transaction => r#"forc tx create \
- --bytecode {path}/out/debug/name.bin \
- --storage-slots {path}/out/debug/name-storage_slots.json \
+ [ An example constructing a create transaction => tx "create \
+ --bytecode ./my-contract/out/debug/my-contract.bin \
+ --storage-slots out/debug/tests-storage_slots.json
--script-gas-limit 100 \
--gas-price 0 \
--maturity 0 \
@@ -92,9 +88,9 @@ forc_util::cli_examples! {
--recipient 0x2222222222222222222222222222222222222222222222222222222222222222 \
--amount 1 \
--nonce 0xBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB \
- --msg-data {path}/message.dat \
- --predicate {path}/my-predicate2.bin \
- --predicate-data {path}/my-predicate2.dat \
+ --msg-data ./message.dat \
+ --predicate ./my-predicate2.bin \
+ --predicate-data ./my-predicate2.dat \
output coin \
--to 0x2222222222222222222222222222222222222222222222222222222222222222 \
--amount 100 \
@@ -113,9 +109,8 @@ forc_util::cli_examples! {
--asset-id 0x0000000000000000000000000000000000000000000000000000000000000000 \
output contract-created \
--contract-id 0xCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC \
- --state-root 0x0000000000000000000000000000000000000000000000000000000000000000"#
+ --state-root 0x0000000000000000000000000000000000000000000000000000000000000000"
]
- }
}
/// The top-level `forc tx` command.
--- forc-util/src/cli.rs
@@ -1,116 +1,141 @@
#[macro_export]
// Let the user format the help and parse it from that string into arguments to create the unit test
macro_rules! cli_examples {
- ($st:path { $( [ $($description:ident)* => $command:stmt ] )* }) => {
- forc_util::cli_examples! {
- {
- $crate::paste::paste! {
- use clap::Parser;
- $st::try_parse_from
- }
- } {
- $( [ $($description)* => $command ] )*
- }
- }
- };
- ( $code:block { $( [ $($description:ident)* => $command:stmt ] )* }) => {
- $crate::paste::paste! {
- #[cfg(test)]
- mod cli_parsing {
+ ($( [ $($description:ident)* => $command:tt $args:expr $( => $output:expr )? ] )*) => {
+ #[cfg(test)]
+ mod cli_examples {
+ use $crate::serial_test;
$(
- #[test]
- fn [<$($description:lower _)*:snake example>] () {
+ $crate::paste::paste! {
+ #[test]
+ #[serial_test::serial]
+ #[allow(unreachable_code)]
+ fn [<$($description:lower _)*:snake example>] () {
+ let mut proc = std::process::Command::new("cargo");
+ proc.arg("run");
+ proc.arg("--bin");
+ proc.arg(if stringify!($command) == "forc" {
+ "forc".to_owned()
+ } else {
+ format!("forc-{}", stringify!($command))
+ });
+ proc.arg("--");
+
+ super::parse_args($args).into_iter().for_each(|arg| {
+ proc.arg(arg);
+ });
+
+ let path = std::path::Path::new("tests");
+ if path.is_dir() {
+ // a tests folder exists, move the cwd of the process to
+ // be executed there. In that folder all files needed to
+ // run the cmd should be stored
+ proc.current_dir(path);
+ }
+ let output = proc.output().expect(stringify!($command));
- let cli_parser = $code;
- let mut args = parse_args($command);
- if cli_parser(args.clone()).is_err() {
- // Failed to parse, it maybe a plugin. To execute a plugin the first argument needs to be removed, `forc`.
- args.remove(0);
- cli_parser(args).expect("valid subcommand");
+ $(
+ let expected_output = $crate::Regex::new($output).expect("valid regex");
+ let stdout = String::from_utf8_lossy(&output.stdout);
+ let stderr = String::from_utf8_lossy(&output.stderr);
+
+ assert!(
+ expected_output.is_match(&stdout) ||
+ expected_output.is_match(&stderr),
+ "expected_output: {}\nStdOut:\n{}\nStdErr:\n{}\n",
+ expected_output,
+ stdout,
+ stderr,
+ );
+ return;
+ )?
+ // We don't know what to get or how to parse the output, all
+ // we care is to get a valid exit code
+ assert!(output.status.success(), "{}: {:?}", stringify!($($description)*), output);
}
}
-
)*
+ }
- #[cfg(test)]
- fn parse_args(input: &str) -> Vec<String> {
- let mut chars = input.chars().peekable().into_iter();
- let mut args = vec![];
+ #[cfg(test)]
+ fn parse_args(input: &str) -> Vec<String> {
+ let mut chars = input.chars().peekable().into_iter();
+ let mut args = vec![];
- loop {
- let character = if let Some(c) = chars.next() { c } else { break };
+ loop {
+ let character = if let Some(c) = chars.next() { c } else { break };
- match character {
- ' ' | '\\' | '\t' | '\n' => loop {
+ match character {
+ ' ' | '\\' | '\t' | '\n' => loop {
+ match chars.peek() {
+ Some(' ') | Some('\t') | Some('\n') => chars.next(),
+ _ => break,
+ };
+ },
+ '=' => {
+ args.push("=".to_string());
+ }
+ '"' | '\'' => {
+ let end_character = character;
+ let mut current_word = String::new();
+ loop {
match chars.peek() {
- Some(' ') | Some('\t') | Some('\n') => chars.next(),
- _ => break,
- };
- },
- '=' => {
- args.push("=".to_string());
- }
- '"' | '\'' => {
- let end_character = character;
- let mut current_word = String::new();
- loop {
- match chars.peek() {
- Some(character) => {
- if *character == end_character {
- let _ = chars.next();
- args.push(current_word);
- break;
- } else if *character == '\\' {
- let _ = chars.next();
- if let Some(character) = chars.next() {
- current_word.push(character);
- }
- } else {
- current_word.push(*character);
- chars.next();
- }
- }
- None => {
- break;
- }
- }
- }
- }
- character => {
- let mut current_word = character.to_string();
- loop {
- match chars.peek() {
- Some(' ') | Some('\t') | Some('\n') | Some('=') | Some('\'')
- | Some('"') | None => {
+ Some(character) => {
+ if *character == end_character {
+ let _ = chars.next();
args.push(current_word);
break;
- }
- Some(character) => {
+ } else if *character == '\\' {
+ let _ = chars.next();
+ if let Some(character) = chars.next() {
+ current_word.push(character);
+ }
+ } else {
current_word.push(*character);
chars.next();
}
}
+ None => {
+ break;
+ }
+ }
+ }
+ }
+ character => {
+ let mut current_word = character.to_string();
+ loop {
+ match chars.peek() {
+ Some(' ') | Some('\t') | Some('\n') | Some('=') | Some('\'')
+ | Some('"') | None => {
+ args.push(current_word);
+ break;
+ }
+ Some(character) => {
+ current_word.push(*character);
+ chars.next();
+ }
}
}
}
}
-
- args
}
- }
+ args
}
-
fn help() -> &'static str {
- Box::leak(format!("{}\n{}", forc_util::ansi_term::Colour::Yellow.paint("EXAMPLES:"), examples()).into_boxed_str())
+ Box::leak(format!("EXAMPLES:\n{}", examples()).into_boxed_str())
}
pub fn examples() -> &'static str {
Box::leak( [
$(
$crate::paste::paste! {
- format!(" # {}\n {}\n\n", stringify!($($description)*), $command)
+ if stringify!($command) == "forc" {
+ format!(" #{}\n forc {}\n\n", stringify!($($description)*), $args )
+ } else {
+ format!(" #{}\n forc {} {}\n\n", stringify!($($description)*), stringify!($command), $args )
+ }
},
)*
].concat().into_boxed_str())
--- forc-util/src/lib.rs
@@ -31,7 +31,6 @@ pub mod restricted;
#[macro_use]
pub mod cli;
-pub use ansi_term;
pub use paste;
pub use regex::Regex;
pub use serial_test;
--- forc/src/cli/commands/build.rs
@@ -3,11 +3,9 @@ use clap::Parser;
use forc_util::ForcResult;
forc_util::cli_examples! {
- crate::cli::Opt {
- [ Compile the current projectx => "forc build" ]
- [ Compile the current project from a different path => "forc build --path <PATH>" ]
- [ Compile the current project without updating dependencies => "forc build --path <PATH> --locked" ]
- }
+ [ Compile the current project => forc "build" => r#".*could not find `Forc.toml`.*"# ]
+ [ Compile the current project with a different path => forc "build --path ../tests/" => r#".*could not find `Forc.toml`.*"# ]
+ [ Compile the current project without updating dependencies => forc "build --locked" => r#".*could not find `Forc.toml`.*"# ]
}
/// Compile the current or target project.
--- forc/src/cli/commands/check.rs
@@ -5,11 +5,9 @@ use forc_util::{forc_result_bail, ForcResult};
use sway_core::{BuildTarget, Engines};
forc_util::cli_examples! {
- crate::cli::Opt {
- [ Check the current project => "forc check" ]
- [ Check the current project with a different path => "forc check --path <PATH>" ]
- [ Check the current project without updating dependencies => "forc check --locked" ]
- }
+ [ Check the current project => forc "check" => r#".*could not find `Forc.toml`.*"# ]
+ [ Check the current project with a different path => forc "check --path ../tests/" => r#".*could not find `Forc.toml`.*"# ]
+ [ Check the current project without updating dependencies => forc "check --locked" => r#".*could not find `Forc.toml`.*"# ]
}
/// Check the current or target project and all of its dependencies for errors.
--- forc/src/cli/commands/clean.rs
@@ -2,16 +2,8 @@ use crate::ops::forc_clean;
use clap::Parser;
use forc_util::ForcResult;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Clean project => "forc clean"]
- [Clean project with a custom path => "forc clean --path <PATH>"]
- }
-}
-
/// Removes the default forc compiler output artifact directory, i.e. `<project-name>/out`.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc clean", version, after_help = help())]
pub struct Command {
/// Path to the project, if not specified, current working directory will be used.
#[clap(short, long)]
--- forc/src/cli/commands/contract_id.rs
@@ -5,16 +5,8 @@ use crate::{
use clap::Parser;
use forc_util::{tx_utils::Salt, ForcResult};
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Get contract id => "forc contract-id"]
- [Get contract id from a different path => "forc contract-id --path <PATH>"]
- }
-}
-
/// Determine contract-id for a contract. For workspaces outputs all contract ids in the workspace.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc contract-id", version, after_help = help())]
pub struct Command {
#[clap(flatten)]
pub pkg: Pkg,
--- forc/src/cli/commands/init.rs
@@ -2,18 +2,8 @@ use crate::ops::forc_init;
use clap::Parser;
use forc_util::ForcResult;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Initialize a new Forc project => "forc init --path <PATH>"]
- [Initialize a new Forc project as workspace => "forc init --path <PATH> --workspace"]
- [Initialize a new Forc project with a predicate => "forc init --path <PATH> --predicate"]
- [Initialize a new Forc library project => "forc init --path <PATH> --library"]
- }
-}
-
/// Create a new Forc project in an existing directory.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc init", version, after_help = help())]
pub struct Command {
/// The directory in which the forc project will be initialized.
#[clap(long)]
--- forc/src/cli/commands/new.rs
@@ -4,18 +4,8 @@ use clap::Parser;
use forc_util::{forc_result_bail, validate_name, ForcResult};
use std::path::{Path, PathBuf};
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Create a new project => "forc new --contract --name my_project <PATH>"]
- [Create a new workspace => "forc new --workspace --name my_workspace <PATH>"]
- [Create a new Forc project with a predicate => "forc new <PATH> --predicate"]
- [Create a new Forc library project => "forc new <PATH> --library"]
- }
-}
-
/// Create a new Forc project at `<path>`.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc new", version, after_help = help())]
pub struct Command {
/// The default program type. Excluding all flags or adding this flag creates a basic contract
/// program.
--- forc/src/cli/commands/parse_bytecode.rs
@@ -7,15 +7,8 @@ use term_table::row::Row;
use term_table::table_cell::{Alignment, TableCell};
use tracing::info;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Parse bytecode => "forc parse-bytecode <PATH>"]
- }
-}
-
/// Parse bytecode file into a debug format.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc parse-bytecode", version, after_help = help())]
pub(crate) struct Command {
file_path: String,
}
--- forc/src/cli/commands/plugins.rs
@@ -10,19 +10,17 @@ use std::{
use tracing::info;
forc_util::cli_examples! {
- crate::cli::Opt {
- [ List all plugins => "forc plugins" ]
- [ List all plugins with their paths => "forc plugins --paths" ]
- [ List all plugins with their descriptions => "forc plugins --describe" ]
- [ List all plugins with their paths and descriptions => "forc plugins --paths --describe" ]
- }
+ [ List all plugins => forc "plugins" => r#".*Installed Plugins.*"# ]
+ [ List all plugins with their paths => forc "plugins --paths" => r#".*Installed Plugins.*"# ]
+ [ List all plugins with their descriptions => forc "plugins --describe" => r#".*Installed Plugins.*"# ]
+ [ List all plugins with their paths and descriptions => forc "plugins --paths --describe" => r#".*Installed Plugins.*"# ]
}
/// Find all forc plugins available via `PATH`.
///
/// Prints information about each discovered plugin.
#[derive(Debug, Parser)]
-#[clap(name = "forc plugins", about = "List all forc plugins", version, after_help = help())]
+#[clap(name = "plugins", about = "List all forc plugins", version, after_help = help())]
pub struct Command {
/// Prints the absolute path to each discovered plugin.
#[clap(long = "paths", short = 'p')]
--- forc/src/cli/commands/predicate_root.rs
@@ -4,16 +4,9 @@ use forc_util::ForcResult;
pub use crate::cli::shared::{BuildOutput, BuildProfile, Minify, Pkg, Print};
use crate::ops::forc_predicate_root;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Get predicate root => "forc predicate-root"]
- }
-}
-
/// Determine predicate-root for a predicate. For workspaces outputs all predicate roots in the
/// workspace.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc predicate-root", version, after_help = help())]
pub struct Command {
#[clap(flatten)]
pub pkg: Pkg,
--- forc/src/cli/commands/template.rs
@@ -2,15 +2,8 @@ use crate::ops::forc_template;
use clap::Parser;
use forc_util::ForcResult;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Create a new Forc project from an option template => "forc template new-path --template-name option"]
- }
-}
-
/// Create a new Forc project from a git template.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc template", version, after_help = help())]
pub struct Command {
/// The template url, should be a git repo.
#[clap(long, short, default_value = "https://github.com/fuellabs/sway")]
--- forc/src/cli/commands/test.rs
@@ -8,12 +8,10 @@ use pkg::manifest::ExperimentalFlags;
use tracing::info;
forc_util::cli_examples! {
- crate::cli::Opt {
- [ Run test => "forc test" ]
- [ Run test with a filter => "forc test $filter" ]
- [ Run test without any output => "forc test --silent" ]
- [ Run test without creating or update the lock file => "forc test --locked" ]
- }
+ [ Run test => forc "test" => ".*could not find `Forc.toml`.*" ]
+ [ Run test with a filter => forc "test $filter" => ".*could not find `Forc.toml`.*" ]
+ [ Run test without any output => forc "test --silent" => "^$" ]
+ [ Run test without creating or update the lock file => forc "test --locked" => ".*could not find `Forc.toml`.*" ]
}
/// Run the Sway unit tests for the current project.
@@ -34,7 +32,6 @@ forc_util::cli_examples! {
/// considered a failure in the case that a revert (`rvrt`) instruction is encountered during
/// execution. Otherwise, it is considered a success.
#[derive(Debug, Parser)]
-#[clap(bin_name = "forc test", version, after_help = help())]
pub struct Command {
#[clap(flatten)]
pub build: cli::shared::Build,
--- forc/src/cli/commands/update.rs
@@ -3,17 +3,8 @@ use clap::Parser;
use forc_pkg::source::IPFSNode;
use forc_util::ForcResult;
-forc_util::cli_examples! {
- crate::cli::Opt {
- [Update dependencies => "forc update"]
- [Update a specific dependency => "forc update -d std"]
- [Check if dependencies have newer versions => "forc update --check"]
- }
-}
-
/// Update dependencies in the Forc dependencies directory.
-#[derive(Debug, Default, Parser)]
-#[clap(bin_name = "forc update", version, after_help = help())]
+#[derive(Debug, Parser)]
pub struct Command {
/// Path to the project, if not specified, current working directory will be used.
#[clap(short, long)]
|
sway
|
fuellabs
|
Rust
|
Rust
| 62,435
| 5,382
|
🌴 Empowering everyone to build reliable and efficient smart contracts.
|
fuellabs_sway
|
CODE_IMPROVEMENT
|
Non-functional code changes to improve readability, migration etc.
|
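The cli_examples! refactor in the diff above tokenizes each documented example string into argv-style arguments (respecting quotes and backslash escapes) and then spawns the corresponding forc binary in a generated test. A rough Python sketch of the same tokenize-then-run idea, assuming a hypothetical example string and binary name rather than the real macro:

import shlex
import subprocess

def run_documented_example(binary: str, example: str) -> int:
    """Split a documented CLI example the way a shell would, then run it."""
    argv = [binary] + shlex.split(example)   # shlex handles quotes and backslash escapes
    return subprocess.run(argv, capture_output=True).returncode

# Tokenization only (no process spawned): quotes keep each path as a single argument.
print(shlex.split('script --bytecode "out/debug/tests.bin" --data "data.bin"'))
# -> ['script', '--bytecode', 'out/debug/tests.bin', '--data', 'data.bin']
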
1f49265f29459295bc1b66c53da3de9e99a9370e
|
2022-01-08 20:37:04
|
longpanda
|
1.0.64 release
| false
| 0
| 0
| 0
|
(error extracting diff)
|
ventoy
|
ventoy
|
C
|
C
| 65,265
| 4,197
|
A new bootable USB solution.
|
ventoy_ventoy
|
CONFIG_CHANGE
|
version number updated for the release
|
1a6806c5d063bd161713d061d0c647afac6726e3
|
2025-03-19 14:52:34
|
Nikolay Lunyak
|
[Tests] Update `progressiveMode*` CLI tests. I believe this way they make more sense. Though, ideally we should find a way of writing such tests without mentioning specific language features, so that we don't need to come back and update them again. ^KT-75487
| false
| 11
| 21
| 32
|
--- compiler/testData/cli/jvm/progressive/progressiveTest.kt
@@ -1,3 +0,0 @@
-internal open class Foo {}
-
-fun <T : Foo> foo(x: T?) = x
--- compiler/testData/cli/jvm/progressive/tailrecOnVirtualMember.kt
@@ -0,0 +1,3 @@
+open class A {
+ tailrec open fun foo(x: Int) {}
+}
\ No newline at end of file
--- compiler/testData/cli/jvm/progressive/typeParametersInAnonymousObjects.kt
@@ -0,0 +1,3 @@
+fun test() {
+ val x = object<T> { }
+}
--- compiler/testData/cli/jvm/progressiveModeOff.args
@@ -2,4 +2,4 @@ $TESTDATA_DIR$/progressive
-d
$TEMP_DIR$
-language-version
-2.1
+1.7
--- compiler/testData/cli/jvm/progressiveModeOff.out
@@ -1,4 +1,8 @@
-compiler/testData/cli/jvm/progressive/progressiveTest.kt:3:10: warning: 'public' generic exposes its 'internal' parameter bound type 'Foo'. This will be prohibited in the future.
-fun <T : Foo> foo(x: T?) = x
- ^^^
-OK
+warning: language version 1.7 is deprecated and its support will be removed in a future version of Kotlin
+compiler/testData/cli/jvm/progressive/tailrecOnVirtualMember.kt:2:5: error: tailrec is not allowed on open members
+ tailrec open fun foo(x: Int) {}
+ ^^^^^^^
+compiler/testData/cli/jvm/progressive/typeParametersInAnonymousObjects.kt:2:19: error: type parameters are not allowed for objects
+ val x = object<T> { }
+ ^^^
+COMPILATION_ERROR
--- compiler/testData/cli/jvm/progressiveModeOn.out
@@ -1,4 +1,7 @@
-compiler/testData/cli/jvm/progressive/progressiveTest.kt:3:10: error: 'public' generic exposes its 'internal' parameter bound type 'Foo'.
-fun <T : Foo> foo(x: T?) = x
- ^^^
+compiler/testData/cli/jvm/progressive/tailrecOnVirtualMember.kt:2:5: error: tailrec is prohibited on open members.
+ tailrec open fun foo(x: Int) {}
+ ^^^^^^^
+compiler/testData/cli/jvm/progressive/typeParametersInAnonymousObjects.kt:2:19: error: type parameters are prohibited for objects.
+ val x = object<T> { }
+ ^^^
COMPILATION_ERROR
|
kotlin
|
jetbrains
|
Kotlin
|
Kotlin
| 50,115
| 5,861
|
The Kotlin Programming Language.
|
jetbrains_kotlin
|
CODE_IMPROVEMENT
|
only the way the code is written changed
|
bee2a8487e14627afe8b034994c8c06fda4a5bb8
|
2024-03-15 13:43:50
|
Christopher Helmerich
|
Update README.md
| false
| 20
| 2
| 22
|
--- README.md
@@ -1,20 +1,2 @@
-# WarpFactory - A numerical toolkit for analyzing warp drive spacetimes.
-
-Warp Factory is a set of functions written in MATLAB which is designed to model warp spacetimes using Einstein's theory of General Relativity. Its development is focused on providing a numerical framework to analyze the physicality of spacetime, which is a unique focus in the development of warp solutions.
-
-## Key Features of Warp Factory
-Energy condition evaluations for the point-wise Null, Weak, Dominant, and Strong
-Metric scalar evaluation for the shear, expansion, and vorticity.
-3D finite difference solver for the stress-energy tensor
-GPU utilization
-
-
-## License
-Warp Factory is published using the MIT license. We do ask that all users of Warp Factory please cite their use of the code in published work.
-
-## Development Team
-Christopher Helmerich
-Jared Fuchs
-
-In addition, we would like to thank the following people for their contributions of providing methods and reviewing the codebase:
-Alexey Bobrick, Luke Sellers, Brandon Melcher, Justin Feng, Gianni Martire
+# WarpFactory
+A numerical toolkit for analyzing warp drive spacetimes
|
warpfactory
|
nerdswithattitudes
|
MATLAB
|
MATLAB
| 298
| 41
|
WarpFactory is a numerical toolkit for analyzing warp drive spacetimes.
|
nerdswithattitudes_warpfactory
|
DOC_CHANGE
|
changes in readme
|
70af867cde25cdffc543a6f59ba6b465f0ec3eee
|
2025-03-01 20:53:20
|
Garrick Aden-Buie
|
fix: Ollama tool calls with empty properties (#348) Need to be dropped or ollama throws
| false
| 7
| 3
| 10
|
--- NEWS.md
@@ -20,7 +20,7 @@
* `chat_gemini()` now detects viewer-based credentials when running on Posit
Connect (#320, @atheriel).
-* `chat_ollama()` now works with `tool()` definitions with optional arguments or empty properties (#342, #348, @gadenbuie).
+* `chat_ollama()` now works with `tool()` definitions with optional arguments (#342, @gadenbuie).
# ellmer 0.1.1
--- R/provider-ollama.R
@@ -117,11 +117,11 @@ method(as_json, list(ProviderOllama, TypeObject)) <- function(provider, x) {
# Unlike OpenAI, Ollama uses the `required` field to list required tool args
required <- map_lgl(x@properties, function(prop) prop@required)
- compact(list(
+ list(
type = "object",
description = x@description %||% "",
properties = as_json(provider, x@properties),
required = as.list(names2(x@properties)[required]),
additionalProperties = FALSE
- ))
+ )
}
--- tests/testthat/test-provider-ollama.R
@@ -28,10 +28,6 @@ test_that("can chat with tool request", {
)
)
- # Tool with no properties
- current_time <- function() Sys.time()
- chat$register_tool(tool(current_time, "Current system time"))
-
# Ollama tool calling is very inconsistent, esp. with small models, so we
# just test that the model still works when a tool call is registered.
expect_no_error(
|
ellmer
|
tidyverse
|
R
|
R
| 401
| 55
|
Call LLM APIs from R
|
tidyverse_ellmer
|
BUG_FIX
|
Matched \bfix(e[ds]|ing)?\b in message
|
6860d09c722b3afc3b5b5de124e3fc80f4e5e42c
| null |
Rich-Harris
|
only delete applicable transition animations - fixes #1290
| false
| 1
| 1
| 0
|
--- transitions.js
@@ -199,7 +199,7 @@ export var transitionManager = {
node.style.animation = node.style.animation
.split(', ')
.filter(function(anim) {
- return !/__svelte/.test(anim);
+ return anim.indexOf(name) === -1;
})
.join(', ');
}
|
sveltejs_svelte.json
| null | null | null | null | null | null |
sveltejs_svelte.json
|
BUG_FIX
|
5, fixed a bug
|
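The one-line Svelte change above narrows which inline animations are deleted when a transition finishes: rather than stripping every animation whose name matches __svelte, it keeps any animation that does not mention the finished transition's generated name. A minimal Python sketch of that filtering idea, with made-up animation names:

def remove_finished_animation(style_animation: str, name: str) -> str:
    """Keep every animation except the ones belonging to the finished transition."""
    kept = [anim for anim in style_animation.split(", ") if name not in anim]
    return ", ".join(kept)

style = "__svelte_123_fade 0.3s linear, __svelte_456_slide 0.5s ease"
print(remove_finished_animation(style, "__svelte_123_fade"))
# -> "__svelte_456_slide 0.5s ease" (the concurrent animation survives)
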
69aeb87eca1795f760ca0390c9cd4d999adc2e45
|
2025-03-15 01:12:42
|
Um Changyong
|
update error message in get_backend() more detail_ (#141796) Fixes #ISSUE_NUMBER When attempting to reconfigure the environment without properly handling the PyTorch-related settings, you may encounter the following message. ``` │ /root/.cache/pypoetry/virtualenvs/app-rag-sample-9TtSrW0h-py3.10/lib/python3.10/site-packages/torch/distributed/distribut │ │ ed_c10d.py:1215 in get_backend │ │ │ │ 1212 │ if _rank_not_in_group(pg): │ │ 1213 │ │ raise ValueError("Invalid process group specified") │ │ 1214 │ pg_store = _world.pg_map[pg] if pg in _world.pg_map else None │ │ ❱ 1215 │ return Backend(not_none(pg_store)[0]) │ │ 1216 │ │ 1217 │ │ 1218 def _get_process_group_uid(pg: ProcessGroup) -> int: │ │ │ │ /root/.cache/pypoetry/virtualenvs/app-rag-sample-9TtSrW0h-py3.10/lib/python3.10/site-packages/torch/utils/_typing_utils.p │ │ y:13 in not_none │ │ │ │ 10 │ │ 11 def not_none(obj: Optional[T]) -> T: │ │ 12 │ if obj is None: │ │ ❱ 13 │ │ raise TypeError("Invariant encountered: value was None when it should not be") │ │ 14 │ return obj │ │ 15 │ ╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ TypeError: Invariant encountered: value was None when it should not be Exception ignored in: <function Vllm.__del__ at 0x7f35f96b6dd0> ``` Since this message can cause confusion for multiple developers, the purpose of this PR is to suggest additional details to help clarify the situation. Pull Request resolved: https://github.com/pytorch/pytorch/pull/141796 Approved by: https://github.com/kwen2501
| false
| 7
| 1
| 8
|
--- torch/distributed/distributed_c10d.py
@@ -1359,13 +1359,7 @@ def get_backend(group: Optional[ProcessGroup] = None) -> Backend:
pg = group or _get_default_group()
if _rank_not_in_group(pg):
raise ValueError("Invalid process group specified")
-
- pg_store = _world.pg_map.get(pg, None)
- if pg_store is None:
- raise ValueError(
- f"Process group {pg} is not initialized in the world group map. Please initialize the group first."
- )
-
+ pg_store = _world.pg_map[pg] if pg in _world.pg_map else None
return Backend(not_none(pg_store)[0])
|
pytorch
| null |
python
|
Python
| null | null |
Tensors and Dynamic neural networks in Python with strong GPU acceleration
|
_pytorch
|
CODE_IMPROVEMENT
|
error message updated
|
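The get_backend() commit above is about failing fast with a descriptive message when a process group is missing from an internal map, instead of letting a later not-None check raise a generic TypeError. A generic sketch of that lookup-and-raise pattern; the names below are illustrative and not the torch.distributed API:

from typing import Dict, Tuple

def backend_for_group(pg_map: Dict[str, Tuple[str, object]], group: str) -> str:
    """Return the backend registered for a group, with a pointed error when absent."""
    entry = pg_map.get(group)
    if entry is None:
        raise ValueError(
            f"Process group {group!r} is not initialized in the group map; "
            "initialize it before querying its backend."
        )
    return entry[0]

backends = {"default": ("nccl", None)}
print(backend_for_group(backends, "default"))   # -> "nccl"
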
43126298efe43922f28e2ee15ad1bb9a7362a137
|
2023-08-26 17:11:09
|
Winter
|
update: lazy theta*
| false
| 250
| 1
| 251
|
--- README.md
@@ -83,7 +83,6 @@ Planner | Version | Animation
**A*** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/global_planner/graph_search/a_star.m) | 
**JPS** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/global_planner/graph_search/jps.m) | 
**Theta\*** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/global_planner/graph_search/theta_star.m) | 
-**Lazy Theta\*** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/global_planner/graph_search/lazy_theta_star.m) | 
**D*** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/global_planner/graph_search/d_star.m) | 
**LPA*** |  | 
**D\* Lite** |  |
@@ -121,7 +120,6 @@ Planner | Version | Animation
* [D*: ](http://web.mit.edu/16.412j/www/html/papers/original_dstar_icra94.pdf) Optimal and Efficient Path Planning for Partially-Known Environments
* [D* Lite: ](http://idm-lab.org/bib/abstracts/papers/aaai02b.pdf) D* Lite
* [Theta*: ](https://www.jair.org/index.php/jair/article/view/10676) Theta*: Any-Angle Path Planning on Grids
-* [Lazy Theta*: ](https://ojs.aaai.org/index.php/AAAI/article/view/7566) Lazy Theta*: Any-Angle Path Planning and Path Length Analysis in 3D
## Sample-based Planning
* [RRT: ](http://msl.cs.uiuc.edu/~lavalle/papers/Lav98c.pdf) Rapidly-Exploring Random Trees: A New Tool for Path Planning
--- examples/simulation_global.mlx
Binary files a/examples/simulation_global.mlx and b/examples/simulation_global.mlx differ
--- gif/lazy_theta_star_matlab.png
Binary files a/gif/lazy_theta_star_matlab.png and /dev/null differ
--- global_planner/graph_search/lazy_theta_star.m
@@ -1,246 +0,0 @@
-function [path, goal_reached, cost, EXPAND] = lazy_theta_star(map, start, goal)
-% @file: lazy_theta_star.m
-% @breif: Lazy Theta* motion planning
-% @paper: Lazy Theta*: Any-Angle Path Planning and Path Length Analysis in 3D
-% @author: Winter
-% @update: 2023.8.26
-
-%
-% == OPEN and CLOSED ==
-% [x, y, g, h, px, py]
-% =====================
-%
-
-% initialize
-OPEN = [];
-CLOSED = [];
-EXPAND = [];
-
-cost = 0;
-goal_reached = false;
-motion = [-1, -1, sqrt(2); ...
- 0, -1, 1; ...
- 1, -1, sqrt(2); ...
- -1, 0, 1; ...
- 1, 0, 1; ...
- -1, 1, sqrt(2); ...
- 0, 1, 1; ...
- 1, 1, sqrt(2)];
-
-motion_num = size(motion, 1);
-
-node_s = [start, 0, h(start, goal), start];
-OPEN = [OPEN; node_s];
-
-while ~isempty(OPEN)
- % pop
- f = OPEN(:, 3) + OPEN(:, 4);
- [~, index] = min(f);
- cur_node = OPEN(index, :);
- OPEN(index, :) = [];
-
- % set vertex: path 1
- p_index = loc_list(cur_node(5: 6), CLOSED, [1, 2]);
- if p_index
- node_p = CLOSED(p_index, :);
- if line_of_sight(map, node_p, cur_node)
- cur_node(3) = inf;
- for i = 1:motion_num
- node_n_x = cur_node(1) + motion(i, 1);
- node_n_y = cur_node(2) + motion(i, 2);
- np_index = loc_list([node_n_x, node_n_y], CLOSED, [1, 2]);
- if np_index
- node_n_p = CLOSED(np_index, :);
- if cur_node(3) > node_n_p(3) + dist(node_n_p(1: 2), cur_node(1: 2)')
- cur_node(3) = node_n_p(3) + dist(node_n_p(1: 2), cur_node(1: 2)');
- cur_node(5) = node_n_x;
- cur_node(6) = node_n_y;
- end
- end
- end
-
- end
- end
-
- % exists in CLOSED set
- if loc_list(cur_node, CLOSED, [1, 2])
- continue
- end
-
- % update expand zone
- if ~loc_list(cur_node, EXPAND, [1, 2])
- EXPAND = [EXPAND; cur_node(1:2)];
- end
-
- % goal found
- if cur_node(1) == goal(1) && cur_node(2) == goal(2)
- CLOSED = [cur_node; CLOSED];
- goal_reached = true;
- cost = cur_node(3);
- break
- end
- if (cur_node(1) ==17) &&(cur_node(2) == 26)
- cur_node(1);
- end
- % explore neighbors
- for i = 1:motion_num
- % path 1
- node_n = [
- cur_node(1) + motion(i, 1), ...
- cur_node(2) + motion(i, 2), ...
- cur_node(3) + motion(i, 3), ...
- 0, ...
- cur_node(1), cur_node(2)];
- node_n(4) = h(node_n(1:2), goal);
-
- % exists in CLOSED set
- if loc_list(node_n, CLOSED, [1, 2])
- continue
- end
-
- % obstacle
- if map(node_n(1), node_n(2)) == 2
- continue
- end
-
- p_index = loc_list(cur_node(5: 6), CLOSED, [1, 2]);
- if p_index
- node_p = CLOSED(p_index, :);
- else
- node_p = 0;
- end
-
- if node_p ~= 0
- node_n = update_vertex(node_p, node_n);
- end
-
- % update OPEN set
- OPEN = [OPEN; node_n];
- end
- CLOSED = [cur_node; CLOSED];
-end
-
-% extract path
-path = extract_path(CLOSED, start);
-end
-
-%%
-function h_val = h(node, goal)
-% @breif: heuristic function (Euclidean distance)
-h_val = dist(node(1: 2), goal');
-end
-
-function index = loc_list(node, list, range)
-% @breif: locate the node in given list
-num = size(list);
-index = 0;
-
-if ~num(1)
- return
-else
- for i = 1:num(1)
- if isequal(node(range), list(i, range))
- index = i;
- return
- end
- end
-end
-end
-
-function node_c = update_vertex(node_p, node_c)
-% @breif: Update extend node information with current node's parent node.
- % path 2
- if node_p(3) + dist(node_c(1: 2), node_p(1: 2)') <= node_c(3)
- node_c(3) = node_p(3) + dist(node_c(1: 2), node_p(1: 2)');
- node_c(5: 6) = node_p(1: 2);
- end
-end
-
-function flag = line_of_sight(map, node1, node2)
-% @breif: Judge collision when moving from node1 to node2 using Bresenham.
- if (map(node1(1), node1(2)) == 2) || (map(node2(1), node2(2)) == 2)
- flag = true;
- return
- end
- x1 = node1(1); y1 = node1(2);
- x2 = node2(1); y2 = node2(2);
-
- d_x = abs(x2 - x1);
- d_y = abs(y2 - y1);
- if (x2 - x1) == 0
- s_x = 0;
- else
- s_x = (x2 - x1) / d_x;
- end
- if (y2 - y1) == 0
- s_y = 0;
- else
- s_y = (y2 - y1) / d_y;
- end
- x = x1; y = y1; e = 0;
-
- % check if any obstacle exists between node1 and node2
- if d_x > d_y
- tao = (d_y - d_x) / 2;
- while x ~= x2
- if e > tao
- x = x + s_x;
- e = e - d_y;
- elseif e < tao
- y = y + s_y;
- e = e + d_x;
- else
- x = x + s_x;
- y = y + s_y;
- e = e + d_x - d_y;
- end
- if map(x, y) == 2
- flag = true;
- return;
- end
- end
- % swap x and y
- else
- tao = (d_x - d_y) / 2;
- while y ~= y2
- if e > tao
- y = y + s_y;
- e = e - d_x;
- elseif e < tao
- x = x + s_x;
- e = e + d_y;
- else
- x = x + s_x;
- y = y + s_y;
- e = e + d_y - d_x;
- end
- if map(x, y) == 2
- flag = true;
- return;
- end
- end
- end
- flag = false;
-end
-
-function path = extract_path(close, start)
-% @breif: Extract the path based on the CLOSED set.
-path = [];
-closeNum = size(close, 1);
-index = 1;
-
-while 1
- path = [path; close(index, 1:2)];
-
- if isequal(close(index, 1:2), start)
- break
- end
-
- for i = 1:closeNum
- if isequal(close(i, 1:2), close(index, 5:6))
- index = i;
- break
- end
- end
-end
-end
--- utils/plot/plot_expand.m
@@ -11,8 +11,7 @@ function plot_expand(expand, map_size, G, planner_name)
strcmp(planner_name, 'dijkstra') || ...
strcmp(planner_name, 'jps') || ...
strcmp(planner_name, 'd_star') || ...
- strcmp(planner_name, 'theta_star') || ...
- strcmp(planner_name, 'lazy_theta_star')
+ strcmp(planner_name, 'theta_star')
plot_square(expand, map_size, G, "#ddd");
end
|
matlab_motion_planning
|
ai-winter
|
MATLAB
|
MATLAB
| 419
| 66
|
Motion planning and Navigation of AGV/AMR:matlab implementation of Dijkstra, A*, Theta*, JPS, D*, LPA*, D* Lite, RRT, RRT*, RRT-Connect, Informed RRT*, ACO, Voronoi, PID, LQR, MPC, APF, RPP, DWA, DDPG, Bezier, B-spline, Dubins, Reeds-Shepp etc.
|
ai-winter_matlab_motion_planning
|
NEW_FEAT
|
Code change: new MATLAB function
|
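The lazy_theta_star.m listing in the diff above leans on a Bresenham-style line-of-sight test between grid cells, which is what lets Theta*-family planners connect a node directly to its parent's parent and produce any-angle paths. A compact Python sketch of that check on an occupancy grid, using the same convention that a cell value of 2 marks an obstacle; this is an illustration, not a port of the MATLAB file:

def line_of_sight(grid, p0, p1, obstacle=2):
    """Return True if no obstacle cell lies on the discrete line from p0 to p1."""
    x0, y0 = p0
    x1, y1 = p1
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        if grid[x][y] == obstacle:
            return False
        if (x, y) == (x1, y1):
            return True
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy

# Hypothetical 4x4 grid: 0 = free, 2 = obstacle.
grid = [
    [0, 0, 0, 0],
    [0, 2, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
print(line_of_sight(grid, (0, 0), (3, 3)))  # False: the diagonal passes through (1, 1)
print(line_of_sight(grid, (0, 0), (0, 3)))  # True: the top row is free
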
b402db50a6fae54eb709fc440766ced593a69359
|
2023-02-06 18:26:38
|
Richard McElreath
|
lecture 11 slides
| false
| 1
| 1
| 2
|
--- README.md
@@ -36,7 +36,7 @@ Note about slides: In some browsers, the slides don't show correctly. If points
| Week 03 | 20 January | Chapters 5 and 6 | [5] <[Elemental Confounds](https://www.youtube.com/watch?v=mBEA7PKDmiY&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=5)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-05)> <br> [6] <[Good and Bad Controls](https://www.youtube.com/watch?v=uanZZLlzKHw&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=6)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-06)>
| Week 04 | 27 January | Chapters 7,8,9 | [7] <[Overfitting](https://www.youtube.com/watch?v=1VgYIsANQck&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=7)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-07)> <br> [8] <[MCMC](https://www.youtube.com/watch?v=rZk2FqX2XnY&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=8)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-08)>
| Week 05 | 03 February | Chapters 10 and 11 | [9] <[Modeling Events](https://www.youtube.com/watch?v=Zi6N3GLUJmw&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=9)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-09)> <br> [10] <[Counts and Confounds](https://www.youtube.com/watch?v=jokxu18egu0&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=10)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-10)>
-| Week 06 | 10 February | Chapters 11 and 12 | [11] <[Ordered Categories]> <[Slides](https://github.com/rmcelreath/stat_rethinking_2023/blob/main/slides/Lecture_11-ord_logit.pdf)> <br> [12] Multilevel Models
+| Week 06 | 10 February | Chapters 11 and 12 | [11] Ordered Categories <br> [12] Multilevel Models
| Week 07 | 17 February | Chapter 13 | [13] Multi-Multilevel Models <br> [14] More Multilevel Models
| Week 08 | 24 February | Chapter 14 | [15] Social networks <br> [16] Gaussian Processes
| Week 09 | 03 March | Chapter 15 | [17] Measurement Error <br> [18] Missing Data
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
DOC_CHANGE
|
changes in readme
|
e1d980a3d1f5045b212f268b72c337e3e150132e
|
2025-03-12 21:34:22
|
TengYao Chi
|
MINOR: Remove unused ConfigCommandOptions#forceOpt (#19170) This field is unused, and we should remove it. Reviewers: Chia-Ping Tsai <[email protected]>
| false
| 6
| 1
| 7
|
--- core/src/main/scala/kafka/admin/ConfigCommand.scala
@@ -536,6 +536,7 @@ object ConfigCommand extends Logging {
.withRequiredArg
.ofType(classOf[String])
.withValuesSeparatedBy(',')
+ val forceOpt: OptionSpecBuilder = parser.accepts("force", "Suppress console prompts")
val topic: OptionSpec[String] = parser.accepts("topic", "The topic's name.")
.withRequiredArg
.ofType(classOf[String])
--- docs/upgrade.html
@@ -30,12 +30,6 @@
</li>
</ul>
</li>
- <li><b>Command</b>
- <ul>
- <li>The <code>force</code> option of <code>ConfigCommand</code> has been removed, as it has been non-operational since version 0.10.1.0.
- </li>
- </ul>
- </li>
</ul>
<h4><a id="upgrade_4_0_0" href="#upgrade_4_0_0">Upgrading to 4.0.0</a></h4>
|
apache-kafka
| null |
Java
|
Java
| null | null |
a distributed, open-source streaming platform designed for building real-time data pipelines and streaming applications
|
_apache-kafka
|
CONFIG_CHANGE
|
Very small changes
|
106140a99fbdb7acf19723473621e0ccaa03c158
|
2024-12-30 20:16:30
|
Patrick Steinhardt
|
builtin/fast-import: fix segfault with unsafe SHA1 backend Same as with the preceding commit, git-fast-import(1) is using the safe variant to initialize a hashfile checkpoint. This leads to a segfault when passing the checkpoint into the hashfile subsystem because it would use the unsafe variants instead: ++ git --git-dir=R/.git fast-import --big-file-threshold=1 AddressSanitizer:DEADLYSIGNAL ================================================================= ==577126==ERROR: AddressSanitizer: SEGV on unknown address 0x000000000040 (pc 0x7ffff7a01a99 bp 0x5070000009c0 sp 0x7fffffff5b30 T0) ==577126==The signal is caused by a READ memory access. ==577126==Hint: address points to the zero page. #0 0x7ffff7a01a99 in EVP_MD_CTX_copy_ex (/nix/store/h1ydpxkw9qhjdxjpic1pdc2nirggyy6f-openssl-3.3.2/lib/libcrypto.so.3+0x201a99) (BuildId: 41746a580d39075fc85e8c8065b6c07fb34e97d4) #1 0x555555ddde56 in openssl_SHA1_Clone ../sha1/openssl.h:40:2 #2 0x555555dce2fc in git_hash_sha1_clone_unsafe ../object-file.c:123:2 #3 0x555555c2d5f8 in hashfile_checkpoint ../csum-file.c:211:2 #4 0x5555559647d1 in stream_blob ../builtin/fast-import.c:1110:2 #5 0x55555596247b in parse_and_store_blob ../builtin/fast-import.c:2031:3 #6 0x555555967f91 in file_change_m ../builtin/fast-import.c:2408:5 #7 0x55555595d8a2 in parse_new_commit ../builtin/fast-import.c:2768:4 #8 0x55555595bb7a in cmd_fast_import ../builtin/fast-import.c:3614:4 #9 0x555555b1f493 in run_builtin ../git.c:480:11 #10 0x555555b1bfef in handle_builtin ../git.c:740:9 #11 0x555555b1e6f4 in run_argv ../git.c:807:4 #12 0x555555b1b87a in cmd_main ../git.c:947:19 #13 0x5555561649e6 in main ../common-main.c:64:11 #14 0x7ffff742a1fb in __libc_start_call_main (/nix/store/65h17wjrrlsj2rj540igylrx7fqcd6vq-glibc-2.40-36/lib/libc.so.6+0x2a1fb) (BuildId: bf320110569c8ec2425e9a0c5e4eb7e97f1fb6e4) #15 0x7ffff742a2b8 in __libc_start_main@GLIBC_2.2.5 (/nix/store/65h17wjrrlsj2rj540igylrx7fqcd6vq-glibc-2.40-36/lib/libc.so.6+0x2a2b8) (BuildId: bf320110569c8ec2425e9a0c5e4eb7e97f1fb6e4) #16 0x555555772c84 in _start (git+0x21ec84) ==577126==Register values: rax = 0x0000511000000cc0 rbx = 0x0000000000000000 rcx = 0x000000000000000c rdx = 0x0000000000000000 rdi = 0x0000000000000000 rsi = 0x00005070000009c0 rbp = 0x00005070000009c0 rsp = 0x00007fffffff5b30 r8 = 0x0000000000000000 r9 = 0x0000000000000000 r10 = 0x0000000000000000 r11 = 0x00007ffff7a01a30 r12 = 0x0000000000000000 r13 = 0x00007fffffff6b60 r14 = 0x00007ffff7ffd000 r15 = 0x00005555563b9910 AddressSanitizer can not provide additional info. SUMMARY: AddressSanitizer: SEGV (/nix/store/h1ydpxkw9qhjdxjpic1pdc2nirggyy6f-openssl-3.3.2/lib/libcrypto.so.3+0x201a99) (BuildId: 41746a580d39075fc85e8c8065b6c07fb34e97d4) in EVP_MD_CTX_copy_ex ==577126==ABORTING ./test-lib.sh: line 1039: 577126 Aborted git --git-dir=R/.git fast-import --big-file-threshold=1 < input error: last command exited with $?=134 not ok 167 - R: blob bigger than threshold The segfault is only exposed in case the unsafe and safe backends are different from one another. Fix the issue by initializing the context with the unsafe SHA1 variant. Signed-off-by: Patrick Steinhardt <[email protected]> Signed-off-by: Junio C Hamano <[email protected]>
| false
| 1
| 1
| 2
|
--- builtin/fast-import.c
@@ -1102,7 +1102,7 @@ static void stream_blob(uintmax_t len, struct object_id *oidout, uintmax_t mark)
|| (pack_size + PACK_SIZE_THRESHOLD + len) < pack_size)
cycle_packfile();
- the_hash_algo->unsafe_init_fn(&checkpoint.ctx);
+ the_hash_algo->init_fn(&checkpoint.ctx);
hashfile_checkpoint(pack_file, &checkpoint);
offset = checkpoint.offset;
|
git
| null |
C
|
C
| null | null |
Version control
|
_git
|
BUG_FIX
|
fix described in the commit message
|
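The fast-import fix above is about checkpointing an in-progress pack hash: the hash context has to be created, cloned, and finalized by the same backend, because cloning a context that a different implementation initialized can crash, as the sanitizer trace in the commit message shows. The general checkpoint idea, sketched with Python's hashlib rather than git's csum-file code:

import hashlib

def hash_stream_with_checkpoints(chunks):
    """Hash a stream while snapshotting the running digest without disturbing it."""
    ctx = hashlib.sha1()                            # one backend creates the context...
    checkpoints = []
    for chunk in chunks:
        ctx.update(chunk)
        checkpoints.append(ctx.copy().hexdigest())  # ...and the same backend clones it
    return ctx.hexdigest(), checkpoints

final, snapshots = hash_stream_with_checkpoints([b"pack header", b"object 1", b"object 2"])
print(final)        # digest of the whole stream
print(snapshots)    # intermediate digests taken at each checkpoint
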
baa81beb9fe44572c8183f19d68c523c91d2a883
|
2025-03-14 21:02:22
|
Alexander Pivovarov
|
PR #23718: Add missing hlo_argument_modes to the error message Imported from GitHub PR https://github.com/openxla/xla/pull/23718 The `multihost_hlo_runner` supports five `--hlo_argument_mode` options, but existing error message lists only three modes. This PR adds missing modes to the error message - specifically "use_zeros_as_input" and "uninitialized" Additionally, the error message now uses raw string literals (R"()") to simplify the code and avoid unnecessary escape characters. Copybara import of the project: -- f0abbde8f3cd514acfdfb68701483c73e77ef50e by Alexander Pivovarov <[email protected]>: Add missing hlo_argument_modes to error message Merging this change closes #23718 PiperOrigin-RevId: 736868382
| false
| 4
| 4
| 8
|
--- third_party/xla/xla/tools/multihost_hlo_runner/hlo_runner_main.cc
@@ -133,10 +133,10 @@ ArgumentModeFromString(absl::string_view text) {
return FunctionalHloRunner::ModuleArgumentMode::kUninitialized;
}
return absl::InvalidArgumentError(
- absl::StrCat(R"(Invalid --hlo_argument_mode specified. Expected one of: )"
- R"("use_device_id_as_input", "use_random_inputs", )"
- R"("use_shared_random_inputs", "use_zeros_as_input", or )",
- R"("uninitialized". Got: )", text));
+ absl::StrCat("Unrecognized module argument mode specified. Expect "
+ "\"use_device_id_as_input\", \"use_random_inputs\", or "
+ "\"use_shared_random_inputs\"., got: ",
+ text));
}
static absl::StatusOr<FunctionalHloRunner::PreprocessingOptions>
|
tensorflow
|
tensorflow
|
C++
|
C++
| 188,388
| 74,565
|
An Open Source Machine Learning Framework for Everyone
|
nan_tensorflow
|
BUG_FIX
|
Matched \berror\b in message
|
11b0ea33d1c770d69b1418bc78b27a936cc5a17e
|
2024-01-13 23:17:54
|
Gautam krishna R
|
fix(python): add support for cli args (#12159)
| false
| 1
| 1
| 2
|
--- plugins/python/python.plugin.zsh
@@ -44,7 +44,7 @@ function pyuserpaths() {
alias pygrep='grep -nr --include="*.py"'
# Run proper IPython regarding current virtualenv (if any)
-alias ipython='python3 -c "import IPython, sys; sys.exit(IPython.start_ipython())"'
+alias ipython="python3 -c 'import IPython; IPython.terminal.ipapp.launch_new_instance()'"
# Share local directory as a HTTP server
alias pyserver="python3 -m http.server"
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
BUG_FIX
|
Obvious
|
c6cdbd9962896eda4fb3b3b7afe1de7897e2a564
|
2022-12-10 16:52:30
|
renovate[bot]
|
Update dependency org.junit-pioneer:junit-pioneer to v1.9.1 (#7517) Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
| false
| 1
| 1
| 2
|
--- gradle/libs.versions.toml
@@ -56,7 +56,7 @@ junit-jupiter-engine = { module = "org.junit.jupiter:junit-jupiter-engine", vers
junit-jupiter-params = { module = "org.junit.jupiter:junit-jupiter-params", version.ref = "org-junit-jupiter" }
junit-platform-console = "org.junit.platform:junit-platform-console:1.9.1"
junit-vintage-engine = "org.junit.vintage:junit-vintage-engine:5.9.1"
-junit-pioneer = "org.junit-pioneer:junit-pioneer:1.9.1"
+junit-pioneer = "org.junit-pioneer:junit-pioneer:1.8.0"
junit5android-core = { module = "de.mannodermaus.junit5:android-test-core", version.ref = "de-mannodermaus-junit5" }
junit5android-runner = { module = "de.mannodermaus.junit5:android-test-runner", version.ref = "de-mannodermaus-junit5" }
kotlin-junit5 = { module = "org.jetbrains.kotlin:kotlin-test-junit5", version.ref = "org-jetbrains-kotlin" }
|
okhttp
|
square
|
Kotlin
|
Kotlin
| 46,179
| 9,194
|
Square’s meticulous HTTP client for the JVM, Android, and GraalVM.
|
square_okhttp
|
CONFIG_CHANGE
|
dependencies updated
|
40176749857cf2662428ea53f538de7364213352
|
2022-02-07 21:52:09
|
Matheus Felipe
|
Set badges for new workflows
| false
| 4
| 4
| 8
|
--- README.md
@@ -8,11 +8,11 @@
<div align="center">
<sup>Status</sup>
<br />
- <a href="https://github.com/public-apis/public-apis/actions/workflows/test_of_push_and_pull.yml">
- <img alt="Tests of push and pull" src="https://github.com/public-apis/public-apis/actions/workflows/test_of_push_and_pull.yml/badge.svg" />
+ <a href="https://github.com/public-apis/public-apis/actions?query=workflow%3A%22Run+tests%22">
+ <img alt="Run tests" src="https://github.com/public-apis/public-apis/workflows/Run%20tests/badge.svg" />
</a>
- <a href="https://github.com/public-apis/public-apis/actions/workflows/validate_links.yml">
- <img alt="Validate links" src="https://github.com/public-apis/public-apis/actions/workflows/validate_links.yml/badge.svg" />
+ <a href="https://github.com/public-apis/public-apis/actions?query=workflow%3A%22Validate+links%22">
+ <img alt="Validate links" src="https://github.com/public-apis/public-apis/workflows/Validate%20links/badge.svg?branch=master" />
</a>
<a href="https://github.com/public-apis/public-apis">
<img alt="Number of Categories" src="https://img.shields.io/badge/dynamic/json?url=https://api.publicapis.org/categories&label=Number%20of%20Categories&query=$.count&color=informational" />
|
public-apis
|
public-apis
|
Python
|
Python
| 329,015
| 34,881
|
A collective list of free APIs
|
public-apis_public-apis
|
DOC_CHANGE
|
changes in readme
|
80966ce5c44dcf79b7617c592044469db85b1d59
|
2025-02-19 19:52:58
|
Stanislav Láznička
|
integration: svm: use consistent path args pattern in etcd fetch functions Use function argument order at which the strings would appear in the etcd path.
| false
| 8
| 8
| 16
|
--- test/integration/storageversionmigrator/storageversionmigrator_test.go
@@ -89,7 +89,7 @@ func TestStorageVersionMigration(t *testing.T) {
}
wantPrefix := "k8s:enc:aescbc:v1:key2"
- etcdSecret, err := svmTest.getRawSecretFromETCD(t, secret.Namespace, secret.Name)
+ etcdSecret, err := svmTest.getRawSecretFromETCD(t, secret.Name, secret.Namespace)
if err != nil {
t.Fatalf("Failed to get secret from etcd: %v", err)
}
--- test/integration/storageversionmigrator/util.go
@@ -387,9 +387,9 @@ func (svm *svmTest) createSecret(ctx context.Context, t *testing.T, name, namesp
return svm.client.CoreV1().Secrets(secret.Namespace).Create(ctx, secret, metav1.CreateOptions{})
}
-func (svm *svmTest) getRawSecretFromETCD(t *testing.T, namespace, name string) ([]byte, error) {
+func (svm *svmTest) getRawSecretFromETCD(t *testing.T, name, namespace string) ([]byte, error) {
t.Helper()
- secretETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, "", "secrets", namespace, name)
+ secretETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, "", "secrets", name, namespace)
etcdResponse, err := svm.readRawRecordFromETCD(t, secretETCDPath)
if err != nil {
return nil, fmt.Errorf("failed to read %s from etcd: %w", secretETCDPath, err)
@@ -397,7 +397,7 @@ func (svm *svmTest) getRawSecretFromETCD(t *testing.T, namespace, name string) (
return etcdResponse.Kvs[0].Value, nil
}
-func getETCDPathForResource(t *testing.T, storagePrefix, group, resource, namespaceName, name string) string {
+func getETCDPathForResource(t *testing.T, storagePrefix, group, resource, name, namespaceName string) string {
t.Helper()
groupResource := resource
if group != "" {
@@ -431,9 +431,9 @@ func (svm *svmTest) readRawRecordFromETCD(t *testing.T, path string) (*clientv3.
return response, nil
}
-func (svm *svmTest) getRawCRFromETCD(t *testing.T, crdGroup, crdName, namespace, name string) ([]byte, error) {
+func (svm *svmTest) getRawCRFromETCD(t *testing.T, name, namespace, crdGroup, crdName string) ([]byte, error) {
t.Helper()
- crdETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, crdGroup, crdName, namespace, name)
+ crdETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, crdGroup, crdName, name, namespace)
etcdResponse, err := svm.readRawRecordFromETCD(t, crdETCDPath)
if err != nil {
t.Fatalf("failed to read %s from etcd: %v", crdETCDPath, err)
@@ -1056,7 +1056,7 @@ func (svm *svmTest) setupServerCert(t *testing.T) *certContext {
func (svm *svmTest) isCRStoredAtVersion(t *testing.T, version, crName string) bool {
t.Helper()
- data, err := svm.getRawCRFromETCD(t, crdGroup, crdName+"s", defaultNamespace, crName)
+ data, err := svm.getRawCRFromETCD(t, crName, defaultNamespace, crdGroup, crdName+"s")
if err != nil {
t.Fatalf("Failed to get CR from etcd: %v", err)
}
@@ -1135,7 +1135,7 @@ func (svm *svmTest) validateRVAndGeneration(ctx context.Context, t *testing.T, c
for crName, version := range crVersions {
// get CR from etcd
- data, err := svm.getRawCRFromETCD(t, crdGroup, crdName+"s", defaultNamespace, crName)
+ data, err := svm.getRawCRFromETCD(t, crName, defaultNamespace, crdGroup, crdName+"s")
if err != nil {
t.Fatalf("Failed to get CR from etcd: %v", err)
}
|
kubernetes
|
kubernetes
|
Go
|
Go
| 113,460
| 40,344
|
Production-Grade Container Scheduling and Management
|
kubernetes_kubernetes
|
BUG_FIX
|
Obvious
|
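The Kubernetes test refactor above reorders the name/namespace parameters so call sites read in the same order the strings appear in the etcd key, which makes swapped-argument bugs easier to spot. In Python the usual defence is keyword-only arguments; the path layout below is simplified and hypothetical:

def etcd_path_for_resource(*, prefix: str, resource: str, namespace: str, name: str) -> str:
    """Build a simplified etcd-style key; keyword-only args make call sites order-proof."""
    return f"/{prefix}/{resource}/{namespace}/{name}"

print(etcd_path_for_resource(prefix="registry", resource="secrets",
                             namespace="default", name="my-secret"))
# -> "/registry/secrets/default/my-secret"
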
888e993534e1dfbcb7a9a17d28b8e0b9630b1795
|
2024-07-12 19:34:11
|
fufesou
|
fix: unable to close on fullscreen (#8690) Signed-off-by: fufesou <[email protected]>
| false
| 221
| 178
| 399
|
--- flutter/lib/desktop/widgets/tabbar_widget.dart
@@ -230,7 +230,8 @@ typedef LabelGetter = Rx<String> Function(String key);
int _lastClickTime =
DateTime.now().millisecondsSinceEpoch - bind.getDoubleClickTime() - 1000;
-class DesktopTab extends StatefulWidget {
+// ignore: must_be_immutable
+class DesktopTab extends StatelessWidget {
final bool showLogo;
final bool showTitle;
final bool showMinimize;
@@ -251,8 +252,12 @@ class DesktopTab extends StatefulWidget {
final DesktopTabController controller;
+ Rx<DesktopTabState> get state => controller.state;
final _scrollDebounce = Debouncer(delay: Duration(milliseconds: 50));
+ late final DesktopTabType tabType;
+ late final bool isMainWindow;
+
final RxList<String> invisibleTabKeys = RxList.empty();
DesktopTab({
@@ -274,232 +279,18 @@ class DesktopTab extends StatefulWidget {
this.unSelectedTabBackgroundColor,
this.selectedBorderColor,
this.blockTab,
- }) : super(key: key);
-
- static RxString tablabelGetter(String peerId) {
- final alias = bind.mainGetPeerOptionSync(id: peerId, key: 'alias');
- return RxString(getDesktopTabLabel(peerId, alias));
- }
-
- @override
- State<DesktopTab> createState() {
- return _DesktopTabState();
+ }) : super(key: key) {
+ tabType = controller.tabType;
+ isMainWindow = tabType == DesktopTabType.main ||
+ tabType == DesktopTabType.cm ||
+ tabType == DesktopTabType.install;
}
-}
-
-// ignore: must_be_immutable
-class _DesktopTabState extends State<DesktopTab>
- with MultiWindowListener, WindowListener {
- final _saveFrameDebounce = Debouncer(delay: Duration(seconds: 1));
- Timer? _macOSCheckRestoreTimer;
- int _macOSCheckRestoreCounter = 0;
-
- bool get showLogo => widget.showLogo;
- bool get showTitle => widget.showTitle;
- bool get showMinimize => widget.showMinimize;
- bool get showMaximize => widget.showMaximize;
- bool get showClose => widget.showClose;
- Widget Function(Widget pageView)? get pageViewBuilder =>
- widget.pageViewBuilder;
- TabMenuBuilder? get tabMenuBuilder => widget.tabMenuBuilder;
- Widget? get tail => widget.tail;
- Future<bool> Function()? get onWindowCloseButton =>
- widget.onWindowCloseButton;
- TabBuilder? get tabBuilder => widget.tabBuilder;
- LabelGetter? get labelGetter => widget.labelGetter;
- double? get maxLabelWidth => widget.maxLabelWidth;
- Color? get selectedTabBackgroundColor => widget.selectedTabBackgroundColor;
- Color? get unSelectedTabBackgroundColor =>
- widget.unSelectedTabBackgroundColor;
- Color? get selectedBorderColor => widget.selectedBorderColor;
- RxBool? get blockTab => widget.blockTab;
- DesktopTabController get controller => widget.controller;
- RxList<String> get invisibleTabKeys => widget.invisibleTabKeys;
- Debouncer get _scrollDebounce => widget._scrollDebounce;
-
- Rx<DesktopTabState> get state => controller.state;
-
- DesktopTabType get tabType => controller.tabType;
- bool get isMainWindow =>
- tabType == DesktopTabType.main ||
- tabType == DesktopTabType.cm ||
- tabType == DesktopTabType.install;
-
- _DesktopTabState() : super();
static RxString tablabelGetter(String peerId) {
final alias = bind.mainGetPeerOptionSync(id: peerId, key: 'alias');
return RxString(getDesktopTabLabel(peerId, alias));
}
- @override
- void initState() {
- super.initState();
- DesktopMultiWindow.addListener(this);
- windowManager.addListener(this);
-
- Future.delayed(Duration(milliseconds: 500), () {
- if (isMainWindow) {
- windowManager.isMaximized().then((maximized) {
- if (stateGlobal.isMaximized.value != maximized) {
- WidgetsBinding.instance.addPostFrameCallback(
- (_) => setState(() => stateGlobal.setMaximized(maximized)));
- }
- });
- } else {
- final wc = WindowController.fromWindowId(kWindowId!);
- wc.isMaximized().then((maximized) {
- debugPrint("isMaximized $maximized");
- if (stateGlobal.isMaximized.value != maximized) {
- WidgetsBinding.instance.addPostFrameCallback(
- (_) => setState(() => stateGlobal.setMaximized(maximized)));
- }
- });
- }
- });
- }
-
- @override
- void dispose() {
- DesktopMultiWindow.removeListener(this);
- windowManager.removeListener(this);
- _macOSCheckRestoreTimer?.cancel();
- super.dispose();
- }
-
- void _setMaximized(bool maximize) {
- stateGlobal.setMaximized(maximize);
- _saveFrameDebounce.call(_saveFrame);
- setState(() {});
- }
-
- @override
- void onWindowFocus() {
- stateGlobal.isFocused.value = true;
- }
-
- @override
- void onWindowBlur() {
- stateGlobal.isFocused.value = false;
- }
-
- @override
- void onWindowMinimize() {
- stateGlobal.setMinimized(true);
- stateGlobal.setMaximized(false);
- super.onWindowMinimize();
- }
-
- @override
- void onWindowMaximize() {
- stateGlobal.setMinimized(false);
- _setMaximized(true);
- super.onWindowMaximize();
- }
-
- @override
- void onWindowUnmaximize() {
- stateGlobal.setMinimized(false);
- _setMaximized(false);
- super.onWindowUnmaximize();
- }
-
- _saveFrame() async {
- if (tabType == DesktopTabType.main) {
- await saveWindowPosition(WindowType.Main);
- } else if (kWindowType != null && kWindowId != null) {
- await saveWindowPosition(kWindowType!, windowId: kWindowId);
- }
- }
-
- @override
- void onWindowMoved() {
- _saveFrameDebounce.call(_saveFrame);
- super.onWindowMoved();
- }
-
- @override
- void onWindowResized() {
- _saveFrameDebounce.call(_saveFrame);
- super.onWindowMoved();
- }
-
- @override
- void onWindowClose() async {
- mainWindowClose() async => await windowManager.hide();
- notMainWindowClose(WindowController windowController) async {
- if (controller.length != 0) {
- debugPrint("close not empty multiwindow from taskbar");
- if (isWindows) {
- await windowController.show();
- await windowController.focus();
- final res = await onWindowCloseButton?.call() ?? true;
- if (!res) return;
- }
- controller.clear();
- }
- await windowController.hide();
- await rustDeskWinManager
- .call(WindowType.Main, kWindowEventHide, {"id": kWindowId!});
- }
-
- macOSWindowClose(
- Future<bool> Function() checkFullscreen,
- Future<void> Function() closeFunc,
- ) async {
- _macOSCheckRestoreCounter = 0;
- _macOSCheckRestoreTimer =
- Timer.periodic(Duration(milliseconds: 30), (timer) async {
- _macOSCheckRestoreCounter++;
- if (!await checkFullscreen() || _macOSCheckRestoreCounter >= 30) {
- _macOSCheckRestoreTimer?.cancel();
- _macOSCheckRestoreTimer = null;
- Timer(Duration(milliseconds: 700), () async => await closeFunc());
- }
- });
- }
-
- // hide window on close
- if (isMainWindow) {
- if (rustDeskWinManager.getActiveWindows().contains(kMainWindowId)) {
- await rustDeskWinManager.unregisterActiveWindow(kMainWindowId);
- }
- // macOS specific workaround, the window is not hiding when in fullscreen.
- if (isMacOS && await windowManager.isFullScreen()) {
- await windowManager.setFullScreen(false);
- await macOSWindowClose(
- () async => await windowManager.isFullScreen(),
- mainWindowClose,
- );
- } else {
- await mainWindowClose();
- }
- } else {
- // it's safe to hide the subwindow
- final controller = WindowController.fromWindowId(kWindowId!);
- if (isMacOS) {
- // onWindowClose() maybe called multiple times because of loopCloseWindow() in remote_tab_page.dart.
- // use ??= to make sure the value is set on first call.
-
- if (await onWindowCloseButton?.call() ?? true) {
- if (await controller.isFullScreen()) {
- await controller.setFullscreen(false);
- stateGlobal.setFullscreen(false, procWnd: false);
- await macOSWindowClose(
- () async => await controller.isFullScreen(),
- () async => await notMainWindowClose(controller),
- );
- } else {
- await notMainWindowClose(controller);
- }
- }
- } else {
- await notMainWindowClose(controller);
- }
- }
- super.onWindowClose();
- }
-
@override
Widget build(BuildContext context) {
return Column(children: [
@@ -677,6 +468,7 @@ class _DesktopTabState extends State<DesktopTab>
// hide simulated action buttons when we in compatible ui mode, because of reusing system title bar.
WindowActionPanel(
isMainWindow: isMainWindow,
+ tabType: tabType,
state: state,
tabController: controller,
invisibleTabKeys: invisibleTabKeys,
@@ -694,6 +486,7 @@ class _DesktopTabState extends State<DesktopTab>
class WindowActionPanel extends StatefulWidget {
final bool isMainWindow;
+ final DesktopTabType tabType;
final Rx<DesktopTabState> state;
final DesktopTabController tabController;
@@ -709,6 +502,7 @@ class WindowActionPanel extends StatefulWidget {
const WindowActionPanel(
{Key? key,
required this.isMainWindow,
+ required this.tabType,
required this.state,
required this.tabController,
required this.invisibleTabKeys,
@@ -726,17 +520,180 @@ class WindowActionPanel extends StatefulWidget {
}
}
-class WindowActionPanelState extends State<WindowActionPanel> {
+class WindowActionPanelState extends State<WindowActionPanel>
+ with MultiWindowListener, WindowListener {
+ final _saveFrameDebounce = Debouncer(delay: Duration(seconds: 1));
+ Timer? _macOSCheckRestoreTimer;
+ int _macOSCheckRestoreCounter = 0;
+
@override
void initState() {
super.initState();
+ DesktopMultiWindow.addListener(this);
+ windowManager.addListener(this);
+
+ Future.delayed(Duration(milliseconds: 500), () {
+ if (widget.isMainWindow) {
+ windowManager.isMaximized().then((maximized) {
+ if (stateGlobal.isMaximized.value != maximized) {
+ WidgetsBinding.instance.addPostFrameCallback(
+ (_) => setState(() => stateGlobal.setMaximized(maximized)));
+ }
+ });
+ } else {
+ final wc = WindowController.fromWindowId(kWindowId!);
+ wc.isMaximized().then((maximized) {
+ debugPrint("isMaximized $maximized");
+ if (stateGlobal.isMaximized.value != maximized) {
+ WidgetsBinding.instance.addPostFrameCallback(
+ (_) => setState(() => stateGlobal.setMaximized(maximized)));
+ }
+ });
+ }
+ });
}
@override
void dispose() {
+ DesktopMultiWindow.removeListener(this);
+ windowManager.removeListener(this);
+ _macOSCheckRestoreTimer?.cancel();
super.dispose();
}
+ void _setMaximized(bool maximize) {
+ stateGlobal.setMaximized(maximize);
+ _saveFrameDebounce.call(_saveFrame);
+ setState(() {});
+ }
+
+ @override
+ void onWindowFocus() {
+ stateGlobal.isFocused.value = true;
+ }
+
+ @override
+ void onWindowBlur() {
+ stateGlobal.isFocused.value = false;
+ }
+
+ @override
+ void onWindowMinimize() {
+ stateGlobal.setMinimized(true);
+ stateGlobal.setMaximized(false);
+ super.onWindowMinimize();
+ }
+
+ @override
+ void onWindowMaximize() {
+ stateGlobal.setMinimized(false);
+ _setMaximized(true);
+ super.onWindowMaximize();
+ }
+
+ @override
+ void onWindowUnmaximize() {
+ stateGlobal.setMinimized(false);
+ _setMaximized(false);
+ super.onWindowUnmaximize();
+ }
+
+ _saveFrame() async {
+ if (widget.tabType == DesktopTabType.main) {
+ await saveWindowPosition(WindowType.Main);
+ } else if (kWindowType != null && kWindowId != null) {
+ await saveWindowPosition(kWindowType!, windowId: kWindowId);
+ }
+ }
+
+ @override
+ void onWindowMoved() {
+ _saveFrameDebounce.call(_saveFrame);
+ super.onWindowMoved();
+ }
+
+ @override
+ void onWindowResized() {
+ _saveFrameDebounce.call(_saveFrame);
+ super.onWindowMoved();
+ }
+
+ @override
+ void onWindowClose() async {
+ mainWindowClose() async => await windowManager.hide();
+ notMainWindowClose(WindowController controller) async {
+ if (widget.tabController.length != 0) {
+ debugPrint("close not empty multiwindow from taskbar");
+ if (isWindows) {
+ await controller.show();
+ await controller.focus();
+ final res = await widget.onClose?.call() ?? true;
+ if (!res) return;
+ }
+ widget.tabController.clear();
+ }
+ await controller.hide();
+ await rustDeskWinManager
+ .call(WindowType.Main, kWindowEventHide, {"id": kWindowId!});
+ }
+
+ macOSWindowClose(
+ Future<bool> Function() checkFullscreen,
+ Future<void> Function() closeFunc,
+ ) async {
+ _macOSCheckRestoreCounter = 0;
+ _macOSCheckRestoreTimer =
+ Timer.periodic(Duration(milliseconds: 30), (timer) async {
+ _macOSCheckRestoreCounter++;
+ if (!await checkFullscreen() || _macOSCheckRestoreCounter >= 30) {
+ _macOSCheckRestoreTimer?.cancel();
+ _macOSCheckRestoreTimer = null;
+ Timer(Duration(milliseconds: 700), () async => await closeFunc());
+ }
+ });
+ }
+
+ // hide window on close
+ if (widget.isMainWindow) {
+ if (rustDeskWinManager.getActiveWindows().contains(kMainWindowId)) {
+ await rustDeskWinManager.unregisterActiveWindow(kMainWindowId);
+ }
+ // macOS specific workaround, the window is not hiding when in fullscreen.
+ if (isMacOS && await windowManager.isFullScreen()) {
+ await windowManager.setFullScreen(false);
+ await macOSWindowClose(
+ () async => await windowManager.isFullScreen(),
+ mainWindowClose,
+ );
+ } else {
+ await mainWindowClose();
+ }
+ } else {
+ // it's safe to hide the subwindow
+ final controller = WindowController.fromWindowId(kWindowId!);
+ if (isMacOS) {
+ // onWindowClose() maybe called multiple times because of loopCloseWindow() in remote_tab_page.dart.
+ // use ??= to make sure the value is set on first call.
+
+ if (await widget.onClose?.call() ?? true) {
+ if (await controller.isFullScreen()) {
+ await controller.setFullscreen(false);
+ stateGlobal.setFullscreen(false, procWnd: false);
+ await macOSWindowClose(
+ () async => await controller.isFullScreen(),
+ () async => await notMainWindowClose(controller),
+ );
+ } else {
+ await notMainWindowClose(controller);
+ }
+ }
+ } else {
+ await notMainWindowClose(controller);
+ }
+ }
+ super.onWindowClose();
+ }
+
bool showTabDowndown() {
return widget.tabController.state.value.tabs.length > 1 &&
(widget.tabController.tabType == DesktopTabType.remoteScreen ||
|
rustdesk
|
rustdesk
|
Rust
|
Rust
| 83,345
| 11,693
|
An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer.
|
rustdesk_rustdesk
|
BUG_FIX
|
obvious
|
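A minimal Dart sketch of the listener lifecycle that the diff above relocates into WindowActionPanelState: a widget State mixes in WindowListener from the window_manager package, registers itself in initState(), unregisters in dispose(), and reacts to maximize/unmaximize callbacks. This is an illustrative sketch, not RustDesk code; the names WindowAwarePanel and _maximized are hypothetical, while the windowManager.addListener/removeListener calls and onWindowMaximize/onWindowUnmaximize overrides mirror the ones shown in the diff.

import 'package:flutter/material.dart';
import 'package:window_manager/window_manager.dart';

class WindowAwarePanel extends StatefulWidget {
  const WindowAwarePanel({super.key});

  @override
  State<WindowAwarePanel> createState() => _WindowAwarePanelState();
}

class _WindowAwarePanelState extends State<WindowAwarePanel>
    with WindowListener {
  bool _maximized = false;

  @override
  void initState() {
    super.initState();
    // Register this State as a window event listener, as the diff does in
    // WindowActionPanelState.initState().
    windowManager.addListener(this);
  }

  @override
  void dispose() {
    // Unregister so the window manager stops calling into a disposed State.
    windowManager.removeListener(this);
    super.dispose();
  }

  @override
  void onWindowMaximize() => setState(() => _maximized = true);

  @override
  void onWindowUnmaximize() => setState(() => _maximized = false);

  @override
  Widget build(BuildContext context) =>
      Text(_maximized ? 'maximized' : 'windowed');
}
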
7117dfca6c482b5584a1c0a54d0ee522d5b12229
|
2024-04-29 23:02:44
|
Constantin Graf
|
Added S3 flysystem
| false
| 282
| 1
| 283
|
--- composer.json
@@ -20,7 +20,6 @@
"laravel/octane": "^2.3",
"laravel/passport": "^12.0",
"laravel/tinker": "^2.8",
- "league/flysystem-aws-s3-v3": "^3.0",
"nwidart/laravel-modules": "dev-feature/fixed_path",
"pxlrbt/filament-environment-indicator": "^2.0",
"spatie/temporary-directory": "^2.2",
--- composer.lock
@@ -4,7 +4,7 @@
"Read more about it at https://getcomposer.org/doc/01-basic-usage.md#installing-dependencies",
"This file is @generated automatically"
],
- "content-hash": "fc86decd60e047a790f4b27de3cb9e2d",
+ "content-hash": "5779fb7e87efa88e5db2b37e64fe89e3",
"packages": [
{
"name": "anourvalar/eloquent-serialize",
@@ -72,155 +72,6 @@
},
"time": "2024-03-22T12:56:46+00:00"
},
- {
- "name": "aws/aws-crt-php",
- "version": "v1.2.5",
- "source": {
- "type": "git",
- "url": "https://github.com/awslabs/aws-crt-php.git",
- "reference": "0ea1f04ec5aa9f049f97e012d1ed63b76834a31b"
- },
- "dist": {
- "type": "zip",
- "url": "https://api.github.com/repos/awslabs/aws-crt-php/zipball/0ea1f04ec5aa9f049f97e012d1ed63b76834a31b",
- "reference": "0ea1f04ec5aa9f049f97e012d1ed63b76834a31b",
- "shasum": ""
- },
- "require": {
- "php": ">=5.5"
- },
- "require-dev": {
- "phpunit/phpunit": "^4.8.35||^5.6.3||^9.5",
- "yoast/phpunit-polyfills": "^1.0"
- },
- "suggest": {
- "ext-awscrt": "Make sure you install awscrt native extension to use any of the functionality."
- },
- "type": "library",
- "autoload": {
- "classmap": [
- "src/"
- ]
- },
- "notification-url": "https://packagist.org/downloads/",
- "license": [
- "Apache-2.0"
- ],
- "authors": [
- {
- "name": "AWS SDK Common Runtime Team",
- "email": "[email protected]"
- }
- ],
- "description": "AWS Common Runtime for PHP",
- "homepage": "https://github.com/awslabs/aws-crt-php",
- "keywords": [
- "amazon",
- "aws",
- "crt",
- "sdk"
- ],
- "support": {
- "issues": "https://github.com/awslabs/aws-crt-php/issues",
- "source": "https://github.com/awslabs/aws-crt-php/tree/v1.2.5"
- },
- "time": "2024-04-19T21:30:56+00:00"
- },
- {
- "name": "aws/aws-sdk-php",
- "version": "3.305.4",
- "source": {
- "type": "git",
- "url": "https://github.com/aws/aws-sdk-php.git",
- "reference": "fc26a2ebf720e0b75a353d7e8fe206796671e00b"
- },
- "dist": {
- "type": "zip",
- "url": "https://api.github.com/repos/aws/aws-sdk-php/zipball/fc26a2ebf720e0b75a353d7e8fe206796671e00b",
- "reference": "fc26a2ebf720e0b75a353d7e8fe206796671e00b",
- "shasum": ""
- },
- "require": {
- "aws/aws-crt-php": "^1.2.3",
- "ext-json": "*",
- "ext-pcre": "*",
- "ext-simplexml": "*",
- "guzzlehttp/guzzle": "^6.5.8 || ^7.4.5",
- "guzzlehttp/promises": "^1.4.0 || ^2.0",
- "guzzlehttp/psr7": "^1.9.1 || ^2.4.5",
- "mtdowling/jmespath.php": "^2.6",
- "php": ">=7.2.5",
- "psr/http-message": "^1.0 || ^2.0"
- },
- "require-dev": {
- "andrewsville/php-token-reflection": "^1.4",
- "aws/aws-php-sns-message-validator": "~1.0",
- "behat/behat": "~3.0",
- "composer/composer": "^1.10.22",
- "dms/phpunit-arraysubset-asserts": "^0.4.0",
- "doctrine/cache": "~1.4",
- "ext-dom": "*",
- "ext-openssl": "*",
- "ext-pcntl": "*",
- "ext-sockets": "*",
- "nette/neon": "^2.3",
- "paragonie/random_compat": ">= 2",
- "phpunit/phpunit": "^5.6.3 || ^8.5 || ^9.5",
- "psr/cache": "^1.0",
- "psr/simple-cache": "^1.0",
- "sebastian/comparator": "^1.2.3 || ^4.0",
- "yoast/phpunit-polyfills": "^1.0"
- },
- "suggest": {
- "aws/aws-php-sns-message-validator": "To validate incoming SNS notifications",
- "doctrine/cache": "To use the DoctrineCacheAdapter",
- "ext-curl": "To send requests using cURL",
- "ext-openssl": "Allows working with CloudFront private distributions and verifying received SNS messages",
- "ext-sockets": "To use client-side monitoring"
- },
- "type": "library",
- "extra": {
- "branch-alias": {
- "dev-master": "3.0-dev"
- }
- },
- "autoload": {
- "files": [
- "src/functions.php"
- ],
- "psr-4": {
- "Aws\\": "src/"
- }
- },
- "notification-url": "https://packagist.org/downloads/",
- "license": [
- "Apache-2.0"
- ],
- "authors": [
- {
- "name": "Amazon Web Services",
- "homepage": "http://aws.amazon.com"
- }
- ],
- "description": "AWS SDK for PHP - Use Amazon Web Services in your PHP project",
- "homepage": "http://aws.amazon.com/sdkforphp",
- "keywords": [
- "amazon",
- "aws",
- "cloud",
- "dynamodb",
- "ec2",
- "glacier",
- "s3",
- "sdk"
- ],
- "support": {
- "forum": "https://forums.aws.amazon.com/forum.jspa?forumID=80",
- "issues": "https://github.com/aws/aws-sdk-php/issues",
- "source": "https://github.com/aws/aws-sdk-php/tree/3.305.4"
- },
- "time": "2024-04-26T18:06:31+00:00"
- },
{
"name": "bacon/bacon-qr-code",
"version": "v3.0.0",
@@ -5040,71 +4891,6 @@
],
"time": "2024-04-07T19:17:50+00:00"
},
- {
- "name": "league/flysystem-aws-s3-v3",
- "version": "3.27.0",
- "source": {
- "type": "git",
- "url": "https://github.com/thephpleague/flysystem-aws-s3-v3.git",
- "reference": "3e6ce2f972f1470db779f04d29c289dcd2c32837"
- },
- "dist": {
- "type": "zip",
- "url": "https://api.github.com/repos/thephpleague/flysystem-aws-s3-v3/zipball/3e6ce2f972f1470db779f04d29c289dcd2c32837",
- "reference": "3e6ce2f972f1470db779f04d29c289dcd2c32837",
- "shasum": ""
- },
- "require": {
- "aws/aws-sdk-php": "^3.295.10",
- "league/flysystem": "^3.10.0",
- "league/mime-type-detection": "^1.0.0",
- "php": "^8.0.2"
- },
- "conflict": {
- "guzzlehttp/guzzle": "<7.0",
- "guzzlehttp/ringphp": "<1.1.1"
- },
- "type": "library",
- "autoload": {
- "psr-4": {
- "League\\Flysystem\\AwsS3V3\\": ""
- }
- },
- "notification-url": "https://packagist.org/downloads/",
- "license": [
- "MIT"
- ],
- "authors": [
- {
- "name": "Frank de Jonge",
- "email": "[email protected]"
- }
- ],
- "description": "AWS S3 filesystem adapter for Flysystem.",
- "keywords": [
- "Flysystem",
- "aws",
- "file",
- "files",
- "filesystem",
- "s3",
- "storage"
- ],
- "support": {
- "source": "https://github.com/thephpleague/flysystem-aws-s3-v3/tree/3.27.0"
- },
- "funding": [
- {
- "url": "https://ecologi.com/frankdejonge",
- "type": "custom"
- },
- {
- "url": "https://github.com/frankdejonge",
- "type": "github"
- }
- ],
- "time": "2024-04-07T19:16:54+00:00"
- },
{
"name": "league/flysystem-local",
"version": "3.25.1",
@@ -5790,72 +5576,6 @@
],
"time": "2024-04-12T21:02:21+00:00"
},
- {
- "name": "mtdowling/jmespath.php",
- "version": "2.7.0",
- "source": {
- "type": "git",
- "url": "https://github.com/jmespath/jmespath.php.git",
- "reference": "bbb69a935c2cbb0c03d7f481a238027430f6440b"
- },
- "dist": {
- "type": "zip",
- "url": "https://api.github.com/repos/jmespath/jmespath.php/zipball/bbb69a935c2cbb0c03d7f481a238027430f6440b",
- "reference": "bbb69a935c2cbb0c03d7f481a238027430f6440b",
- "shasum": ""
- },
- "require": {
- "php": "^7.2.5 || ^8.0",
- "symfony/polyfill-mbstring": "^1.17"
- },
- "require-dev": {
- "composer/xdebug-handler": "^3.0.3",
- "phpunit/phpunit": "^8.5.33"
- },
- "bin": [
- "bin/jp.php"
- ],
- "type": "library",
- "extra": {
- "branch-alias": {
- "dev-master": "2.7-dev"
- }
- },
- "autoload": {
- "files": [
- "src/JmesPath.php"
- ],
- "psr-4": {
- "JmesPath\\": "src/"
- }
- },
- "notification-url": "https://packagist.org/downloads/",
- "license": [
- "MIT"
- ],
- "authors": [
- {
- "name": "Graham Campbell",
- "email": "[email protected]",
- "homepage": "https://github.com/GrahamCampbell"
- },
- {
- "name": "Michael Dowling",
- "email": "[email protected]",
- "homepage": "https://github.com/mtdowling"
- }
- ],
- "description": "Declaratively specify how to extract elements from a JSON document",
- "keywords": [
- "json",
- "jsonpath"
- ],
- "support": {
- "issues": "https://github.com/jmespath/jmespath.php/issues",
- "source": "https://github.com/jmespath/jmespath.php/tree/2.7.0"
- },
- "time": "2023-08-25T10:54:48+00:00"
- },
{
"name": "nesbot/carbon",
"version": "3.3.0",
|
solidtime
|
solidtime-io
|
PHP
|
PHP
| 5,267
| 278
|
Modern open-source time-tracking app
|
solidtime-io_solidtime
|
NEW_FEAT
|
Obvious
|