Columns (name — type, observed range; "nullable" where the preview marks ⌀):

- Hash — string, length 40
- Date — string, 19–20 chars, nullable
- Author — string, 2–30 chars
- commit_message — string, 3–28.8k chars
- IsMerge — bool, 1 class
- Additions — int64, 0–55.2k
- Deletions — int64, 0–991
- Total Changes — int64, -3–55.2k
- git_diff — string, 23–47.3k chars
- Repository Name — string, 159 classes
- Owner — string, 85 classes
- Primary Language — string, 20 classes
- Language — string, 19 classes
- Stars — float64, 218–411k, nullable
- Forks — float64, 8–79k, nullable
- Description — string, 96 classes
- Repository — string, 161 classes
- type — string, 6 classes
- Comment — string, 7–156 chars, nullable
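Read as one record per commit, the schema above can be sketched as a plain Python dataclass. This is illustrative only: the snake_case field names and the truncated `git_diff` value are my own choices, while the column set and the sample values come from the first record below.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CommitRecord:
    # Field names mirror the schema columns above (snake_cased for Python);
    # Optional[...] marks the columns the preview flags as nullable,
    # plus the repository-metadata columns that are null in some records.
    hash: str
    date: Optional[str]
    author: str
    commit_message: str
    is_merge: bool
    additions: int
    deletions: int
    total_changes: int
    git_diff: str
    repository_name: str
    owner: Optional[str]
    primary_language: Optional[str]
    language: Optional[str]
    stars: Optional[float]
    forks: Optional[float]
    description: Optional[str]
    repository: str
    type: str
    comment: Optional[str]

# First record from the preview (git_diff truncated for the example).
rec = CommitRecord(
    hash="287d77224c54ec1104d41bfb6519d5f1c8acfe8d",
    date="2023-01-03 08:59:50",
    author="longpanda",
    commit_message="Disable Fn/L/Ctrl hotkeys on select WIM menu.",
    is_merge=False,
    additions=2,
    deletions=0,
    total_changes=2,
    git_diff="--- GRUB2/MOD_SRC/grub-2.04/grub-core/ventoy/ventoy_windows.c",
    repository_name="ventoy",
    owner="ventoy",
    primary_language="C",
    language="C",
    stars=65265.0,
    forks=4197.0,
    description="A new bootable USB solution.",
    repository="ventoy_ventoy",
    type="BUG_FIX",
    comment="some keys disabled",
)
```

Note that `Total Changes` is not computed uniformly across the preview: most records use Additions + Deletions, but a few (e.g. the Gentoo and Django records) appear to use Additions - Deletions.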
Hash: 287d77224c54ec1104d41bfb6519d5f1c8acfe8d
Date: 2023-01-03 08:59:50
Author: longpanda
commit_message: Disable Fn/L/Ctrl hotkeys on select WIM menu.
IsMerge: false | Additions: 2 | Deletions: 0 | Total Changes: 2
git_diff:
--- GRUB2/MOD_SRC/grub-2.04/grub-core/ventoy/ventoy_windows.c
@@ -1448,7 +1448,6 @@ grub_err_t ventoy_cmd_sel_winpe_wim(grub_extcmd_context_t ctxt, int argc, char *
g_ventoy_menu_esc = 1;
g_ventoy_suppress_esc = 1;
g_ventoy_suppress_esc_default = 0;
- g_ventoy_secondary_menu_on = 1;
grub_snprintf(cfgfile, sizeof(cfgfile), "configfile mem:0x%llx:size:%d", (ulonglong)(ulong)cmd, pos);
grub_script_execute_sourcecode(cfgfile);
@@ -1456,7 +1455,6 @@ grub_err_t ventoy_cmd_sel_winpe_wim(grub_extcmd_context_t ctxt, int argc, char *
g_ventoy_menu_esc = 0;
g_ventoy_suppress_esc = 0;
g_ventoy_suppress_esc_default = 1;
- g_ventoy_secondary_menu_on = 0;
for (node = g_wim_patch_head; node; node = node->next)
{
Repository Name: ventoy | Owner: ventoy | Primary Language: C | Language: C | Stars: 65,265 | Forks: 4,197
Description: A new bootable USB solution.
Repository: ventoy_ventoy | type: BUG_FIX | Comment: some keys disabled

Hash: adccba5f9e7cc878e42f14b2fe1eb1d3c1a298af
Date: 2024-10-20 23:06:14
Author: Edward Hsing
commit_message: Update README.md
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
git_diff:
--- README.md
@@ -1,6 +1,6 @@
# US.KG – A FREE NAME FOR EVERYONE
[Registry Website (https://nic.us.kg/)](https://nic.us.kg/)
-### ⚠️⚠️⚠️ On October 16, 2024 (UTC), we received a warning from the .KG registry due to subdomains being used for large number of malicious activities (including Bank Hacking). As a result, we have temporarily suspended registrations. We will be upgrading our KYC process and reopening registrations soon. In the meantime, all reviews are on hold, so please refrain from submitting any new applications. ⚠️⚠️⚠️ (Accounts that have already completed KYC and comply with the Acceptable Use Policy will not be affected in any way, and you will never be charged. All activated accounts are promised to remain free forever.)
+### ⚠️⚠️⚠️ On October 16, 2024 (UTC), we received a warning from the .KG registry due to subdomains being used for large number of malicious activities (including Bank Hacking). As a result, we have temporarily suspended registrations. We will be upgrading our KYC process and reopening registrations soon. In the meantime, all reviews are on hold, so please refrain from submitting any new applications. ⚠️⚠️⚠️
#### Your request is typically processed within 15 minutes after submission.
Repository Name: freedomain | Owner: digitalplatdev | Primary Language: HTML | Language: HTML | Stars: 41,142 | Forks: 933
Description: DigitalPlat FreeDomain: Free Domain For Everyone
Repository: digitalplatdev_freedomain | type: DOC_CHANGE | Comment: changes in readme

Hash: 50e52ef58c4215736d7c4d9b9aa4f96e5afb4887
Date: 2024-06-21 05:31:55
Author: github-actions[bot]
commit_message: chore(main): release 2.5.0 (#393) Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
IsMerge: false | Additions: 13 | Deletions: 0 | Total Changes: 13
git_diff:
--- CHANGELOG.md
@@ -1,18 +1,5 @@
# Changelog
-## [2.5.0](https://github.com/ellite/Wallos/compare/v2.4.2...v2.5.0) (2024-06-21)
-
-
-### Features
-
-* add option to clone subscription ([8304ed7](https://github.com/ellite/Wallos/commit/8304ed7b54f50ed7fa5ab520ff4d8d54f3ef34df))
-* edit and delete options now available directly on the subscription list ([8304ed7](https://github.com/ellite/Wallos/commit/8304ed7b54f50ed7fa5ab520ff4d8d54f3ef34df))
-
-
-### Bug Fixes
-
-* typo on webhook payload ([8304ed7](https://github.com/ellite/Wallos/commit/8304ed7b54f50ed7fa5ab520ff4d8d54f3ef34df))
-
## [2.4.2](https://github.com/ellite/Wallos/compare/v2.4.1...v2.4.2) (2024-06-10)
Repository Name: wallos | Owner: ellite | Primary Language: PHP | Language: PHP | Stars: 4,155 | Forks: 178
Description: Wallos: Open-Source Personal Subscription Tracker
Repository: ellite_wallos | type: DOC_CHANGE | Comment: changes in md file

Hash: e1bbcc415f86c15ac7a72898137e096dd8f7e68a
Date: 2024-10-04 12:27:46
Author: Lucas
commit_message: chore: disable copy link to block feature (#6472)
IsMerge: false | Additions: 34 | Deletions: 33 | Total Changes: 67
git_diff:
--- frontend/appflowy_flutter/integration_test/desktop/cloud/document/document_option_actions_test.dart
@@ -4,6 +4,7 @@ import 'package:appflowy/plugins/document/presentation/editor_plugins/actions/dr
import 'package:appflowy_editor/appflowy_editor.dart';
import 'package:easy_localization/easy_localization.dart';
import 'package:flutter/material.dart';
+import 'package:flutter/services.dart';
import 'package:flutter_test/flutter_test.dart';
import 'package:integration_test/integration_test.dart';
@@ -68,37 +69,36 @@ void main() {
expect(afterMoveBlock.delta, beforeMoveBlock.delta);
});
- // Copy link to block feature is disable temporarily, enable this test when the feature is ready.
- // testWidgets('copy block link', (tester) async {
- // await tester.initializeAppFlowy(
- // cloudType: AuthenticatorType.appflowyCloudSelfHost,
- // );
- // await tester.tapGoogleLoginInButton();
- // await tester.expectToSeeHomePageWithGetStartedPage();
-
- // // open getting started page
- // await tester.openPage(Constants.gettingStartedPageName);
-
- // // hover and click on the option menu button beside the block component.
- // await tester.editor.hoverAndClickOptionMenuButton([0]);
-
- // // click the copy link to block option
- // await tester.tap(
- // find.findTextInFlowyText(
- // LocaleKeys.document_plugins_optionAction_copyLinkToBlock.tr(),
- // ),
- // );
- // await tester.pumpAndSettle(Durations.short1);
-
- // // check the clipboard
- // final content = await Clipboard.getData(Clipboard.kTextPlain);
- // expect(
- // content?.text,
- // matches(
- // r'^https:\/\/appflowy\.com\/app\/[a-f0-9-]{36}\/[a-f0-9-]{36}\?blockId=[A-Za-z0-9_-]+$',
- // ),
- // );
- // });
+ testWidgets('copy block link', (tester) async {
+ await tester.initializeAppFlowy(
+ cloudType: AuthenticatorType.appflowyCloudSelfHost,
+ );
+ await tester.tapGoogleLoginInButton();
+ await tester.expectToSeeHomePageWithGetStartedPage();
+
+ // open getting started page
+ await tester.openPage(Constants.gettingStartedPageName);
+
+ // hover and click on the option menu button beside the block component.
+ await tester.editor.hoverAndClickOptionMenuButton([0]);
+
+ // click the copy link to block option
+ await tester.tap(
+ find.findTextInFlowyText(
+ LocaleKeys.document_plugins_optionAction_copyLinkToBlock.tr(),
+ ),
+ );
+ await tester.pumpAndSettle(Durations.short1);
+
+ // check the clipboard
+ final content = await Clipboard.getData(Clipboard.kTextPlain);
+ expect(
+ content?.text,
+ matches(
+ r'^https:\/\/appflowy\.com\/app\/[a-f0-9-]{36}\/[a-f0-9-]{36}\?blockId=[A-Za-z0-9_-]+$',
+ ),
+ );
+ });
testWidgets('hover on the block and delete it', (tester) async {
await tester.initializeAppFlowy(
--- frontend/appflowy_flutter/lib/plugins/document/presentation/editor_configuration.dart
@@ -31,10 +31,9 @@ Map<String, BlockComponentBuilder> getEditorBuilderMap({
final standardActions = [
OptionAction.delete,
OptionAction.duplicate,
- // Copy link to block feature is disable temporarily, enable this test when the feature is ready.
// filter out the copy link to block option if in local mode
- // if (context.read<DocumentBloc?>()?.isLocalMode != true)
- // OptionAction.copyLinkToBlock,
+ if (context.read<DocumentBloc?>()?.isLocalMode != true)
+ OptionAction.copyLinkToBlock,
];
final calloutBGColor = AFThemeExtension.of(context).calloutBGColor;
Repository Name: appflowy | Owner: appflowy-io | Primary Language: Dart | Language: Dart | Stars: 61,077 | Forks: 4,078
Description: Bring projects, wikis, and teams together with AI. AppFlowy is the AI collaborative workspace where you achieve more without losing control of your data. The leading open source Notion alternative.
Repository: appflowy-io_appflowy | type: BUG_FIX | Comment: feature not ready yet, so link disabled

Hash: 9fbb4dec8edf6a72cc49b9c8f086eed48bdf44ab
Date: null
Author: Mike Frysinger
commit_message: app-admin/clsync: drop redundant USE=seccomp description
IsMerge: false | Additions: 0 | Deletions: 1 | Total Changes: -1
git_diff:
--- metadata.xml
@@ -23,7 +23,6 @@
<flag name="gio">Enable GIO for FS monitoring (glib based alternative to inotify interface, not recommended; if both are compiled, may be selected at runtime).</flag>
<flag name="highload-locks">Allows to use spinlocks for short delays instead of mutexes, but only on SMP systems.</flag>
<flag name="namespaces">Enable namespaces isolation.</flag>
- <flag name="seccomp">Enable seccomp for system call filtering.</flag>
</use>
<upstream>
<remote-id type="github">xaionaro/clsync</remote-id>
Repository Name: gentoo_gentoo.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: gentoo_gentoo.json | type: CODE_IMPROVEMENT | Comment: 5, removed redundant description

Hash: d31f190239e9236cbfdd0900a89cff68b606e764
Date: null
Author: Adrian Holovaty
commit_message: Fixed #1959 -- Fixed typo in django/contrib/admin/views/doc.py git-svn-id: http://code.djangoproject.com/svn/django/trunk@2957 bcc190cf-cafb-0310-a4f2-bffc1f526a37
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 0
git_diff:
--- doc.py
@@ -284,7 +284,7 @@ DATA_TYPE_MAPPING = {
'DateTimeField' : _('Date (with time)'),
'EmailField' : _('E-mail address'),
'FileField' : _('File path'),
- 'FilePathField', : _('File path'),
+ 'FilePathField' : _('File path'),
'FloatField' : _('Decimal number'),
'ForeignKey' : _('Integer'),
'ImageField' : _('File path'),
Repository Name: django_django.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: django_django.json | type: BUG_FIX | Comment: 5, fix written in commit msg

Hash: 20b940a73a4573484f072833ecaf375361bd99dc
Date: null
Author: Rich Harris
commit_message: deprecate store.onchange
IsMerge: false | Additions: 2 | Deletions: 0 | Total Changes: 2
git_diff:
--- store.js
@@ -118,6 +118,8 @@ assign(Store.prototype, {
onchange: function(callback) {
// TODO remove this method
+ console.warn("store.onchange is deprecated in favour of store.on('state', event => {...})");
+
return this.on('state', function(event) {
callback(event.current, event.changed);
});
Repository Name: sveltejs_svelte.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: sveltejs_svelte.json | type: CODE_IMPROVEMENT | Comment: 4, deprecating a method

Hash: 49344088a062f194e9e602b050e1f59ad0cde86d
Date: 2023-01-06 04:57:22
Author: bannedbook
commit_message: update
IsMerge: false | Additions: 0 | Deletions: 0 | Total Changes: 0
git_diff:
--- docs/vsp-en.py
Binary files a/docs/vsp-en.py and b/docs/vsp-en.py differ
Repository Name: fanqiang | Owner: bannedbook | Primary Language: Kotlin | Language: Kotlin | Stars: 39,286 | Forks: 7,317
Description: 翻墙-科学上网 (English: circumventing the Great Firewall to access the open internet)
Repository: bannedbook_fanqiang | type: CONFIG_CHANGE | Comment: probably a config change since some update is done

Hash: 78e7a18fd60a64fe82718070c2e8e615deba92c5
Date: null
Author: Daniel Smith
commit_message: disable travis at tip
IsMerge: false | Additions: 0 | Deletions: 1 | Total Changes: -1
git_diff:
--- .travis.yml
@@ -1,7 +1,6 @@
language: go
go:
- - tip
- 1.3
- 1.2
Repository Name: kubernetes_kubernetes.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: kubernetes_kubernetes.json | type: CODE_IMPROVEMENT | Comment: 4, Disabled travis at tip

Hash: da11bc1a06fab2c6a6319ea29575d1b01335c3a3
Date: 2023-01-15 18:16:05
Author: Richard McElreath
commit_message: lecture 5 recording
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
git_diff:
--- README.md
@@ -33,7 +33,7 @@ Note about slides: In some browsers, the slides don't show correctly. If points
| ------- | -------------- | ------------- | ---------------------- |
| Week 01 | 06 January | Chapters 1, 2 and 3 | [1] <[Science Before Statistics](https://www.youtube.com/watch?v=FdnMWdICdRs&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=1)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-01)> <br> [2] <[Garden of Forking Data](https://www.youtube.com/watch?v=R1vcdhPBlXA&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=2)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-02)>
| Week 02 | 13 January | Chapter 4 | [3] <[Geocentric Models](https://www.youtube.com/watch?v=tNOu-SEacNU&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=3)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-03)> <br> [4] <[Categories and Curves](https://www.youtube.com/watch?v=F0N4b7K_iYQ&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=4)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-04)>
-| Week 03 | 20 January | Chapters 5 and 6 | [5] <[Confounding](https://www.youtube.com/watch?v=mBEA7PKDmiY&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=5)> <[Slides]> <br> [6] Even Worse Confounding
+| Week 03 | 20 January | Chapters 5 and 6 | [5] Confounding <br> [6] Even Worse Confounding
| Week 04 | 27 January | Chapters 7 and 8 | [7] Overfitting <br> [8] Interactions
| Week 05 | 03 February | Chapters 9, 10 and 11 | [9] Markov chain Monte Carlo <br> [10] Binomial GLMs
| Week 06 | 10 February | Chapters 11 and 12 | [11] Poisson GLMs <br> [12] Ordered Categories
Repository Name: stat_rethinking_2024 | Owner: rmcelreath | Primary Language: R | Language: R | Stars: 1,474 | Forks: 151
Description: null
Repository: rmcelreath_stat_rethinking_2024 | type: DOC_CHANGE | Comment: changes in readme

Hash: 88ee347ccbb118c9cc299a451bbe3c112f3a3bd1
Date: 2024-04-17 03:51:40
Author: Robert Felker
commit_message: Update
IsMerge: false | Additions: 1 | Deletions: 10 | Total Changes: 11
git_diff:
--- source.md
@@ -220,7 +220,7 @@ If you appreciate the content 📖, support projects visibility, give 👍| ⭐|
#### UI Helpers
- [Offline](https://github.com/jogboms/flutter_offline) <!--stargazers:jogboms/flutter_offline--> - Tidy utility to handle offline/online connectivity by [Jeremiah Ogbomo](https://twitter.com/jogboms).
-- [In View Notifier List](https://github.com/rvamsikrishna/inview_notifier_list) <!--stargazers:rvamsikrishna/flutter_offline--> - ListView that notify when widgets are on screen within a provided area by [Vamsi Krishna](https://github.com/inview_notifier_list).
+- [In View Notifier List](https://github.com/rvamsikrishna/inview_notifier_list) - ListView that notify when widgets are on screen within a provided area by [Vamsi Krishna](https://github.com/rvamsikrishna).
- [ShowCaseView](https://github.com/simformsolutions/flutter_showcaseview) <!--stargazers:simformsolutions/flutter_showcaseview--> - Way to showcase your app features on iOS and Android by [Simform](https://github.com/simformsolutions)
- [Mix](https://github.com/leoafarias/mix) <!--stargazers:leoafarias/mix--> - An expressive way to effortlessly build design systems by [Leo Farias](https://github.com/leoafarias).
- [Blurhash](https://github.com/fluttercommunity/flutter_blurhash) <!--stargazers:fluttercommunity/flutter_blurhash--> - Compact representation of a placeholder for an image. Encode a blurry image under 30 caracters by [Robert Felker](https://www.linkedin.com/in/robert-felker/)
@@ -232,6 +232,10 @@ If you appreciate the content 📖, support projects visibility, give 👍| ⭐|
- [Slidable](https://github.com/letsar/flutter_slidable) <!--stargazers:letsar/flutter_slidable--> - Slidable list item with left and right slide actions by [Romain Rastel](https://github.com/letsar)
- [Backdrop](https://github.com/fluttercommunity/backdrop) <!--stargazers:fluttercommunity/backdrop--> - [Backdrop](https://material.io/design/components/backdrop.html) implementation for flutter.
+#### Cupertino Design
+
+- [Peek & Pop](https://github.com/aliyigitbireroglu/flutter-peek-and-pop) <!--stargazers:aliyigitbireroglu/flutter-peek-and-pop--> - Peek & Pop implementation based on the iOS functionality by [Ali Yigit Bireroglu](https://github.com/aliyigitbireroglu)
+
#### Effect
- [Frosted Glass](http://stackoverflow.com/questions/43550853/how-do-i-do-the-frosted-glass-effect-in-flutter) - Render effect by [Collin Jackson](http://www.collinjackson.com)
@@ -250,6 +254,11 @@ If you appreciate the content 📖, support projects visibility, give 👍| ⭐|
- [Table Calendar](https://github.com/aleksanderwozniak/table_calendar) <!--stargazers:aleksanderwozniak/table_calendar--> - Calendar organized neatly into a Table, with vertical autosizing by [Aleksander Woźniak](https://github.com/aleksanderwozniak)
- [Time Planner](https://github.com/Jamalianpour/time_planner) <!--stargazers:Jamalianpour/time_planner--> - A beautiful, easy to use and customizable time planner for flutter mobile, desktop and web by [Mohammad Jamalianpour](https://github.com/Jamalianpour)
+
+#### Login
+
+- [Flutter Login](https://github.com/NearHuscarl/flutter_login) - Login widget with slick animation from start to finish by [NearHuscarl](https://github.com/NearHuscarl)
+
#### Backend-Driven
- [Dynamic Widget](https://github.com/dengyin2000/dynamic_widget) <!--stargazers:dengyin2000/dynamic_widget--> - Build your dynamic UI with json, and the json format is very similar with flutter widget code by [Denny Deng](https://github.com/dengyin2000).
Repository Name: awesome-flutter | Owner: solido | Primary Language: Dart | Language: Dart | Stars: 54,974 | Forks: 6,726
Description: An awesome list that curates the best Flutter libraries, tools, tutorials, articles and more.
Repository: solido_awesome-flutter | type: DOC_CHANGE | Comment: Obvious

Hash: 1b4ada1bd116cd5c50592e8462356f978a394c57
Date: 2023-03-22 20:41:01
Author: schochastics
commit_message: urlchecker GHA test 3
IsMerge: false | Additions: 4 | Deletions: 5 | Total Changes: 9
git_diff:
--- .github/workflows/urlchecker.yml
@@ -1,14 +1,15 @@
name: URLChecker
on:
schedule:
- - cron: '12 15 22 * *'
+ - cron: '08 15 22 * *'
jobs:
- build:
+ check-urls:
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v3
- - name: urls-checker
+ - name: Checkout Actions Repository
+ uses: actions/checkout@v2
+ - name: Test GitHub Action
uses: urlstechie/urlchecker-action@master
with:
file_types: .md
Repository Name: awesome-computational-social-science | Owner: gesiscss | Primary Language: R | Language: R | Stars: 648 | Forks: 83
Description: A list of awesome resources for Computational Social Science
Repository: gesiscss_awesome-computational-social-science | type: CONFIG_CHANGE | Comment: Only config file changes have been made.

Hash: 9bb10d27e7072cec919d38f0b987c70e598039ad
Date: 2024-12-07 04:22:09
Author: Patrick Steinhardt
commit_message: Makefile: generate "git.rc" via GIT-VERSION-GEN The "git.rc" is used on Windows to embed information like the project name and version into the resulting executables. As such we need to inject the version information, which we do by using preprocessor defines. The logic to do so is non-trivial and needs to be kept in sync with the different build systems. Refactor the logic so that we generate "git.rc" via `GIT-VERSION-GEN`. Signed-off-by: Patrick Steinhardt <[email protected]> Signed-off-by: Junio C Hamano <[email protected]>
IsMerge: false | Additions: 51 | Deletions: 38 | Total Changes: 89
git_diff:
--- .gitignore
@@ -199,7 +199,6 @@
*.tar.gz
*.dsc
*.deb
-/git.rc
/git.spec
*.exe
*.[aos]
--- GIT-VERSION-GEN
@@ -58,18 +58,14 @@ then
GIT_USER_AGENT="git/$GIT_VERSION"
fi
-# While released Git versions only have three numbers, development builds also
-# have a fourth number that corresponds to the number of patches since the last
-# release.
-read GIT_MAJOR_VERSION GIT_MINOR_VERSION GIT_MICRO_VERSION GIT_PATCH_LEVEL trailing <<EOF
-$(echo "$GIT_VERSION" 0 0 0 0 | tr '.a-zA-Z-' ' ')
+read GIT_MAJOR_VERSION GIT_MINOR_VERSION GIT_MICRO_VERSION trailing <<EOF
+$(echo "$GIT_VERSION" 0 0 0 | tr '.a-zA-Z-' ' ')
EOF
sed -e "s|@GIT_VERSION@|$GIT_VERSION|" \
-e "s|@GIT_MAJOR_VERSION@|$GIT_MAJOR_VERSION|" \
-e "s|@GIT_MINOR_VERSION@|$GIT_MINOR_VERSION|" \
-e "s|@GIT_MICRO_VERSION@|$GIT_MICRO_VERSION|" \
- -e "s|@GIT_PATCH_LEVEL@|$GIT_PATCH_LEVEL|" \
-e "s|@GIT_BUILT_FROM_COMMIT@|$GIT_BUILT_FROM_COMMIT|" \
-e "s|@GIT_USER_AGENT@|$GIT_USER_AGENT|" \
"$INPUT" >"$OUTPUT"+
--- Makefile
@@ -2568,12 +2568,11 @@ $(SCRIPT_LIB) : % : %.sh GIT-SCRIPT-DEFINES
$(QUIET_GEN)$(cmd_munge_script) && \
mv $@+ $@
-git.rc: git.rc.in GIT-VERSION-GEN GIT-VERSION-FILE
- $(QUIET_GEN)$(SHELL_PATH) ./GIT-VERSION-GEN "$(shell pwd)" $< $@+
- @if cmp $@+ $@ >/dev/null 2>&1; then $(RM) $@+; else mv $@+ $@; fi
-
-git.res: git.rc GIT-PREFIX
- $(QUIET_RC)$(RC) -i $< -o $@
+git.res: git.rc GIT-VERSION-FILE GIT-PREFIX
+ $(QUIET_RC)$(RC) \
+ $(join -DMAJOR= -DMINOR= -DMICRO= -DPATCHLEVEL=, $(wordlist 1, 4, \
+ $(shell echo $(GIT_VERSION) 0 0 0 0 | tr '.a-zA-Z-' ' '))) \
+ -DGIT_VERSION="\\\"$(GIT_VERSION)\\\"" -i $< -o $@
# This makes sure we depend on the NO_PERL setting itself.
$(SCRIPT_PERL_GEN): GIT-BUILD-OPTIONS
@@ -3718,7 +3717,7 @@ clean: profile-clean coverage-clean cocciclean
$(RM) -r .build $(UNIT_TEST_BIN)
$(RM) GIT-TEST-SUITES
$(RM) po/git.pot po/git-core.pot
- $(RM) git.rc git.res
+ $(RM) git.res
$(RM) $(OBJECTS)
$(RM) headless-git.o
$(RM) $(LIB_FILE) $(XDIFF_LIB) $(REFTABLE_LIB)
--- contrib/buildsystems/CMakeLists.txt
@@ -691,25 +691,18 @@ list(TRANSFORM reftable_SOURCES PREPEND "${CMAKE_SOURCE_DIR}/")
add_library(reftable STATIC ${reftable_SOURCES})
if(WIN32)
- add_custom_command(OUTPUT ${CMAKE_BINARY_DIR}/git.rc
- COMMAND "${SH_EXE}" "${CMAKE_SOURCE_DIR}/GIT-VERSION-GEN"
- "${CMAKE_SOURCE_DIR}"
- "${CMAKE_SOURCE_DIR}/git.rc.in"
- "${CMAKE_BINARY_DIR}/git.rc"
- DEPENDS "${CMAKE_SOURCE_DIR}/GIT-VERSION-GEN"
- "${CMAKE_SOURCE_DIR}/git.rc.in"
- VERBATIM)
-
if(NOT MSVC)#use windres when compiling with gcc and clang
add_custom_command(OUTPUT ${CMAKE_BINARY_DIR}/git.res
- COMMAND ${WINDRES_EXE} -O coff -i ${CMAKE_BINARY_DIR}/git.rc -o ${CMAKE_BINARY_DIR}/git.res
- DEPENDS "${CMAKE_BINARY_DIR}/git.rc"
+ COMMAND ${WINDRES_EXE} -O coff -DMAJOR=${PROJECT_VERSION_MAJOR} -DMINOR=${PROJECT_VERSION_MINOR}
+ -DMICRO=${PROJECT_VERSION_PATCH} -DPATCHLEVEL=0 -DGIT_VERSION="\\\"${PROJECT_VERSION}.GIT\\\""
+ -i ${CMAKE_SOURCE_DIR}/git.rc -o ${CMAKE_BINARY_DIR}/git.res
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
VERBATIM)
else()#MSVC use rc
add_custom_command(OUTPUT ${CMAKE_BINARY_DIR}/git.res
- COMMAND ${CMAKE_RC_COMPILER} /fo ${CMAKE_BINARY_DIR}/git.res ${CMAKE_BINARY_DIR}/git.rc
- DEPENDS "${CMAKE_BINARY_DIR}/git.rc"
+ COMMAND ${CMAKE_RC_COMPILER} /d MAJOR=${PROJECT_VERSION_MAJOR} /d MINOR=${PROJECT_VERSION_MINOR}
+ /d MICRO=${PROJECT_VERSION_PATCH} /d PATCHLEVEL=0 /d GIT_VERSION="${PROJECT_VERSION}.GIT"
+ /fo ${CMAKE_BINARY_DIR}/git.res ${CMAKE_SOURCE_DIR}/git.rc
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
VERBATIM)
endif()
--- git.rc.in
@@ -1,6 +1,6 @@
1 VERSIONINFO
-FILEVERSION @GIT_MAJOR_VERSION@,@GIT_MINOR_VERSION@,@GIT_MICRO_VERSION@,@GIT_PATCH_LEVEL@
-PRODUCTVERSION @GIT_MAJOR_VERSION@,@GIT_MINOR_VERSION@,@GIT_MICRO_VERSION@,@GIT_PATCH_LEVEL@
+FILEVERSION MAJOR,MINOR,MICRO,PATCHLEVEL
+PRODUCTVERSION MAJOR,MINOR,MICRO,PATCHLEVEL
BEGIN
BLOCK "StringFileInfo"
BEGIN
@@ -11,7 +11,7 @@ BEGIN
VALUE "InternalName", "git\0"
VALUE "OriginalFilename", "git.exe\0"
VALUE "ProductName", "Git\0"
- VALUE "ProductVersion", "@GIT_VERSION@\0"
+ VALUE "ProductVersion", GIT_VERSION "\0"
END
END
Repository Name: git | Owner: null | Primary Language: C | Language: C | Stars: null | Forks: null
Description: Version control
Repository: _git | type: NEW_FEAT | Comment: Introduce a new functionality

Hash: 501397cb6186700203637a37389142b6e9cdaefa
Date: null
Author: David Luzar
commit_message: fix: disable locking aspect ratio for box-selection (#5525)
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 0
git_diff:
--- dragElements.ts
@@ -105,7 +105,7 @@ export const dragNewElement = (
true */
widthAspectRatio?: number | null,
) => {
- if (shouldMaintainAspectRatio) {
+ if (shouldMaintainAspectRatio && draggingElement.type !== "selection") {
if (widthAspectRatio) {
height = width / widthAspectRatio;
} else {
Repository Name: excalidraw_excalidraw.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null
Description: null
Repository: excalidraw_excalidraw.json | type: BUG_FIX | Comment: 5, obvious

Hash: 3df30e7ac694fb571dd1e6af71fe0514e720a6de
Date: 2025-03-24 15:14:30
Author: vladislavgrecko
commit_message: [JVM] Refactor inlining code: get rid of `InlineCodegen` After removal of the old backend, no reason left to split the code between `InlineCodegen` and `IrInlineCodegen`.
IsMerge: false | Additions: 325 | Deletions: 362 | Total Changes: 687
git_diff:
--- compiler/backend/src/org/jetbrains/kotlin/codegen/inline/InlineCodegen.kt
@@ -0,0 +1,311 @@
+/*
+ * Copyright 2010-2019 JetBrains s.r.o. and Kotlin Programming Language contributors.
+ * Use of this source code is governed by the Apache 2.0 license that can be found in the license/LICENSE.txt file.
+ */
+
+package org.jetbrains.kotlin.codegen.inline
+
+import com.intellij.psi.PsiElement
+import org.jetbrains.kotlin.codegen.*
+import org.jetbrains.kotlin.codegen.AsmUtil.isPrimitive
+import org.jetbrains.kotlin.codegen.state.GenerationState
+import org.jetbrains.kotlin.resolve.jvm.jvmSignature.JvmMethodSignature
+import org.jetbrains.kotlin.types.KotlinType
+import org.jetbrains.kotlin.utils.exceptions.rethrowIntellijPlatformExceptionIfNeeded
+import org.jetbrains.org.objectweb.asm.Label
+import org.jetbrains.org.objectweb.asm.Opcodes
+import org.jetbrains.org.objectweb.asm.Type
+import org.jetbrains.org.objectweb.asm.tree.*
+import kotlin.math.max
+
+abstract class InlineCodegen<out T : BaseExpressionCodegen>(
+ protected val codegen: T,
+ protected val state: GenerationState,
+ protected val jvmSignature: JvmMethodSignature,
+ private val typeParameterMappings: TypeParameterMappings<*>,
+ protected val sourceCompiler: SourceCompilerForInline,
+ private val reifiedTypeInliner: ReifiedTypeInliner<*>
+) {
+ private val initialFrameSize = codegen.frameMap.currentSize
+
+ protected val invocationParamBuilder = ParametersBuilder.newBuilder()
+ private val maskValues = ArrayList<Int>()
+ private var maskStartIndex = -1
+ private var methodHandleInDefaultMethodIndex = -1
+
+ protected fun generateStub(text: String, codegen: BaseExpressionCodegen) {
+ leaveTemps()
+ AsmUtil.genThrow(codegen.visitor, "java/lang/UnsupportedOperationException", "Call is part of inline cycle: $text")
+ }
+
+ fun compileInline(): SMAPAndMethodNode {
+ return sourceCompiler.compileInlineFunction(jvmSignature).apply {
+ node.preprocessSuspendMarkers(forInline = true, keepFakeContinuation = false)
+ }
+ }
+
+ fun performInline(registerLineNumberAfterwards: Boolean, isInlineOnly: Boolean) {
+ var nodeAndSmap: SMAPAndMethodNode? = null
+ try {
+ nodeAndSmap = compileInline()
+ val result = inlineCall(nodeAndSmap, isInlineOnly)
+ leaveTemps()
+ codegen.propagateChildReifiedTypeParametersUsages(result.reifiedTypeParametersUsages)
+ codegen.markLineNumberAfterInlineIfNeeded(registerLineNumberAfterwards)
+ state.factory.removeClasses(result.calcClassesToRemove())
+ } catch (e: CompilationException) {
+ throw e
+ } catch (e: InlineException) {
+ throw CompilationException(
+ "Couldn't inline method call: ${sourceCompiler.callElementText}",
+ e, sourceCompiler.callElement as? PsiElement
+ )
+ } catch (e: Exception) {
+ rethrowIntellijPlatformExceptionIfNeeded(e)
+ throw CompilationException(
+ "Couldn't inline method call: ${sourceCompiler.callElementText}\nMethod: ${nodeAndSmap?.node?.nodeText}",
+ e, sourceCompiler.callElement as? PsiElement
+ )
+ }
+ }
+
+ private fun inlineCall(nodeAndSmap: SMAPAndMethodNode, isInlineOnly: Boolean): InlineResult {
+ val node = nodeAndSmap.node
+ if (maskStartIndex != -1) {
+ val parameters = invocationParamBuilder.buildParameters()
+ val infos = expandMaskConditionsAndUpdateVariableNodes(
+ node, maskStartIndex, maskValues, methodHandleInDefaultMethodIndex,
+ parameters.parameters.filter { it.functionalArgument === DefaultValueOfInlineParameter }
+ .mapTo<_, _, MutableCollection<Int>>(mutableSetOf()) { parameters.getDeclarationSlot(it) }
+ )
+ for (info in infos) {
+ val lambda = DefaultLambda(info, sourceCompiler, node.name.substringBeforeLast("\$default"))
+ parameters.getParameterByDeclarationSlot(info.offset).functionalArgument = lambda
+ if (info.needReification) {
+ lambda.reifiedTypeParametersUsages.mergeAll(reifiedTypeInliner.reifyInstructions(lambda.node.node))
+ }
+ for (captured in lambda.capturedVars) {
+ val param = invocationParamBuilder.addCapturedParam(captured, captured.fieldName, false)
+ param.remapValue = StackValue.Local(codegen.frameMap.enterTemp(param.type), param.type, null)
+ param.isSynthetic = true
+ }
+ }
+ }
+
+ val reificationResult = reifiedTypeInliner.reifyInstructions(node)
+
+ val parameters = invocationParamBuilder.buildParameters()
+
+ val info = RootInliningContext(
+ state, codegen.inlineNameGenerator.subGenerator(jvmSignature.asmMethod.name),
+ sourceCompiler, sourceCompiler.inlineCallSiteInfo, reifiedTypeInliner, typeParameterMappings,
+ codegen.inlineScopesGenerator
+ )
+
+ val sourceMapper = sourceCompiler.sourceMapper
+ val sourceInfo = sourceMapper.sourceInfo!!
+ val lastLineNumber = codegen.lastLineNumber
+ val callSite = SourcePosition(lastLineNumber, sourceInfo.sourceFileName!!, sourceInfo.pathOrCleanFQN)
+ info.inlineScopesGenerator?.apply { currentCallSiteLineNumber = lastLineNumber }
+ val inliner = MethodInliner(
+ node, parameters, info, FieldRemapper(null, null, parameters), sourceCompiler.isCallInsideSameModuleAsCallee,
+ { "Method inlining " + sourceCompiler.callElementText },
+ SourceMapCopier(sourceMapper, nodeAndSmap.classSMAP, callSite),
+ info.callSiteInfo,
+ isInlineOnlyMethod = isInlineOnly,
+ !isInlinedToInlineFunInKotlinRuntime(),
+ maskStartIndex,
+ maskStartIndex + maskValues.size,
+ ) //with captured
+
+ val remapper = LocalVarRemapper(parameters, initialFrameSize)
+
+ val adapter = createEmptyMethodNode()
+ //hack to keep linenumber info, otherwise jdi will skip begin of linenumber chain
+ adapter.visitInsn(Opcodes.NOP)
+
+ val result = inliner.doInline(adapter, remapper, true, mapOf())
+ result.reifiedTypeParametersUsages.mergeAll(reificationResult)
+
+ val infos = MethodInliner.processReturns(adapter, sourceCompiler.getContextLabels(), null)
+ generateAndInsertFinallyBlocks(
+ adapter, infos, (remapper.remap(parameters.argsSizeOnStack).value as StackValue.Local).index
+ )
+ if (!sourceCompiler.isFinallyMarkerRequired) {
+ removeFinallyMarkers(adapter)
+ }
+
+ // In case `codegen.visitor` is `<clinit>`, initializer for the `$assertionsDisabled` field
+ // needs to be inserted before the code that actually uses it.
+ if (info.generateAssertField) {
+ generateAssertField()
+ }
+
+ val shouldSpillStack = node.requiresEmptyStackOnEntry()
+ if (shouldSpillStack) {
+ addInlineMarker(codegen.visitor, true)
+ }
+ adapter.accept(MethodBodyVisitor(codegen.visitor))
+ if (shouldSpillStack) {
+ addInlineMarker(codegen.visitor, false)
+ }
+ return result
+ }
+
+ private fun generateAndInsertFinallyBlocks(
+ intoNode: MethodNode,
+ insertPoints: List<MethodInliner.PointForExternalFinallyBlocks>,
+ offsetForFinallyLocalVar: Int
+ ) {
+ if (!sourceCompiler.hasFinallyBlocks()) return
+
+ val extensionPoints = insertPoints.associateBy { it.beforeIns }
+ val processor = DefaultProcessor(intoNode, offsetForFinallyLocalVar)
+
+ var curFinallyDepth = 0
+ var curInstr: AbstractInsnNode? = intoNode.instructions.first
+ while (curInstr != null) {
+ processor.processInstruction(curInstr, true)
+ if (isFinallyStart(curInstr)) {
+ //TODO depth index calc could be more precise
+ curFinallyDepth = getConstant(curInstr.previous)
+ }
+
+ val extension = extensionPoints[curInstr]
+ if (extension != null) {
+ var nextFreeLocalIndex = processor.nextFreeLocalIndex
+ for (local in processor.localVarsMetaInfo.currentIntervals) {
+ val size = Type.getType(local.node.desc).size
+ nextFreeLocalIndex = max(offsetForFinallyLocalVar + local.node.index + size, nextFreeLocalIndex)
+ }
+
+ val start = Label()
+ val finallyNode = createEmptyMethodNode()
+ finallyNode.visitLabel(start)
+ val mark = codegen.frameMap.skipTo(nextFreeLocalIndex)
+ sourceCompiler.generateFinallyBlocks(
+ finallyNode, curFinallyDepth, extension.returnType, extension.finallyIntervalEnd.label, extension.jumpTarget
+ )
+ mark.dropTo()
+ insertNodeBefore(finallyNode, intoNode, curInstr)
+
+ val splitBy = SimpleInterval(start.info as LabelNode, extension.finallyIntervalEnd)
+ processor.tryBlocksMetaInfo.splitAndRemoveCurrentIntervals(splitBy, true)
+ processor.localVarsMetaInfo.splitAndRemoveCurrentIntervals(splitBy, true)
+ finallyNode.localVariables.forEach {
+ processor.localVarsMetaInfo.addNewInterval(LocalVarNodeWrapper(it))
+ }
+ }
+
+ curInstr = curInstr.next
+ }
+
+ processor.substituteTryBlockNodes(intoNode)
+ processor.substituteLocalVarTable(intoNode)
+ }
+
+ protected abstract fun generateAssertField()
+
+ protected abstract fun isInlinedToInlineFunInKotlinRuntime(): Boolean
+
+ protected fun rememberClosure(parameterType: Type, index: Int, lambdaInfo: LambdaInfo) {
+ invocationParamBuilder.addNextValueParameter(parameterType, true, null, index).functionalArgument = lambdaInfo
+ }
+
+ protected fun putCapturedToLocalVal(stackValue: StackValue, capturedParam: CapturedParamDesc, kotlinType: KotlinType?) {
+ val info = invocationParamBuilder.addCapturedParam(capturedParam, capturedParam.fieldName, false)
+ val asmType = info.type
+ if (stackValue.isLocalWithNoBoxing(JvmKotlinType(asmType, kotlinType))) {
+ info.remapValue = stackValue
+ } else {
+ stackValue.put(asmType, kotlinType, codegen.visitor)
+ val index = codegen.frameMap.enterTemp(asmType)
+ codegen.visitor.store(index, asmType)
+ info.remapValue = StackValue.Local(index, asmType, null)
+ info.isSynthetic = true
+ }
+ }
+
+ protected fun putArgumentToLocalVal(jvmKotlinType: JvmKotlinType, stackValue: StackValue, parameterIndex: Int, kind: ValueKind) {
+ if (kind === ValueKind.DEFAULT_MASK || kind === ValueKind.METHOD_HANDLE_IN_DEFAULT) {
+ return processDefaultMaskOrMethodHandler(stackValue, kind)
+ }
+
+ val info = when (parameterIndex) {
+ -1 -> invocationParamBuilder.addNextParameter(jvmKotlinType.type, false)
+ else -> invocationParamBuilder.addNextValueParameter(jvmKotlinType.type, false, null, parameterIndex)
+ }
+ info.functionalArgument = when (kind) {
+ ValueKind.READ_OF_INLINE_LAMBDA_FOR_INLINE_SUSPEND_PARAMETER ->
+ NonInlineArgumentForInlineSuspendParameter.INLINE_LAMBDA_AS_VARIABLE
+ ValueKind.READ_OF_OBJECT_FOR_INLINE_SUSPEND_PARAMETER ->
+ NonInlineArgumentForInlineSuspendParameter.OTHER
+ ValueKind.DEFAULT_INLINE_PARAMETER ->
+ DefaultValueOfInlineParameter
+ else -> null
+ }
+ when {
+ kind === ValueKind.DEFAULT_PARAMETER || kind === ValueKind.DEFAULT_INLINE_PARAMETER ->
+ codegen.frameMap.enterTemp(info.type) // the inline function will put the value into this slot
+ stackValue.isLocalWithNoBoxing(jvmKotlinType) ->
+ info.remapValue = stackValue
+ else -> {
+ stackValue.put(info.type, jvmKotlinType.kotlinType, codegen.visitor)
+ codegen.visitor.store(codegen.frameMap.enterTemp(info.type), info.type)
+ }
+ }
+ }
+
+ private fun leaveTemps() {
+ invocationParamBuilder.listAllParams().asReversed().forEach { param ->
+ if (!param.isSkippedOrRemapped || CapturedParamInfo.isSynthetic(param)) {
+ codegen.frameMap.leaveTemp(param.type)
+ }
+ }
+ }
+
+ private fun processDefaultMaskOrMethodHandler(value: StackValue, kind: ValueKind) {
+ assert(value is StackValue.Constant) { "Additional default method argument should be constant, but $value" }
+ val constantValue = (value as StackValue.Constant).value
+ if (kind === ValueKind.DEFAULT_MASK) {
+ assert(constantValue is Int) { "Mask should be of Integer type, but $constantValue" }
+ maskValues.add(constantValue as Int)
+ if (maskStartIndex == -1) {
+ maskStartIndex = invocationParamBuilder.listAllParams().sumOf {
+ if (it is CapturedParamInfo) 0 else it.type.size
+ }
+ }
+ } else {
+ assert(constantValue == null) { "Additional method handle for default argument should be null, but " + constantValue!! }
+ methodHandleInDefaultMethodIndex = maskStartIndex + maskValues.size
+ }
+ }
+
+ companion object {
+ private fun StackValue.isLocalWithNoBoxing(expected: JvmKotlinType): Boolean =
+ this is StackValue.Local &&
+ isPrimitive(expected.type) == isPrimitive(type) &&
+ !StackValue.requiresInlineClassBoxingOrUnboxing(type, kotlinType, expected.type, expected.kotlinType)
+
+ // Stack spilling before inline function call is required if the inlined bytecode has:
+ // 1. try-catch blocks - otherwise the stack spilling before and after them will not be correct;
+ // 2. suspension points - again, the stack spilling around them is otherwise wrong;
+ // 3. loops - OpenJDK cannot JIT-optimize between loop iterations if the stack is not empty.
+ // Instead of checking for loops precisely, we just check if there are any backward jumps -
+ // that is, a jump from instruction #i to instruction #j where j < i.
+ private fun MethodNode.requiresEmptyStackOnEntry(): Boolean = tryCatchBlocks.isNotEmpty() ||
+ instructions.any { isBeforeSuspendMarker(it) || isBeforeInlineSuspendMarker(it) || isBackwardsJump(it) }
+
+ private fun MethodNode.isBackwardsJump(insn: AbstractInsnNode): Boolean = when (insn) {
+ is JumpInsnNode -> isBackwardsJump(insn, insn.label)
+ is LookupSwitchInsnNode ->
+ insn.dflt?.let { to -> isBackwardsJump(insn, to) } == true || insn.labels.any { to -> isBackwardsJump(insn, to) }
+ is TableSwitchInsnNode ->
+ insn.dflt?.let { to -> isBackwardsJump(insn, to) } == true || insn.labels.any { to -> isBackwardsJump(insn, to) }
+ else -> false
+ }
+
+ private fun MethodNode.isBackwardsJump(from: AbstractInsnNode, to: LabelNode): Boolean =
+ instructions.indexOf(to) < instructions.indexOf(from)
+ }
+}
--- compiler/backend/src/org/jetbrains/kotlin/codegen/inline/inlineCodegenUtils.kt
@@ -440,7 +440,8 @@ fun getConstant(ins: AbstractInsnNode): Int {
}
}
}
-fun removeFinallyMarkers(intoNode: MethodNode) {
+
+internal fun removeFinallyMarkers(intoNode: MethodNode) {
val instructions = intoNode.instructions
var curInstr: AbstractInsnNode? = instructions.first
while (curInstr != null) {
@@ -523,9 +524,9 @@ private fun InstructionAdapter.emitInlineMarker(id: Int) {
)
}
-fun isBeforeSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_BEFORE_SUSPEND_ID)
+internal fun isBeforeSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_BEFORE_SUSPEND_ID)
internal fun isAfterSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_AFTER_SUSPEND_ID)
-fun isBeforeInlineSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_BEFORE_INLINE_SUSPEND_ID)
+internal fun isBeforeInlineSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_BEFORE_INLINE_SUSPEND_ID)
internal fun isAfterInlineSuspendMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_AFTER_INLINE_SUSPEND_ID)
internal fun isReturnsUnitMarker(insn: AbstractInsnNode) = isSuspendMarker(insn, INLINE_MARKER_RETURNS_UNIT)
internal fun isFakeContinuationMarker(insn: AbstractInsnNode) =
--- compiler/ir/backend.jvm/codegen/src/org/jetbrains/kotlin/backend/jvm/codegen/IrInlineCodegen.kt
@@ -5,25 +5,17 @@
package org.jetbrains.kotlin.backend.jvm.codegen
-import com.intellij.psi.PsiElement
import org.jetbrains.kotlin.backend.jvm.ir.isInlineOnly
import org.jetbrains.kotlin.backend.jvm.ir.isInlineParameter
import org.jetbrains.kotlin.backend.jvm.ir.unwrapInlineLambda
import org.jetbrains.kotlin.backend.jvm.localClassType
import org.jetbrains.kotlin.backend.jvm.mapping.IrCallableMethod
import org.jetbrains.kotlin.builtins.StandardNames
-import org.jetbrains.kotlin.codegen.AsmUtil.genThrow
-import org.jetbrains.kotlin.codegen.AsmUtil.isPrimitive
-import org.jetbrains.kotlin.codegen.CompilationException
import org.jetbrains.kotlin.codegen.IrExpressionLambda
import org.jetbrains.kotlin.codegen.JvmKotlinType
import org.jetbrains.kotlin.codegen.StackValue
import org.jetbrains.kotlin.codegen.ValueKind
import org.jetbrains.kotlin.codegen.inline.*
-import org.jetbrains.kotlin.codegen.inline.CapturedParamInfo
-import org.jetbrains.kotlin.codegen.inline.SMAPAndMethodNode
-import org.jetbrains.kotlin.codegen.inline.nodeText
-import org.jetbrains.kotlin.codegen.inline.preprocessSuspendMarkers
import org.jetbrains.kotlin.codegen.state.GenerationState
import org.jetbrains.kotlin.ir.declarations.IrDeclarationOrigin
import org.jetbrains.kotlin.ir.declarations.IrFunction
@@ -39,71 +31,39 @@ import org.jetbrains.kotlin.ir.util.isSuspendFunction
import org.jetbrains.kotlin.resolve.jvm.AsmTypes
import org.jetbrains.kotlin.resolve.jvm.jvmSignature.JvmMethodSignature
import org.jetbrains.kotlin.types.KotlinType
-import org.jetbrains.kotlin.utils.exceptions.rethrowIntellijPlatformExceptionIfNeeded
-import org.jetbrains.org.objectweb.asm.Label
-import org.jetbrains.org.objectweb.asm.Opcodes
import org.jetbrains.org.objectweb.asm.Type
import org.jetbrains.org.objectweb.asm.commons.Method
-import org.jetbrains.org.objectweb.asm.tree.AbstractInsnNode
-import org.jetbrains.org.objectweb.asm.tree.JumpInsnNode
-import org.jetbrains.org.objectweb.asm.tree.LabelNode
-import org.jetbrains.org.objectweb.asm.tree.LookupSwitchInsnNode
-import org.jetbrains.org.objectweb.asm.tree.MethodNode
-import org.jetbrains.org.objectweb.asm.tree.TableSwitchInsnNode
-import kotlin.math.max
class IrInlineCodegen(
- private val codegen: ExpressionCodegen,
- private val state: GenerationState,
+ codegen: ExpressionCodegen,
+ state: GenerationState,
private val function: IrFunction,
- private val jvmSignature: JvmMethodSignature,
- private val typeParameterMappings: TypeParameterMappings<IrType>,
- private val sourceCompiler: SourceCompilerForInline,
- private val reifiedTypeInliner: ReifiedTypeInliner<IrType>,
-) : IrInlineCallGenerator {
+ signature: JvmMethodSignature,
+ typeParameterMappings: TypeParameterMappings<IrType>,
+ sourceCompiler: SourceCompilerForInline,
+ reifiedTypeInliner: ReifiedTypeInliner<IrType>
+) :
+ InlineCodegen<ExpressionCodegen>(codegen, state, signature, typeParameterMappings, sourceCompiler, reifiedTypeInliner),
+ IrInlineCallGenerator {
private val inlineArgumentsInPlace = canInlineArgumentsInPlace()
- private val invocationParamBuilder = ParametersBuilder.newBuilder()
- private val maskValues = ArrayList<Int>()
- private var maskStartIndex = -1
- private var methodHandleInDefaultMethodIndex = -1
- private val initialFrameSize = codegen.frameMap.currentSize
- override fun genInlineCall(
- callableMethod: IrCallableMethod,
- codegen: ExpressionCodegen,
- expression: IrFunctionAccessExpression,
- isInsideIfCondition: Boolean,
- ) {
- var nodeAndSmap: SMAPAndMethodNode? = null
- try {
- nodeAndSmap = sourceCompiler.compileInlineFunction(jvmSignature).apply {
- node.preprocessSuspendMarkers(forInline = true, keepFakeContinuation = false)
- }
- val result = inlineCall(nodeAndSmap, function.isInlineOnly())
- leaveTemps()
- codegen.propagateChildReifiedTypeParametersUsages(result.reifiedTypeParametersUsages)
- codegen.markLineNumberAfterInlineIfNeeded(isInsideIfCondition)
- state.factory.removeClasses(result.calcClassesToRemove())
- } catch (e: CompilationException) {
- throw e
- } catch (e: InlineException) {
- throw CompilationException(
- "Couldn't inline method call: ${sourceCompiler.callElementText}",
- e, sourceCompiler.callElement as? PsiElement
- )
- } catch (e: Exception) {
- rethrowIntellijPlatformExceptionIfNeeded(e)
- throw CompilationException(
- "Couldn't inline method call: ${sourceCompiler.callElementText}\nMethod: ${nodeAndSmap?.node?.nodeText}",
- e, sourceCompiler.callElement as? PsiElement
- )
- }
- }
+ private fun canInlineArgumentsInPlace(): Boolean {
+ if (!function.isInlineOnly())
+ return false
- override fun genCycleStub(text: String, codegen: ExpressionCodegen) {
- leaveTemps()
- genThrow(codegen.visitor, "java/lang/UnsupportedOperationException", "Call is part of inline cycle: $text")
+ var actualParametersCount = function.valueParameters.size
+ if (function.dispatchReceiverParameter != null)
+ ++actualParametersCount
+ if (function.extensionReceiverParameter != null)
+ ++actualParametersCount
+ if (actualParametersCount == 0)
+ return false
+
+ if (function.valueParameters.any { it.isInlineParameter() })
+ return false
+
+ return canInlineArgumentsInPlace(sourceCompiler.compileInlineFunction(jvmSignature).node)
}
override fun beforeCallStart() {
@@ -118,22 +78,29 @@ class IrInlineCodegen(
}
}
+ override fun generateAssertField() {
+ // May be inlining code into `<clinit>`, in which case it's too late to modify the IR and
+ // `generateAssertFieldIfNeeded` will return a statement for which we need to emit bytecode.
+ val isClInit = sourceCompiler.inlineCallSiteInfo.method.name == "<clinit>"
+ codegen.classCodegen.generateAssertFieldIfNeeded(isClInit)?.accept(codegen, BlockInfo())?.discard()
+ }
+
+ override fun isInlinedToInlineFunInKotlinRuntime(): Boolean {
+ val callee = codegen.irFunction
+ return callee.isInline && callee.getPackageFragment().packageFqName.startsWith(StandardNames.BUILT_INS_PACKAGE_NAME)
+ }
+
override fun genValueAndPut(
irValueParameter: IrValueParameter,
argumentExpression: IrExpression,
parameterType: Type,
codegen: ExpressionCodegen,
- blockInfo: BlockInfo,
+ blockInfo: BlockInfo
) {
val inlineLambda = argumentExpression.unwrapInlineLambda()
if (inlineLambda != null) {
val lambdaInfo = IrExpressionLambdaImpl(codegen, inlineLambda)
- invocationParamBuilder.addNextValueParameter(
- parameterType,
- true,
- null,
- irValueParameter.indexInOldValueParameters
- ).functionalArgument = lambdaInfo
+ rememberClosure(parameterType, irValueParameter.indexInOldValueParameters, lambdaInfo)
lambdaInfo.generateLambdaBody(sourceCompiler)
lambdaInfo.reference.getArgumentsWithIr().forEachIndexed { index, (_, ir) ->
val param = lambdaInfo.capturedVars[index]
@@ -189,263 +156,21 @@ class IrInlineCodegen(
}
val expectedType = JvmKotlinType(parameterType, irValueParameter.type.toIrBasedKotlinType())
- val parameterIndex = irValueParameter.indexInOldValueParameters
-
- if (kind === ValueKind.DEFAULT_MASK || kind === ValueKind.METHOD_HANDLE_IN_DEFAULT) {
- assert(onStack is StackValue.Constant) { "Additional default method argument should be constant, but $onStack" }
- val constantValue = (onStack as StackValue.Constant).value
- if (kind === ValueKind.DEFAULT_MASK) {
- assert(constantValue is Int) { "Mask should be of Integer type, but $constantValue" }
- maskValues.add(constantValue as Int)
- if (maskStartIndex == -1) {
- maskStartIndex = invocationParamBuilder.listAllParams().sumOf<ParameterInfo> {
- if (it is CapturedParamInfo) 0 else it.type.size
- }
- }
- } else {
- assert(constantValue == null) { "Additional method handle for default argument should be null, but " + constantValue!! }
- methodHandleInDefaultMethodIndex = maskStartIndex + maskValues.size
- }
- return
- }
-
- val info = when (parameterIndex) {
- -1 -> invocationParamBuilder.addNextParameter(expectedType.type, false)
- else -> invocationParamBuilder.addNextValueParameter(expectedType.type, false, null, parameterIndex)
- }
-
- info.functionalArgument = when (kind) {
- ValueKind.READ_OF_INLINE_LAMBDA_FOR_INLINE_SUSPEND_PARAMETER ->
- NonInlineArgumentForInlineSuspendParameter.INLINE_LAMBDA_AS_VARIABLE
- ValueKind.READ_OF_OBJECT_FOR_INLINE_SUSPEND_PARAMETER ->
- NonInlineArgumentForInlineSuspendParameter.OTHER
- ValueKind.DEFAULT_INLINE_PARAMETER ->
- DefaultValueOfInlineParameter
- else -> null
- }
-
- when {
- kind === ValueKind.DEFAULT_PARAMETER || kind === ValueKind.DEFAULT_INLINE_PARAMETER ->
- codegen.frameMap.enterTemp(info.type) // the inline function will put the value into this slot
- onStack.isLocalWithNoBoxing(expectedType) ->
- info.remapValue = onStack
- else -> {
- onStack.put(info.type, expectedType.kotlinType, codegen.visitor)
- codegen.visitor.store(codegen.frameMap.enterTemp(info.type), info.type)
- }
- }
- }
- }
-
- private fun isInlinedToInlineFunInKotlinRuntime(): Boolean {
- val callee = codegen.irFunction
- return callee.isInline && callee.getPackageFragment().packageFqName.startsWith(StandardNames.BUILT_INS_PACKAGE_NAME)
- }
-
- private fun canInlineArgumentsInPlace(): Boolean {
- if (!function.isInlineOnly())
- return false
-
- var actualParametersCount = function.valueParameters.size
- if (function.dispatchReceiverParameter != null)
- ++actualParametersCount
- if (function.extensionReceiverParameter != null)
- ++actualParametersCount
- if (actualParametersCount == 0)
- return false
-
- if (function.valueParameters.any { it.isInlineParameter() })
- return false
-
- return canInlineArgumentsInPlace(sourceCompiler.compileInlineFunction(jvmSignature).node)
- }
-
- private fun inlineCall(nodeAndSmap: SMAPAndMethodNode, isInlineOnly: Boolean): InlineResult {
- val node = nodeAndSmap.node
- if (maskStartIndex != -1) {
- val parameters = invocationParamBuilder.buildParameters()
- val infos = expandMaskConditionsAndUpdateVariableNodes(
- node, maskStartIndex, maskValues, methodHandleInDefaultMethodIndex,
- parameters.parameters.filter { it.functionalArgument === DefaultValueOfInlineParameter }
- .mapTo<_, _, MutableCollection<Int>>(mutableSetOf()) { parameters.getDeclarationSlot(it) }
- )
- for (info in infos) {
- val lambda = DefaultLambda(info, sourceCompiler, node.name.substringBeforeLast("\$default"))
- parameters.getParameterByDeclarationSlot(info.offset).functionalArgument = lambda
- if (info.needReification) {
- lambda.reifiedTypeParametersUsages.mergeAll(reifiedTypeInliner.reifyInstructions(lambda.node.node))
- }
- for (captured in lambda.capturedVars) {
- val param = invocationParamBuilder.addCapturedParam(captured, captured.fieldName, false)
- param.remapValue = StackValue.Local(codegen.frameMap.enterTemp(param.type), param.type, null)
- param.isSynthetic = true
- }
- }
+ putArgumentToLocalVal(expectedType, onStack, irValueParameter.indexInOldValueParameters, kind)
}
-
- val reificationResult = reifiedTypeInliner.reifyInstructions(node)
-
- val parameters = invocationParamBuilder.buildParameters()
-
- val info = RootInliningContext(
- state, codegen.inlineNameGenerator.subGenerator(jvmSignature.asmMethod.name),
- sourceCompiler, sourceCompiler.inlineCallSiteInfo, reifiedTypeInliner, typeParameterMappings,
- codegen.inlineScopesGenerator
- )
-
- val sourceMapper = sourceCompiler.sourceMapper
- val sourceInfo = sourceMapper.sourceInfo!!
- val lastLineNumber = codegen.lastLineNumber
- val callSite = SourcePosition(lastLineNumber, sourceInfo.sourceFileName!!, sourceInfo.pathOrCleanFQN)
- info.inlineScopesGenerator?.apply { currentCallSiteLineNumber = lastLineNumber }
- val inliner = MethodInliner(
- node, parameters, info, FieldRemapper(null, null, parameters), sourceCompiler.isCallInsideSameModuleAsCallee,
- { "Method inlining " + sourceCompiler.callElementText },
- SourceMapCopier(sourceMapper, nodeAndSmap.classSMAP, callSite),
- info.callSiteInfo,
- isInlineOnlyMethod = isInlineOnly,
- !isInlinedToInlineFunInKotlinRuntime(),
- maskStartIndex,
- maskStartIndex + maskValues.size,
- ) //with captured
-
- val remapper = LocalVarRemapper(parameters, initialFrameSize)
-
- val adapter = createEmptyMethodNode()
- //hack to keep linenumber info, otherwise jdi will skip begin of linenumber chain
- adapter.visitInsn(Opcodes.NOP)
-
- val result = inliner.doInline(adapter, remapper, true, mapOf())
- result.reifiedTypeParametersUsages.mergeAll(reificationResult)
-
- val infos = MethodInliner.processReturns(adapter, sourceCompiler.getContextLabels(), null)
- generateAndInsertFinallyBlocks(
- adapter, infos, (remapper.remap(parameters.argsSizeOnStack).value as StackValue.Local).index
- )
- if (!sourceCompiler.isFinallyMarkerRequired) {
- removeFinallyMarkers(adapter)
- }
-
- // In case `codegen.visitor` is `<clinit>`, initializer for the `$assertionsDisabled` field
- // needs to be inserted before the code that actually uses it.
- if (info.generateAssertField) {
- // May be inlining code into `<clinit>`, in which case it's too late to modify the IR and
- // `generateAssertFieldIfNeeded` will return a statement for which we need to emit bytecode.
- val isClInit = sourceCompiler.inlineCallSiteInfo.method.name == "<clinit>"
- codegen.classCodegen.generateAssertFieldIfNeeded(isClInit)?.accept(codegen, BlockInfo())?.discard()
- }
-
- val shouldSpillStack = node.requiresEmptyStackOnEntry()
- if (shouldSpillStack) {
- addInlineMarker(codegen.visitor, true)
- }
- adapter.accept(MethodBodyVisitor(codegen.visitor))
- if (shouldSpillStack) {
- addInlineMarker(codegen.visitor, false)
- }
- return result
}
- private fun generateAndInsertFinallyBlocks(
- intoNode: MethodNode,
- insertPoints: List<MethodInliner.PointForExternalFinallyBlocks>,
- offsetForFinallyLocalVar: Int,
+ override fun genInlineCall(
+ callableMethod: IrCallableMethod,
+ codegen: ExpressionCodegen,
+ expression: IrFunctionAccessExpression,
+ isInsideIfCondition: Boolean,
) {
- if (!sourceCompiler.hasFinallyBlocks()) return
-
- val extensionPoints = insertPoints.associateBy { it.beforeIns }
- val processor = DefaultProcessor(intoNode, offsetForFinallyLocalVar)
-
- var curFinallyDepth = 0
- var curInstr: AbstractInsnNode? = intoNode.instructions.first
- while (curInstr != null) {
- processor.processInstruction(curInstr, true)
- if (isFinallyStart(curInstr)) {
- //TODO depth index calc could be more precise
- curFinallyDepth = getConstant(curInstr.previous)
- }
-
- val extension = extensionPoints[curInstr]
- if (extension != null) {
- var nextFreeLocalIndex = processor.nextFreeLocalIndex
- for (local in processor.localVarsMetaInfo.currentIntervals) {
- val size = Type.getType(local.node.desc).size
- nextFreeLocalIndex = max(offsetForFinallyLocalVar + local.node.index + size, nextFreeLocalIndex)
- }
-
- val start = Label()
- val finallyNode = createEmptyMethodNode()
- finallyNode.visitLabel(start)
- val mark = codegen.frameMap.skipTo(nextFreeLocalIndex)
- sourceCompiler.generateFinallyBlocks(
- finallyNode, curFinallyDepth, extension.returnType, extension.finallyIntervalEnd.label, extension.jumpTarget
- )
- mark.dropTo()
- insertNodeBefore(finallyNode, intoNode, curInstr)
-
- val splitBy = SimpleInterval(start.info as LabelNode, extension.finallyIntervalEnd)
- processor.tryBlocksMetaInfo.splitAndRemoveCurrentIntervals(splitBy, true)
- processor.localVarsMetaInfo.splitAndRemoveCurrentIntervals(splitBy, true)
- finallyNode.localVariables.forEach {
- processor.localVarsMetaInfo.addNewInterval(LocalVarNodeWrapper(it))
- }
- }
-
- curInstr = curInstr.next
- }
-
- processor.substituteTryBlockNodes(intoNode)
- processor.substituteLocalVarTable(intoNode)
- }
-
- private fun leaveTemps() {
- invocationParamBuilder.listAllParams().asReversed().forEach { param ->
- if (!param.isSkippedOrRemapped || CapturedParamInfo.isSynthetic(param)) {
- codegen.frameMap.leaveTemp(param.type)
- }
- }
- }
-
- private fun putCapturedToLocalVal(stackValue: StackValue, capturedParam: CapturedParamDesc, kotlinType: KotlinType?) {
- val info = invocationParamBuilder.addCapturedParam(capturedParam, capturedParam.fieldName, false)
- val asmType = info.type
- if (stackValue.isLocalWithNoBoxing(JvmKotlinType(asmType, kotlinType))) {
- info.remapValue = stackValue
- } else {
- stackValue.put(asmType, kotlinType, codegen.visitor)
- val index = codegen.frameMap.enterTemp(asmType)
- codegen.visitor.store(index, asmType)
- info.remapValue = StackValue.Local(index, asmType, null)
- info.isSynthetic = true
- }
+ performInline(isInsideIfCondition, function.isInlineOnly())
}
- companion object {
- private fun StackValue.isLocalWithNoBoxing(expected: JvmKotlinType): Boolean =
- this is StackValue.Local &&
- isPrimitive(expected.type) == isPrimitive(type) &&
- !StackValue.requiresInlineClassBoxingOrUnboxing(type, kotlinType, expected.type, expected.kotlinType)
-
- // Stack spilling before inline function call is required if the inlined bytecode has:
- // 1. try-catch blocks - otherwise the stack spilling before and after them will not be correct;
- // 2. suspension points - again, the stack spilling around them is otherwise wrong;
- // 3. loops - OpenJDK cannot JIT-optimize between loop iterations if the stack is not empty.
- // Instead of checking for loops precisely, we just check if there are any backward jumps -
- // that is, a jump from instruction #i to instruction #j where j < i.
- private fun MethodNode.requiresEmptyStackOnEntry(): Boolean = tryCatchBlocks.isNotEmpty() ||
- instructions.any { isBeforeSuspendMarker(it) || isBeforeInlineSuspendMarker(it) || isBackwardsJump(it) }
-
- private fun MethodNode.isBackwardsJump(insn: AbstractInsnNode): Boolean = when (insn) {
- is JumpInsnNode -> isBackwardsJump(insn, insn.label)
- is LookupSwitchInsnNode ->
- insn.dflt?.let { to -> isBackwardsJump(insn, to) } == true || insn.labels.any { to -> isBackwardsJump(insn, to) }
- is TableSwitchInsnNode ->
- insn.dflt?.let { to -> isBackwardsJump(insn, to) } == true || insn.labels.any { to -> isBackwardsJump(insn, to) }
- else -> false
- }
-
- private fun MethodNode.isBackwardsJump(from: AbstractInsnNode, to: LabelNode): Boolean =
- instructions.indexOf(to) < instructions.indexOf(from)
+ override fun genCycleStub(text: String, codegen: ExpressionCodegen) {
+ generateStub(text, codegen)
}
}
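The `requiresEmptyStackOnEntry`/`isBackwardsJump` logic moved in the diff above reduces to an index comparison over the instruction list: a jump from instruction #i to #j is "backwards" when j < i, which is the heuristic used to detect loops. A minimal stand-alone sketch of that idea (hypothetical instruction tuples, not the ASM `MethodNode` API):

```python
def has_backwards_jump(instructions):
    """Return True if any jump targets an earlier instruction.

    `instructions` is a list where a jump is modeled as a tuple
    ("jump", target_index) and anything else is a plain opcode string.
    This mirrors MethodNode.isBackwardsJump: a jump from #i to #j
    is "backwards" when j < i.
    """
    for i, insn in enumerate(instructions):
        if isinstance(insn, tuple) and insn[0] == "jump" and insn[1] < i:
            return True
    return False


# A loop compiles down to a conditional jump back to its header,
# while straight-line code only ever jumps forwards.
loop_body = ["load", "add", ("jump", 0)]                 # jumps back to #0
straight_line = ["load", ("jump", 3), "add", "return"]   # forward jump only
```

In the real check this is combined with try-catch blocks and suspension markers, since any of the three forces the stack to be spilled before inlining.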
| kotlin | jetbrains | Kotlin | Kotlin | 50,115 | 5,861 | The Kotlin Programming Language. | jetbrains_kotlin | CODE_IMPROVEMENT | refactoring done |
| b2b8585e63664a0c7aa18b95528e345c2738c4ae | 2023-04-07 21:21:25 | Ishan Dutta |
Add LeNet Implementation in PyTorch (#7070)
* add torch to requirements
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* add type hints
* remove file
* add type hints
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* update variable name
* add fail test
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* add newline
* reformatting
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
| false | 83 | 0 | 83 |
--- computer_vision/lenet_pytorch.py
@@ -1,82 +0,0 @@
-"""
-LeNet Network
-
-Paper: http://vision.stanford.edu/cs598_spring07/papers/Lecun98.pdf
-"""
-
-import numpy
-import torch
-import torch.nn as nn
-
-
-class LeNet(nn.Module):
- def __init__(self) -> None:
- super().__init__()
-
- self.tanh = nn.Tanh()
- self.avgpool = nn.AvgPool2d(kernel_size=2, stride=2)
-
- self.conv1 = nn.Conv2d(
- in_channels=1,
- out_channels=6,
- kernel_size=(5, 5),
- stride=(1, 1),
- padding=(0, 0),
- )
- self.conv2 = nn.Conv2d(
- in_channels=6,
- out_channels=16,
- kernel_size=(5, 5),
- stride=(1, 1),
- padding=(0, 0),
- )
- self.conv3 = nn.Conv2d(
- in_channels=16,
- out_channels=120,
- kernel_size=(5, 5),
- stride=(1, 1),
- padding=(0, 0),
- )
-
- self.linear1 = nn.Linear(120, 84)
- self.linear2 = nn.Linear(84, 10)
-
- def forward(self, image_array: numpy.ndarray) -> numpy.ndarray:
- image_array = self.tanh(self.conv1(image_array))
- image_array = self.avgpool(image_array)
- image_array = self.tanh(self.conv2(image_array))
- image_array = self.avgpool(image_array)
- image_array = self.tanh(self.conv3(image_array))
-
- image_array = image_array.reshape(image_array.shape[0], -1)
- image_array = self.tanh(self.linear1(image_array))
- image_array = self.linear2(image_array)
- return image_array
-
-
-def test_model(image_tensor: torch.tensor) -> bool:
- """
- Test the model on an input batch of 64 images
-
- Args:
- image_tensor (torch.tensor): Batch of Images for the model
-
- >>> test_model(torch.randn(64, 1, 32, 32))
- True
-
- """
- try:
- model = LeNet()
- output = model(image_tensor)
- except RuntimeError:
- return False
-
- return output.shape == torch.zeros([64, 10]).shape
-
-
-if __name__ == "__main__":
- random_image_1 = torch.randn(64, 1, 32, 32)
- random_image_2 = torch.randn(1, 32, 32)
-
- print(f"random_image_1 Model Passed: {test_model(random_image_1)}")
- print(f"\nrandom_image_2 Model Passed: {test_model(random_image_2)}")
--- requirements.txt
@@ -17,7 +17,6 @@ statsmodels
sympy
tensorflow
texttable
-torch
tweepy
xgboost
yulewalker
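The removed LeNet module above depends on the classic 32×32 input shrinking to 1×1 before the linear layers, which is why `test_model(torch.randn(64, 1, 32, 32))` succeeds while other sizes raise `RuntimeError`. A dependency-free sketch of that shape arithmetic (using the standard conv/pool output-size formulas, not the torch API):

```python
def conv_out(size, kernel, stride=1, padding=0):
    # Standard convolution output-size formula.
    return (size + 2 * padding - kernel) // stride + 1


def pool_out(size, kernel=2, stride=2):
    # Average-pooling output-size formula (kernel 2, stride 2 in LeNet).
    return (size - kernel) // stride + 1


def lenet_spatial_trace(size=32):
    """Spatial size after each conv/pool stage of the LeNet above."""
    trace = [size]
    size = conv_out(size, 5); trace.append(size)   # conv1: 32 -> 28
    size = pool_out(size);    trace.append(size)   # pool:  28 -> 14
    size = conv_out(size, 5); trace.append(size)   # conv2: 14 -> 10
    size = pool_out(size);    trace.append(size)   # pool:  10 -> 5
    size = conv_out(size, 5); trace.append(size)   # conv3:  5 -> 1
    return trace
```

After conv3 the feature map is 120×1×1, so the flattened vector has exactly the 120 features the first linear layer expects.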
| python | thealgorithms | Python | Python | 197,891 | 46,346 | All Algorithms implemented in Python | thealgorithms_python | CODE_IMPROVEMENT | Obvious |
| cfdc8845829d080dd2c745f59a8b2b8b8c3a91f7 | null | Kyle Mathews | typo | false | 1 | 1 | 0 |
--- 09.4-test-utils.md
@@ -95,7 +95,7 @@ Traverse all components in `tree` and accumulate all components where `test(comp
### scryRenderedDOMComponentsWithClass
```javascript
-array scryRenderedDOMComponentsWithClass(ReactCompoennt tree, string className)
+array scryRenderedDOMComponentsWithClass(ReactComponent tree, string className)
```
Finds all instance of components in the rendered tree that are DOM components with the class name matching `className`.
| facebook_react.json | null | null | null | null | null | null | facebook_react.json | CONFIG_CHANGE | 5, obvious |
| 2e6b65ae7c79af85b62261012a31c892250c32df | 2023-05-04 04:36:05 | Darren |
Correct links instead of commenting them out (override of #740) (#741)
* Remove gittr and Slack links, replace with Discord
* Update Rivest paper link
* Update link for No Silver Bullet
* Update link to Event Driven FRP
* Update link to Realizing quality improvement paper
| false | 5 | 5 | 10 |
--- CODE_OF_CONDUCT.md
@@ -8,7 +8,7 @@ Papers We Love events are for anyone interested in Computer Science/Computer Eng
**Be an adult, don't be a jerk.**
-We value the participation of each member of the community and want all attendees to have an enjoyable and fulfilling experience. Accordingly, all attendees are expected to show respect and courtesy to other attendees throughout the meet-ups and at all Papers We Love events and interactions on the GitHub repository, or [Discord](https://discord.gg/XnarqB59r2).
+We value the participation of each member of the community and want all attendees to have an enjoyable and fulfilling experience. Accordingly, all attendees are expected to show respect and courtesy to other attendees throughout the meet-ups and at all Papers We Love events and interactions on the GitHub repository, IRC, [gitter](https://gitter.im/papers-we-love/), or [Slack](https://paperswelove.slack.com/messages/general/) channels.
Need help?
----------
--- cryptography/README.md
@@ -1,6 +1,6 @@
# Cryptography
-* [A Method for Obtaining Digital Signatures and Public-Key Cryptosystems (1977)](https://dl.acm.org/doi/10.1145/359340.359342)
+* [A Method for Obtaining Digital Signatures and Public-Key Cryptosystems (1977)](http://people.csail.mit.edu/rivest/Rsapaper.pdf)
* [Twenty Years of Attacks on the RSA Cryptosystem (1999)](https://crypto.stanford.edu/~dabo/papers/RSA-survey.pdf)
* :scroll: [Communication Theory of Secrecy Systems (1949)](communication-theory-of-secrecy-systems.pdf)
* [New Directions in Cryptography (1976)](http://www-ee.stanford.edu/~hellman/publications/24.pdf)
--- design/README.md
@@ -1,5 +1,5 @@
# Design
-* [No Silver Bullet — Essence and Accidents of Software Engineering](https://www.semanticscholar.org/paper/No-Silver-Bullet-Essence-and-Accidents-of-Software-Brooks/ed0759b7001f8be53bb4282750e98198b359307d)
+* [No Silver Bullet — Essence and Accidents of Software Engineering](http://www.cs.unc.edu/techreports/86-020.pdf)
* [Traits: A Mechanism for Fine-Grained Reuse](http://scg.unibe.ch/archive/papers/Duca06bTOPLASTraits.pdf)
* [THING-MODEL-VIEW-EDITOR an Example from a planningsystem](http://heim.ifi.uio.no/~trygver/1979/mvc-1/1979-05-MVC.pdf)
--- languages-paradigms/functional_reactive_programming/README.md
@@ -2,7 +2,7 @@
* [Functional Reactive Programming, Continued](https://www.antonycourtney.com/pubs/frpcont.pdf)
-* [Event-Driven FRP](https://www.researchgate.net/publication/2415013_Event-driven_FRP)
+* [Event-Driven FRP](http://www.cs.yale.edu/homes/zwan/papers/mcu/efrp.pdf)
* [Real-Time FRP](https://csiflabs.cs.ucdavis.edu/~johari/refs/rt-frp.pdf)
--- testing/tdd/README.md
@@ -5,4 +5,4 @@
## In industrial teams
-[Realizing quality improvement through test driven development: results and experiences of four industrial teams](https://www.microsoft.com/en-us/research/wp-content/uploads/2009/10/Realizing-Quality-Improvement-Through-Test-Driven-Development-Results-and-Experiences-of-Four-Industrial-Teams-nagappan_tdd.pdf). This paper is important because it one of the few instances of quantitative research about TDD in industrial teams (not in controlled environments)
+[Realizing quality improvement through test driven development: results and experiences of four industrial teams](https://github.com/tpn/pdfs/raw/master/Realizing%20Quality%20Improvement%20Through%20Test%20Driven%20Development%20-%20Results%20and%20Experiences%20of%20Four%20Industrial%20Teams%20(nagappan_tdd).pdf). This paper is important because it one of the few instances of quantitative research about TDD in industrial teams (not in controlled environments)
| papers-we-love | papers-we-love | Shell | Shell | 91,347 | 5,859 | Papers from the computer science community to read and discuss. | papers-we-love_papers-we-love | DOC_CHANGE | Obvious |
| c25727a929f83b7cd731d2b5d01fdf2a3ee60654 | 2023-01-16 20:04:33 | Don Turner | Remove reference to Result from Async | false | 1 | 3 | 4 |
--- app/src/main/java/com/example/android/architecture/blueprints/todoapp/util/Async.kt
@@ -16,8 +16,10 @@
package com.example.android.architecture.blueprints.todoapp.util
+import com.example.android.architecture.blueprints.todoapp.data.Result
+
/**
- * A generic class that holds a loading signal or the result of an async operation.
+ * A generic class that holds a loading signal or a [Result].
*/
sealed class Async<out T> {
object Loading : Async<Nothing>()
|
architecture-samples
|
android
|
Kotlin
|
Kotlin
| 44,806
| 11,701
|
A collection of samples to discuss and showcase different architectural tools and patterns for Android apps.
|
android_architecture-samples
|
NEW_FEAT
|
Adding a new UI component to the main screen
|
33a98e8ac2a979eb33c250d2301fdeba5d4a1b3e
|
2023-06-12 06:09:33
|
Mike Bostock
|
better quadtree animation
| false
| 27
| 52
| 79
|
--- docs/components/ExampleAnimatedQuadtree.vue
@@ -1,7 +1,7 @@
<script setup>
import * as d3 from "d3";
-import quadtree_visitParent from "./quadtreeVisitParent.js";
+import quadtree_visitQuad from "./quadtreeVisitQuad.js";
</script>
<script>
@@ -10,29 +10,55 @@ const width = 688;
const height = width;
async function render(node, {points}) {
+ const svg = d3.select(node);
+ const g = svg.append("g").attr("stroke", "currentColor").attr("fill", "none");
const tree = d3.quadtree([], (i) => points[i][0], (i) => points[i][1]);
tree.cover(d3.min(points, ([x]) => x), d3.min(points, ([, y]) => y));
tree.cover(d3.max(points, ([x]) => x), d3.max(points, ([, y]) => y));
const x = d3.scaleLinear([tree._x0, tree._x1], [0.5, width - 0.5]);
const y = d3.scaleLinear([tree._y0, tree._y1], [height - 0.5, 0.5]);
- const svg = d3.select(node);
- const g = svg.append("g").attr("stroke", "currentColor").attr("fill", "none");
- g.append("rect").attr("x", x(tree._x0)).attr("y", y(tree._y1)).attr("width", x(tree._x1) - x(tree._x0)).attr("height", y(tree._y0) - y(tree._y1));
const nodes = new Set();
for (let i = 0; i < points.length; ++i) {
tree.add(i);
let t = svg.transition();
- svg.append("circle").attr("fill", "currentColor").attr("stroke", "var(--vp-c-bg-alt)").attr("cx", x(points[i][0])).attr("cy", y(points[i][1])).attr("r", 0).transition(t).attr("r", 2.5);
- quadtree_visitParent.call(tree, (x0, y0, x1, y1) => {
+ svg.append("circle")
+ .attr("fill", "currentColor")
+ .attr("stroke", "var(--vp-c-bg-alt)")
+ .attr("cx", x(points[i][0]))
+ .attr("cy", y(points[i][1]))
+ .attr("r", 0)
+ .transition(t)
+ .attr("r", 2.5);
+ quadtree_visitQuad.call(tree, (node, x0, y0, x1, y1, quad) => {
const key = [x0, y0, x1, y1].join();
if (nodes.has(key)) return;
nodes.add(key);
- const xm = (x0 + x1) / 2;
- const ym = (y0 + y1) / 2;
- g.append("line").attr("x1", x(xm)).attr("y1", y(ym)).attr("x2", x(xm)).attr("y2", y(ym)).transition(t).attr("x1", x(x0));
- g.append("line").attr("x1", x(xm)).attr("y1", y(ym)).attr("x2", x(xm)).attr("y2", y(ym)).transition(t).attr("x2", x(x1));
- g.append("line").attr("x1", x(xm)).attr("y1", y(ym)).attr("x2", x(xm)).attr("y2", y(ym)).transition(t).attr("y1", y(y0));
- g.append("line").attr("x1", x(xm)).attr("y1", y(ym)).attr("x2", x(xm)).attr("y2", y(ym)).transition(t).attr("y2", y(y1));
+ switch (quad) {
+ case 0: { // top-left
+ g.append("line").attr("x1", x(x1)).attr("y1", y(y1)).attr("x2", x(x1)).attr("y2", y(y1)).transition(t).attr("y1", y(y0));
+ g.append("line").attr("x1", x(x1)).attr("y1", y(y1)).attr("x2", x(x1)).attr("y2", y(y1)).transition(t).attr("x1", x(x0));
+ break;
+ }
+ case 1: { // top-right
+ g.append("line").attr("x1", x(x0)).attr("y1", y(y1)).attr("x2", x(x0)).attr("y2", y(y1)).transition(t).attr("y1", y(y0));
+ g.append("line").attr("x1", x(x0)).attr("y1", y(y1)).attr("x2", x(x0)).attr("y2", y(y1)).transition(t).attr("x2", x(x1));
+ break;
+ }
+ case 2: { // bottom-left
+ g.append("line").attr("x1", x(x1)).attr("y1", y(y0)).attr("x2", x(x1)).attr("y2", y(y0)).transition(t).attr("y2", y(y1));
+ g.append("line").attr("x1", x(x1)).attr("y1", y(y0)).attr("x2", x(x1)).attr("y2", y(y0)).transition(t).attr("x1", x(x0));
+ break;
+ }
+ case 3: { // bottom-right
+ g.append("line").attr("x1", x(x0)).attr("y1", y(y0)).attr("x2", x(x0)).attr("y2", y(y0)).transition(t).attr("y2", y(y1));
+ g.append("line").attr("x1", x(x0)).attr("y1", y(y0)).attr("x2", x(x0)).attr("y2", y(y0)).transition(t).attr("x2", x(x1));
+ break;
+ }
+ case undefined: { // root
+ g.append("rect").attr("x", x(x0)).attr("y", y(y1)).attr("width", x(x1) - x(x0)).attr("height", y(y0) - y(y1));
+ break;
+ }
+ }
});
await new Promise((resolve) => setTimeout(resolve, 100));
if (!node.isConnected) return;
--- docs/components/quadtreeVisitParent.js
@@ -1,15 +0,0 @@
-export default function quadtree_visitParent(callback) {
- let quads = [], q, node = this._root, parent, child, x0, y0, x1, y1, xm, ym;
- if (node) quads.push({node, x0: this._x0, y0: this._y0, x1: this._x1, y1: this._y1});
- while ((q = quads.pop())) {
- node = q.node, parent = q.parent;
- if (parent) callback(parent.x0, parent.y0, parent.x1, parent.y1);
- if (!node.length) continue;
- x0 = q.x0, y0 = q.y0, x1 = q.x1, y1 = q.y1, xm = (x0 + x1) / 2, ym = (y0 + y1) / 2;
- if ((child = node[3])) quads.push({parent: q, node: child, x0: xm, y0: ym, x1: x1, y1: y1});
- if ((child = node[2])) quads.push({parent: q, node: child, x0: x0, y0: ym, x1: xm, y1: y1});
- if ((child = node[1])) quads.push({parent: q, node: child, x0: xm, y0: y0, x1: x1, y1: ym});
- if ((child = node[0])) quads.push({parent: q, node: child, x0: x0, y0: y0, x1: xm, y1: ym});
- }
- return this;
-}
--- docs/components/quadtreeVisitQuad.js
@@ -0,0 +1,14 @@
+export default function quadtree_visitQuad(callback) {
+ let quads = [], q, node = this._root, child, x0, y0, x1, y1;
+ if (node) quads.push({node, x0: this._x0, y0: this._y0, x1: this._x1, y1: this._y1});
+ while ((q = quads.pop())) {
+ if (!callback(node = q.node, x0 = q.x0, y0 = q.y0, x1 = q.x1, y1 = q.y1, q.quad) && node.length) {
+ let xm = (x0 + x1) / 2, ym = (y0 + y1) / 2;
+ if ((child = node[3])) quads.push({node: child, x0: xm, y0: ym, x1: x1, y1: y1, quad: 3});
+ if ((child = node[2])) quads.push({node: child, x0: x0, y0: ym, x1: xm, y1: y1, quad: 2});
+ if ((child = node[1])) quads.push({node: child, x0: xm, y0: y0, x1: x1, y1: ym, quad: 1});
+ if ((child = node[0])) quads.push({node: child, x0: x0, y0: y0, x1: xm, y1: ym, quad: 0});
+ }
+ }
+ return this;
+}
|
d3
|
d3
|
Shell
|
Shell
| 109,977
| 22,868
|
Bring data to life with SVG, Canvas and HTML. :bar_chart::chart_with_upwards_trend::tada:
|
d3_d3
|
CODE_IMPROVEMENT
|
Simplifying the d3 selection API
|
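The `quadtreeVisitQuad.js` added in the d3 record above replaces a parent-based walk with an iterative, stack-based traversal that hands each cell's bounds plus its quadrant index (0 = top-left … 3 = bottom-right, `undefined` for the root) to the callback, so the caller knows which edges of the parent cell each child shares. A minimal Python sketch of that same stack-based walk — the nested-list tree shape and the callback signature are illustrative assumptions, not d3's actual API:

```python
def visit_quad(root, x0, y0, x1, y1, callback):
    # Stack entries: (node, x0, y0, x1, y1, quad); quad is None for the root,
    # mirroring the `undefined` quad the JS version passes for the root cell.
    stack = [(root, x0, y0, x1, y1, None)]
    while stack:
        node, nx0, ny0, nx1, ny1, quad = stack.pop()
        # As in the JS, a truthy return value prunes this subtree.
        if callback(node, nx0, ny0, nx1, ny1, quad):
            continue
        if isinstance(node, list):  # internal node: up to 4 children
            xm, ym = (nx0 + nx1) / 2, (ny0 + ny1) / 2
            bounds = [(nx0, ny0, xm, ym), (xm, ny0, nx1, ym),
                      (nx0, ym, xm, ny1), (xm, ym, nx1, ny1)]
            # Push quads 3..0 so quad 0 pops (is visited) first, like the JS.
            for q in (3, 2, 1, 0):
                if node[q] is not None:
                    bx0, by0, bx1, by1 = bounds[q]
                    stack.append((node[q], bx0, by0, bx1, by1, q))

# Example: a root cell with leaves in quads 0 and 3.
visited = []
tree = ["a", None, None, "b"]
visit_quad(tree, 0, 0, 1, 1,
           lambda n, *rest: visited.append(
               ("root" if isinstance(n, list) else n, rest[-1])))
```

Carrying the quadrant index in each stack entry is what lets the animation draw only the two new edges each subdivision introduces, instead of redrawing the full cross at the parent's midpoint.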
521b65be0c32c788c221e447085654fa30da9020
|
2023-05-08 10:30:23
|
adams549659584
|
fix: :bug: 新版本直接清理缓存 (clear the cache directly when a new version is detected)
| false
| 24
| 19
| 43
|
--- web/js/index.js
@@ -107,9 +107,8 @@ async function registerSW() {
console.log('Service Worker 安装成功:', event);
const newSWVersion = await wb.messageSW({ type: 'GET_VERSION' });
if (newSWVersion !== oldSWVersion) {
- await clearCache();
alert(`新版本 ${newSWVersion} 已就绪,刷新后即可体验 !`);
- window.location.reload();
+ window.location.reload(true);
}
});
@@ -164,25 +163,6 @@ function hideLoading() {
loadingEle.classList.add('hidden');
}
-async function clearCache() {
- // del storage
- localStorage.clear();
- sessionStorage.clear();
- // del sw
- const cacheKeys = await caches.keys();
- for (const cacheKey of cacheKeys) {
- await caches.open(cacheKey).then(async (cache) => {
- const requests = await cache.keys();
- return await Promise.all(
- requests.map((request) => {
- console.log(`del cache : `, request.url);
- return cache.delete(request);
- })
- );
- });
- }
-}
-
(function () {
var config = { cookLoc: {} };
sj_evt.bind(
@@ -237,7 +217,22 @@ async function clearCache() {
// del cookie
setCookie(userCookieName, '', -1);
setCookie(randIpCookieName, '', -1);
- await clearCache();
+ // del storage
+ localStorage.clear();
+ sessionStorage.clear();
+ // del sw
+ const cacheKeys = await caches.keys();
+ for (const cacheKey of cacheKeys) {
+ await caches.open(cacheKey).then(async (cache) => {
+ const requests = await cache.keys();
+ return await Promise.all(
+ requests.map((request) => {
+ console.log(`del cache : `, request.url);
+ return cache.delete(request);
+ })
+ );
+ });
+ }
chatLoginBgEle.style.display = 'none';
window.location.reload();
};
--- web/sw.js
@@ -1,7 +1,7 @@
// 引入workbox 框架
importScripts('./js/sw/workbox-sw.js');
-const SW_VERSION = 'v1.4.1';
+const SW_VERSION = 'v1.4.0';
const CACHE_PREFIX = 'BingAI';
workbox.setConfig({ debug: false, logLevel: 'warn' });
@@ -56,7 +56,7 @@ workbox.precaching.precacheAndRoute([
},
{
url: '/web/js/index.js',
- revision: '2023.05.08',
+ revision: '2023.05.06.17',
},
// html
{
|
go-proxy-bingai
|
adams549659584
|
HTML
|
HTML
| 8,773
| 13,135
|
用 Vue3 和 Go 搭建的微软 New Bing 演示站点,拥有一致的 UI 体验,支持 ChatGPT 提示词,国内可用。(A Microsoft New Bing demo site built with Vue3 and Go, with a consistent UI experience, ChatGPT prompt support, and availability from mainland China.)
|
adams549659584_go-proxy-bingai
|
BUG_FIX
|
obvious
|
7d4f202c1c03655cd58411f07853eef772c3316b
|
2024-09-17 01:28:51
|
John Kleinschmidt
|
ci: move Archaeologist to GHA (#43701) * chore: move Archaeologist to GHA
* chore: test archaelogist changes
* Revert "chore: test archaelogist changes"
This reverts commit a575d6ef3a6495a71c03e9e2d15ec9bb329c5033.
* chore: properly name steps in archaeologist-dig
| false
| 85
| 0
| 85
|
--- .github/actions/generate-types/action.yml
@@ -1,24 +0,0 @@
-name: 'Generate Types for Archaeologist Dig'
-description: 'Generate Types for Archaeologist Dig'
-inputs:
- sha-file:
- description: 'File containing sha'
- required: true
- filename:
- description: 'Filename to write types to'
- required: true
-runs:
- using: "composite"
- steps:
- - name: Generating Types for SHA in ${{ inputs.sha-file }}
- shell: bash
- run: |
- git checkout $(cat ${{ inputs.sha-file }})
- rm -rf node_modules
- yarn install --frozen-lockfile --ignore-scripts
- echo "#!/usr/bin/env node\nglobal.x=1" > node_modules/typescript/bin/tsc
- node node_modules/.bin/electron-docs-parser --dir=./ --outDir=./ --moduleVersion=0.0.0-development
- node node_modules/.bin/electron-typescript-definitions --api=electron-api.json --outDir=artifacts
- mv artifacts/electron.d.ts artifacts/${{ inputs.filename }}
- git checkout .
- working-directory: ./electron
--- .github/workflows/archaeologist-dig.yml
@@ -1,61 +0,0 @@
-name: Archaeologist
-
-on:
- pull_request:
-
-jobs:
- archaeologist-dig:
- name: Archaeologist Dig
- runs-on: ubuntu-latest
- steps:
- - name: Checkout Electron
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 #v4.0.2
- with:
- fetch-depth: 0
- - name: Setting Up Dig Site
- run: |
- echo "remote: ${{ github.event.pull_request.head.repo.clone_url }}"
- echo "sha ${{ github.event.pull_request.head.sha }}"
- echo "base ref ${{ github.event.pull_request.base.ref }}"
- git clone https://github.com/electron/electron.git electron
- cd electron
- mkdir -p artifacts
- git remote add fork ${{ github.event.pull_request.head.repo.clone_url }} && git fetch fork
- git checkout ${{ github.event.pull_request.head.sha }}
- git merge-base origin/${{ github.event.pull_request.base.ref }} HEAD > .dig-old
- echo ${{ github.event.pull_request.head.sha }} > .dig-new
- cp .dig-old artifacts
-
- - name: Generating Types for SHA in .dig-new
- uses: ./.github/actions/generate-types
- with:
- sha-file: .dig-new
- filename: electron.new.d.ts
- - name: Generating Types for SHA in .dig-old
- uses: ./.github/actions/generate-types
- with:
- sha-file: .dig-old
- filename: electron.old.d.ts
- - name: Upload artifacts
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874 #v4.4.0
- with:
- name: artifacts
- path: electron/artifacts
- include-hidden-files: true
- - name: Set job output
- run: |
- git diff --no-index electron.old.d.ts electron.new.d.ts > patchfile || true
- if [ -s patchfile ]; then
- echo "Changes Detected"
- echo "## Changes Detected" > $GITHUB_STEP_SUMMARY
- echo "Looks like the \`electron.d.ts\` file changed." >> $GITHUB_STEP_SUMMARY
- echo "" >> $GITHUB_STEP_SUMMARY
- echo "\`\`\`\`\`\`diff" >> $GITHUB_STEP_SUMMARY
- cat patchfile >> $GITHUB_STEP_SUMMARY
- echo "\`\`\`\`\`\`" >> $GITHUB_STEP_SUMMARY
- else
- echo "No Changes Detected"
- echo "## No Changes" > $GITHUB_STEP_SUMMARY
- echo "We couldn't see any changes in the \`electron.d.ts\` artifact" >> $GITHUB_STEP_SUMMARY
- fi
- working-directory: ./electron/artifacts
|
electron
|
electron
|
C++
|
C++
| 115,677
| 15,852
|
:electron: Build cross-platform desktop apps with JavaScript, HTML, and CSS
|
electron_electron
|
CONFIG_CHANGE
|
yml files updated
|
8b7b819b4b9e6ba457e011e92e33266690e26957
| null |
Jannis Hell
|
Use const instead of var in env.js (#7526)
| false
| 1
| 1
| 0
|
--- env.js
@@ -23,7 +23,7 @@ if (!NODE_ENV) {
}
// https://github.com/bkeepers/dotenv#what-other-env-files-can-i-use
-var dotenvFiles = [
+const dotenvFiles = [
`${paths.dotenv}.${NODE_ENV}.local`,
`${paths.dotenv}.${NODE_ENV}`,
// Don't include `.env.local` for `test` environment
|
facebook_create-react-app.json
| null | null | null | null | null | null |
facebook_create-react-app.json
|
CODE_IMPROVEMENT
|
5, const instead of var
|
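The `env.js` hunk in the create-react-app record above shows the head of the dotenv cascade: more specific files are listed first, and the inline comment notes that `.env.local` is skipped for the `test` environment. A hedged Python sketch of just the precedence rule visible in the hunk — the real list may contain entries the diff truncates, so this models only what is shown:

```python
def dotenv_files(node_env):
    # Earlier entries take precedence over later ones, matching the
    # ordering convention documented at bkeepers/dotenv that env.js cites.
    files = [
        f".env.{node_env}.local",
        f".env.{node_env}",
    ]
    # Per the comment in the hunk: don't include `.env.local`
    # for the `test` environment.
    if node_env != "test":
        files.append(".env.local")
    files.append(".env")
    return files
```

The commit itself only swaps `var` for `const` here, which is the idiomatic choice since `dotenvFiles` is never reassigned.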
85163c785dcfbb6cd5c3d9fc6a9c67fcdafca3aa
|
2024-04-06 00:06:28
|
Vinicius Souza
|
update action sintax
| false
| 15
| 13
| 28
|
--- .github/workflows/main.yml
@@ -1,5 +1,6 @@
-name: Convert README.md to HTML and Publish
+name: render readme
+# Controls when the action will run
on:
push:
branches: master
@@ -11,20 +12,17 @@ jobs:
contents: write
id-token: write
steps:
- - name: Checkout repo
- uses: actions/checkout@v3
-
- - name: Setup NodeJS
- uses: actions/setup-node@v3
- with:
- node-version: 18
- registry-url: 'https://registry.npmjs.org'
+ - uses: actions/checkout@v3
+ - uses: actions/setup-node@v3
+ with:
+ node-version: 18
+ registry-url: 'https://registry.npmjs.org'
- name: Test Converts markdown text to HTML
uses: ./
- with:
- source: README-zh.md
- output: index.html
- style: 'body { margin: 0; }'
- github-corners: https://github.com/vsouza/awesome-ios
- favicon: data:image/svg+xml,<svg xmlns=%22http://www.w3.org/2000/svg%22 viewBox=%220 0 100 100%22><text y=%22.9em%22 font-size=%2290%22>🌐</text></svg>
\ No newline at end of file
+ with:
+ source: README-zh.md
+ output: index.html
+ style: 'body { margin: 0; }'
+ github-corners: https://github.com/jaywcjlove/markdown-to-html-cli
+ favicon: data:image/svg+xml,<svg xmlns=%22http://www.w3.org/2000/svg%22 viewBox=%220 0 100 100%22><text y=%22.9em%22 font-size=%2290%22>🌐</text></svg>
\ No newline at end of file
|
awesome-ios
|
vsouza
|
Swift
|
Swift
| 48,363
| 6,877
|
A curated list of awesome iOS ecosystem, including Objective-C and Swift Projects
|
vsouza_awesome-ios
|
CONFIG_CHANGE
|
changes in yml file
|
dd8ea898a79b1e09e2aa4a32a28c8c7e5512d4c0
|
2022-03-26 21:27:26
|
Robert Felker
|
Stream support
| false
| 32
| 14
| 46
|
--- source.md
@@ -1,9 +1,9 @@
[<img src="https://user-images.githubusercontent.com/1295961/45949308-cbb2f680-bffb-11e8-8054-28c35ed6d132.png" align="center" width="850">](https://flutter.dev/)
-
<p align="center">
+
<a href="https://github.com/search?q=flutter+language%3Adart&type=Repositories">
- <img alt="Github Repositories" src="https://img.shields.io/badge/Repos-254629-brightgreen.svg" />
+ <img alt="Github Repositories" src="https://img.shields.io/badge/Repos-@[email protected]" />
</a>
<a href="https://github.com/sindresorhus/awesome">
<img alt="Awesome" src="https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg" />
@@ -13,43 +13,25 @@
</a>
</p>
-
<a href="https://flutter.dev/">Flutter</a> is Google’s UI toolkit for building beautiful, natively compiled applications for mobile, web, and desktop from a single codebase.
If you appreciate the content 📖, support projects visibility, give 👍| ⭐| 👏
-<a href="https://getstream.io/chat/sdk/flutter/?utm_source=Github&utm_medium=Github_Repo_Content_Ad&utm_content=Developer&utm_campaign=Github_Mar2022_FlutterChatSDK&utm_term=Awesome">
-<img src="https://user-images.githubusercontent.com/1295961/160238710-1b5a987a-478e-41b4-b11c-37be8670a8c9.png"/>
-</a>
-
-
-
-#### Demonstrations
<div style="text-align: center"><table><tr>
<td style="text-align: center">
- <a href="https://getstream.io/chat/sdk/flutter/?utm_source=Github&utm_medium=Github_Repo_Content_Ad&utm_content=Developer&utm_campaign=Github_Mar2022_FlutterChatSDK&utm_term=Awesome">
- <img src="https://user-images.githubusercontent.com/1295961/158429732-0a5dc21e-d051-4c8a-8eea-e58734f9cfd7.gif" width="180"/>
-
- Instant
- Chat Integration
- with Stream!
- </a>
- </td>
- <td style="text-align: center">
- <img width="180" alt="BMW" src="https://user-images.githubusercontent.com/1295961/160239273-ce881c0c-c3de-4953-9448-dfd12d7ffe30.png">
-
- BMW
- Connect
-
- </td>
+ <a href="https://twitter.com/BlueAquilae/status/1049315328835182592">
+ <img src="https://i.imgur.com/1Xdsp92.gif" width="200"/></a>
+</td>
+<td style="text-align: center">
+<img src="https://github.com/flschweiger/reply/blob/master/gif/reply.gif" width="400"/>
+</td>
+ <td style="text-align: center">
+<img src="https://camo.githubusercontent.com/23d3c78b0a2b645567630468bd68d54c02c2076a/68747470733a2f2f63646e2e3264696d656e73696f6e732e636f6d2f315f53746172742e676966" width="200"/>
+</td>
+</td>
<td style="text-align: center">
- <img width="180" alt="BlueAquilae Twitter Meteo" src="https://user-images.githubusercontent.com/1295961/160238906-540a4a0d-b721-4c73-8b58-58b96b5e6414.png">
-
-
- Calendar
- Meteo
-
- </td>
+</td>
+
</tr></table></div>
|
awesome-flutter
|
solido
|
Dart
|
Dart
| 54,974
| 6,726
|
An awesome list that curates the best Flutter libraries, tools, tutorials, articles and more.
|
solido_awesome-flutter
|
DOC_CHANGE
|
changes in md file
|
dacd59f6546d442f02f505d145138ac22919979b
|
2025-04-02 20:37:59
|
Marco Pennekamp
|
[LL] Extract `LLKnownClassDeclarationSymbolProvider` from `LLKotlinSymbolProvider` - This will allow `LLFirJavaSymbolProvider` to implement `getClassLikeSymbolByClassId` for a known declaration as well so that it can be used in the (common) implementation of `getClassLikeSymbolByPsi`. - Furthermore, unified symbol providers can use this interface to call the `getClassLikeSymbolByClassId` for known declarations on individual providers. ^KT-72998
| false
| 34
| 20
| 54
|
--- analysis/low-level-api-fir/src/org/jetbrains/kotlin/analysis/low/level/api/fir/symbolProviders/LLKnownClassDeclarationSymbolProvider.kt
@@ -1,31 +0,0 @@
-/*
- * Copyright 2010-2025 JetBrains s.r.o. and Kotlin Programming Language contributors.
- * Use of this source code is governed by the Apache 2.0 license that can be found in the license/LICENSE.txt file.
- */
-
-package org.jetbrains.kotlin.analysis.low.level.api.fir.symbolProviders
-
-import com.intellij.psi.PsiElement
-import org.jetbrains.kotlin.fir.resolve.providers.FirSymbolProvider
-import org.jetbrains.kotlin.fir.symbols.impl.FirClassLikeSymbol
-import org.jetbrains.kotlin.name.ClassId
-import org.jetbrains.kotlin.psi.KtClassLikeDeclaration
-
-/**
- * A [FirSymbolProvider] which is able to provide a class-like symbol for a [ClassId] with an already known class-like declaration [E]. The
- * main purpose is optimization to avoid searching for a PSI declaration which is already known.
- */
-interface LLKnownClassDeclarationSymbolProvider<E : PsiElement> {
- /**
- * Returns the [FirClassLikeSymbol] with the given [classId] for a known [classLikeDeclaration].
- *
- * As [classLikeDeclaration] is already known, this function is optimized to avoid a search for the corresponding PSI declaration.
- * However, the given declaration has to be *in the scope of the symbol provider*.
- *
- * Furthermore, the function does not guarantee that a symbol for exactly [classLikeDeclaration] will be returned, as this parameter is
- * only used for optimization. This is in line with the contracts of [FirSymbolProvider.getClassLikeSymbolByClassId], which only
- * considers the [ClassId] itself and operates on a first-come, first-serve basis. The first [KtClassLikeDeclaration] passed to this
- * function or fetched by the symbol provider itself becomes the basis of the class-like symbol for that class ID.
- */
- fun getClassLikeSymbolByClassId(classId: ClassId, classLikeDeclaration: E): FirClassLikeSymbol<*>?
-}
--- analysis/low-level-api-fir/src/org/jetbrains/kotlin/analysis/low/level/api/fir/symbolProviders/LLKotlinSymbolProvider.kt
@@ -11,9 +11,11 @@ import org.jetbrains.kotlin.fir.FirSession
import org.jetbrains.kotlin.fir.resolve.providers.FirSymbolProvider
import org.jetbrains.kotlin.fir.resolve.providers.FirSymbolProviderInternals
import org.jetbrains.kotlin.fir.symbols.impl.FirCallableSymbol
+import org.jetbrains.kotlin.fir.symbols.impl.FirClassLikeSymbol
import org.jetbrains.kotlin.fir.symbols.impl.FirNamedFunctionSymbol
import org.jetbrains.kotlin.fir.symbols.impl.FirPropertySymbol
import org.jetbrains.kotlin.name.CallableId
+import org.jetbrains.kotlin.name.ClassId
import org.jetbrains.kotlin.psi.KtCallableDeclaration
import org.jetbrains.kotlin.psi.KtClassLikeDeclaration
import org.jetbrains.kotlin.psi.KtNamedFunction
@@ -24,9 +26,7 @@ import org.jetbrains.kotlin.psi.KtProperty
*
* @see org.jetbrains.kotlin.analysis.low.level.api.fir.symbolProviders.combined.LLCombinedKotlinSymbolProvider
*/
-internal abstract class LLKotlinSymbolProvider(session: FirSession) :
- FirSymbolProvider(session),
- LLKnownClassDeclarationSymbolProvider<KtClassLikeDeclaration> {
+internal abstract class LLKotlinSymbolProvider(session: FirSession) : FirSymbolProvider(session) {
abstract val declarationProvider: KotlinDeclarationProvider
abstract val packageProvider: KotlinPackageProvider
@@ -37,6 +37,23 @@ internal abstract class LLKotlinSymbolProvider(session: FirSession) :
*/
abstract val allowKotlinPackage: Boolean
+ /**
+ * Returns the [FirClassLikeSymbol] with the given [classId] for a known [classLikeDeclaration].
+ *
+ * As [classLikeDeclaration] is already known, this function is optimized to avoid declaration provider accesses. However, the given
+ * declaration has to be one of the [classes][KotlinDeclarationProvider.getAllClassesByClassId] or [type aliases][KotlinDeclarationProvider.getAllTypeAliasesByClassId]
+ * provided by the [declarationProvider].
+ *
+ * Furthermore, the function does not guarantee that a symbol for exactly [classLikeDeclaration] will be returned, as this parameter is
+ * only used for optimization. This is in line with the contracts of [FirSymbolProvider.getClassLikeSymbolByClassId], which only
+ * considers the [ClassId] itself and operates on a first-come, first-serve basis. The first [KtClassLikeDeclaration] passed to this
+ * function or fetched with [KotlinDeclarationProvider.getClassLikeDeclarationByClassId] (which does not guarantee a stable result
+ * either) becomes the basis of the class-like symbol for that class ID.
+ *
+ * To get a symbol for an exact class-like declaration, [getClassLikeSymbolByPsi] should be used instead.
+ */
+ abstract fun getClassLikeSymbolByClassId(classId: ClassId, classLikeDeclaration: KtClassLikeDeclaration): FirClassLikeSymbol<*>?
+
/**
* Maps the [FirCallableSymbol]s with the given [callableId] for known [callables] to [destination].
*
|
kotlin
|
jetbrains
|
Kotlin
|
Kotlin
| 50,115
| 5,861
|
The Kotlin Programming Language.
|
jetbrains_kotlin
|
CODE_IMPROVEMENT
|
Code change: type annotation added
|
ae1995f849aac92ec768361e73ad0ae56ee2bdbb
| null |
Adam Roben
|
Update libchromiumcontent for better Chrome 35 support * vendor/libchromiumcontent ded3c0a...331dbed (2): > Merge pull request #48 from brightray/chrome35-brightray > Merge pull request #49 from brightray/linux-fix
| false
| 1
| 1
| 0
|
--- libchromiumcontent
@@ -1 +1 @@
-Subproject commit ded3c0ad1d36cc900d94c7587a88d1b959ebc1c7
+Subproject commit 331dbed44676c534faf21f7db1985e796260649a
|
electron_electron.json
| null | null | null | null | null | null |
electron_electron.json
|
NEW_FEAT
|
5, obvious
|
e345ddfdcfd0d9703292ce5f9a9dde87c32c89cf
|
2023-01-20 16:01:34
|
Richard McElreath
|
week 2 solutions and week 3 hw
| false
| 0
| 0
| 0
|
--- homework/week02_solutions.pdf
Binary files a/homework/week02_solutions.pdf and /dev/null differ
--- homework/week03.pdf
Binary files a/homework/week03.pdf and /dev/null differ
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
CONFIG_CHANGE
|
new files uploaded
|
a43c6b15bb5623623bed9be53ff88b12572ed4f2
|
2025-02-14 00:01:42
|
David Vacca
|
Remove legacy codegen of $PropSetter classes for ViewManagers that are migrated to new architecture (#49404) Summary: Pull Request resolved: https://github.com/facebook/react-native/pull/49404 This diff disables codegen of legacy $PropSetter for viewManagers that implement the interface com.facebook.react.uimanager.ViewManagerWithGeneratedInterface. This logic will only be enabled for apps that are configured with BuildConfig.UNSTABLE_ENABLE_MINIFY_LEGACY_ARCHITECTURE = true changelog: [internal] internal Reviewed By: javache Differential Revision: D67412734 fbshipit-source-id: 682725714dc41f5f34d95d4d9a13ab09726b28f7
| false
| 27
| 2
| 29
|
--- packages/react-native/ReactAndroid/src/main/java/com/facebook/react/processing/ReactPropertyProcessor.java
@@ -20,7 +20,6 @@ import com.facebook.react.bridge.DynamicFromObject;
import com.facebook.react.bridge.ReadableArray;
import com.facebook.react.bridge.ReadableMap;
import com.facebook.react.common.annotations.UnstableReactNativeAPI;
-import com.facebook.react.common.build.ReactBuildConfig;
import com.facebook.react.uimanager.annotations.ReactProp;
import com.facebook.react.uimanager.annotations.ReactPropGroup;
import com.facebook.react.uimanager.annotations.ReactPropertyHolder;
@@ -94,8 +93,6 @@ public class ReactPropertyProcessor extends ProcessorBase {
private static final TypeName PROPERTY_MAP_TYPE =
ParameterizedTypeName.get(Map.class, String.class, String.class);
- public static final String VIEW_MANAGER_INTERFACE =
- "com.facebook.react.uimanager.ViewManagerWithGeneratedInterface";
private final Map<ClassName, ClassInfo> mClasses;
@@ -103,7 +100,6 @@ public class ReactPropertyProcessor extends ProcessorBase {
@SuppressFieldNotInitialized private Messager mMessager;
@SuppressFieldNotInitialized private Elements mElements;
@SuppressFieldNotInitialized private Types mTypes;
- @SuppressFieldNotInitialized private TypeMirror mViewManagerWithGeneratedInterface = null;
static {
DEFAULT_TYPES = new HashMap<>();
@@ -148,17 +144,6 @@ public class ReactPropertyProcessor extends ProcessorBase {
mTypes = processingEnv.getTypeUtils();
}
- private TypeMirror getViewManagerWithGeneratedInterface() {
- if (mViewManagerWithGeneratedInterface == null) {
- TypeElement typeElement = mElements.getTypeElement(VIEW_MANAGER_INTERFACE);
- if (typeElement == null || typeElement.asType() == null) {
- throw new IllegalStateException("Could not find " + VIEW_MANAGER_INTERFACE);
- }
- mViewManagerWithGeneratedInterface = typeElement.asType();
- }
- return mViewManagerWithGeneratedInterface;
- }
-
@Override
public boolean processImpl(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
// Clear properties from previous rounds
@@ -169,10 +154,7 @@ public class ReactPropertyProcessor extends ProcessorBase {
try {
TypeElement classType = (TypeElement) element;
ClassName className = ClassName.get(classType);
- ClassInfo classInfo = parseClass(className, classType);
- if (classInfo != null) {
- mClasses.put(className, classInfo);
- }
+ mClasses.put(className, parseClass(className, classType));
} catch (Exception e) {
error(element, e.getMessage());
}
@@ -210,16 +192,9 @@ public class ReactPropertyProcessor extends ProcessorBase {
return typeName.equals(SHADOW_NODE_IMPL_TYPE);
}
- private @Nullable ClassInfo parseClass(ClassName className, TypeElement typeElement) {
+ private ClassInfo parseClass(ClassName className, TypeElement typeElement) {
TypeName targetType = getTargetType(typeElement.asType());
TypeName viewType = isShadowNodeType(targetType) ? null : targetType;
- boolean implementsViewManagerWithGeneratedInterface =
- mTypes.isAssignable(typeElement.asType(), getViewManagerWithGeneratedInterface());
-
- if (ReactBuildConfig.UNSTABLE_ENABLE_MINIFY_LEGACY_ARCHITECTURE
- && implementsViewManagerWithGeneratedInterface) {
- return null;
- }
ClassInfo classInfo = new ClassInfo(className, typeElement, viewType);
findProperties(classInfo, typeElement);
|
react-native
|
facebook
|
C++
|
C++
| 120,863
| 24,536
|
A framework for building native applications using React
|
facebook_react-native
|
DOC_CHANGE
|
Matched \bchangelog\b in message
|
0258e85186e07fa9fcdadb90a78e0a61c3f1daa8
|
2025-03-07 14:28:16
|
antirez
|
VSIM TRUTH option for ground truth results.
| false
| 100
| 9
| 109
|
--- README.md
@@ -60,7 +60,7 @@ performed in the background, while the command is executed in the main thread.
**VSIM: return elements by vector similarity**
- VSIM key [ELE|FP32|VALUES] <vector or element> [WITHSCORES] [COUNT num] [EF search-exploration-factor] [FILTER expression] [FILTER-EF max-filtering-effort] [TRUTH]
+ VSIM key [ELE|FP32|VALUES] <vector or element> [WITHSCORES] [COUNT num] [EF search-exploration-factor] [FILTER expression] [FILTER-EF max-filtering-effort]
The command returns similar vectors, for simplicity (and verbosity) in the following example, instead of providing a vector using FP32 or VALUES (like in `VADD`), we will ask for elements having a vector similar to a given element already in the sorted set:
@@ -88,8 +88,6 @@ It is possible to specify a `COUNT` and also to get the similarity score (from 1
The `EF` argument is the exploration factor: the higher it is, the slower the command becomes, but the better the index is explored to find nodes that are near to our query. Sensible values are from 50 to 1000.
-The `TRUTH` option forces the command to perform a linear scan of all the entries inside the set, without using the graph search inside the HNSW, so it returns the best matching elements (the perfect result set) that can be used in order to easily calculate the recall. Of course the linear scan is `O(N)`, so it is much slower than the `log(N)` (considering a small `COUNT`) provided by the HNSW index.
-
For `FILTER` and `FILTER-EF` options, please check the filtered search section of this documentation.
**VDIM: return the dimension of the vectors inside the vector set**
--- hnsw.c
@@ -2552,69 +2552,3 @@ void hnsw_test_graph_recall(HNSW *index, int test_ef, int verbose) {
unreachable_nodes,
total_nodes ? (float)unreachable_nodes * 100 / total_nodes : 0);
}
-
-/* Return exact K-NN items by performing a linear scan of all nodes.
- * This function has the same signature as hnsw_search_with_filter() but
- * instead of using the graph structure, it scans all nodes to find the
- * true nearest neighbors.
- *
- * Note that neighbors and distances arrays must have space for at least 'k' items.
- * norm_query should be set to 1 if the query vector is already normalized.
- *
- * If the filter_callback is passed, only elements passing the specified filter
- * are returned. The slot parameter is ignored but kept for API consistency. */
-int hnsw_ground_truth_with_filter
- (HNSW *index, const float *query_vector, uint32_t k,
- hnswNode **neighbors, float *distances, uint32_t slot,
- int query_vector_is_normalized,
- int (*filter_callback)(void *value, void *privdata),
- void *filter_privdata)
-{
- /* Note that we don't really use the slot here: it's a linear scan.
- * Yet we want the user to acquire the slot as this will hold the
- * global lock in read only mode. */
- (void) slot;
-
- /* Take our query vector into a temporary node. */
- hnswNode query;
- if (hnsw_init_tmp_node(index, &query, query_vector_is_normalized, query_vector) == 0) return -1;
-
- /* Accumulate best results into a priority queue. */
- pqueue *results = pq_new(k);
- if (!results) {
- hnsw_free_tmp_node(&query, query_vector);
- return -1;
- }
-
- /* Scan all nodes linearly. */
- hnswNode *current = index->head;
- while (current) {
- /* Apply filter if needed. */
- if (filter_callback &&
- !filter_callback(current->value, filter_privdata))
- {
- current = current->next;
- continue;
- }
-
- /* Calculate distance to query. */
- float dist = hnsw_distance(index, &query, current);
-
- /* Add to results to pqueue. Will be accepted only if better than
- * the current worse or pqueue not full. */
- pq_push(results, current, dist);
- current = current->next;
- }
-
- /* Copy results to output arrays. */
- uint32_t found = MIN(k, results->count);
- for (uint32_t i = 0; i < found; i++) {
- neighbors[i] = pq_get_node(results, i);
- if (distances) distances[i] = pq_get_distance(results, i);
- }
-
- /* Clean up. */
- pq_free(results);
- hnsw_free_tmp_node(&query, query_vector);
- return found;
-}
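The removed `hnsw_ground_truth_with_filter` keeps only the k best candidates in a bounded priority queue while walking every node; the same logic can be sketched in Python with a max-heap of negated distances (names and the filter callback are illustrative, not Redis APIs):

```python
import heapq

def ground_truth_knn(vectors, query, k, distance, keep=None):
    # Exact k-NN by linear scan. Store negated distances so the heap
    # root is always the current worst candidate, ready to be evicted.
    heap = []
    for node_id, vec in vectors.items():
        if keep is not None and not keep(node_id):
            continue  # optional filter, mirroring filter_callback in the C version
        d = distance(query, vec)
        if len(heap) < k:
            heapq.heappush(heap, (-d, node_id))
        elif -heap[0][0] > d:  # strictly better than the current worst
            heapq.heapreplace(heap, (-d, node_id))
    # Return (distance, id) pairs, nearest first.
    return sorted((-nd, nid) for nd, nid in heap)

vecs = {"a": (0.0,), "b": (1.0,), "c": (3.0,)}
dist = lambda q, v: abs(q[0] - v[0])
print(ground_truth_knn(vecs, (0.0,), 2, dist))  # -> [(0.0, 'a'), (1.0, 'b')]
```

As in the C helper, the scan is `O(N)` regardless of k, which is why it was only useful for computing recall rather than serving queries.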
--- hnsw.h
@@ -160,11 +160,5 @@ void hnsw_set_allocator(void (*free_ptr)(void*), void *(*malloc_ptr)(size_t),
int hnsw_validate_graph(HNSW *index, uint64_t *connected_nodes, int *reciprocal_links);
void hnsw_test_graph_recall(HNSW *index, int test_ef, int verbose);
float hnsw_distance(HNSW *index, hnswNode *a, hnswNode *b);
-int hnsw_ground_truth_with_filter
- (HNSW *index, const float *query_vector, uint32_t k,
- hnswNode **neighbors, float *distances, uint32_t slot,
- int query_vector_is_normalized,
- int (*filter_callback)(void *value, void *privdata),
- void *filter_privdata);
#endif /* HNSW_H */
--- vset.c
@@ -585,8 +585,7 @@ int vectorSetFilterCallback(void *value, void *privdata) {
* handles the HNSW locking explicitly. */
void VSIM_execute(RedisModuleCtx *ctx, struct vsetObject *vset,
float *vec, unsigned long count, float epsilon, unsigned long withscores,
- unsigned long ef, exprstate *filter_expr, unsigned long filter_ef,
- int ground_truth)
+ unsigned long ef, exprstate *filter_expr, unsigned long filter_ef)
{
/* In our scan, we can't just collect 'count' elements as
* if count is small we would explore the graph in an insufficient
@@ -604,20 +603,10 @@ void VSIM_execute(RedisModuleCtx *ctx, struct vsetObject *vset,
float *distances = RedisModule_Alloc(sizeof(float)*ef);
int slot = hnsw_acquire_read_slot(vset->hnsw);
unsigned int found;
- if (ground_truth) {
- found = hnsw_ground_truth_with_filter(vset->hnsw, vec, ef, neighbors,
- distances, slot, 0,
- filter_expr ? vectorSetFilterCallback : NULL,
- filter_expr);
+ if (filter_expr == NULL) {
+ found = hnsw_search(vset->hnsw, vec, ef, neighbors, distances, slot, 0);
} else {
- if (filter_expr == NULL) {
- found = hnsw_search(vset->hnsw, vec, ef, neighbors,
- distances, slot, 0);
- } else {
- found = hnsw_search_with_filter(vset->hnsw, vec, ef, neighbors,
- distances, slot, 0, vectorSetFilterCallback,
- filter_expr, filter_ef);
- }
+ found = hnsw_search_with_filter(vset->hnsw, vec, ef, neighbors, distances, slot, 0, vectorSetFilterCallback, filter_expr, filter_ef);
}
hnsw_release_read_slot(vset->hnsw,slot);
RedisModule_Free(vec);
@@ -665,7 +654,6 @@ void *VSIM_thread(void *arg) {
unsigned long ef = (unsigned long)targ[6];
exprstate *filter_expr = targ[7];
unsigned long filter_ef = (unsigned long)targ[8];
- unsigned long ground_truth = (unsigned long)targ[9];
RedisModule_Free(targ[4]);
RedisModule_Free(targ);
@@ -673,7 +661,7 @@ void *VSIM_thread(void *arg) {
RedisModuleCtx *ctx = RedisModule_GetThreadSafeContext(bc);
// Run the query.
- VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef, ground_truth);
+ VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef);
// Cleanup.
RedisModule_FreeThreadSafeContext(ctx);
@@ -695,7 +683,6 @@ int VSIM_RedisCommand(RedisModuleCtx *ctx, RedisModuleString **argv, int argc) {
long long count = VSET_DEFAULT_COUNT; /* New default value */
long long ef = 0; /* Exploration factor (see HNSW paper) */
double epsilon = 2.0; /* Max cosine distance */
- long long ground_truth = 0; /* Linear scan instead of HNSW search? */
/* Things computed later. */
long long filter_ef = 0;
@@ -785,9 +772,6 @@ int VSIM_RedisCommand(RedisModuleCtx *ctx, RedisModuleString **argv, int argc) {
if (!strcasecmp(opt, "WITHSCORES")) {
withscores = 1;
j++;
- } else if (!strcasecmp(opt, "TRUTH")) {
- ground_truth = 1;
- j++;
} else if (!strcasecmp(opt, "COUNT") && j+1 < argc) {
if (RedisModule_StringToLongLong(argv[j+1], &count)
!= REDISMODULE_OK || count <= 0)
@@ -868,7 +852,7 @@ int VSIM_RedisCommand(RedisModuleCtx *ctx, RedisModuleString **argv, int argc) {
RedisModuleBlockedClient *bc = RedisModule_BlockClient(ctx,NULL,NULL,NULL,0);
pthread_t tid;
- void **targ = RedisModule_Alloc(sizeof(void*)*10);
+ void **targ = RedisModule_Alloc(sizeof(void*)*9);
targ[0] = bc;
targ[1] = vset;
targ[2] = vec;
@@ -879,17 +863,16 @@ int VSIM_RedisCommand(RedisModuleCtx *ctx, RedisModuleString **argv, int argc) {
targ[6] = (void*)(unsigned long)ef;
targ[7] = (void*)filter_expr;
targ[8] = (void*)(unsigned long)filter_ef;
- targ[9] = (void*)(unsigned long)ground_truth;
if (pthread_create(&tid,NULL,VSIM_thread,targ) != 0) {
pthread_rwlock_unlock(&vset->in_use_lock);
RedisModule_AbortBlock(bc);
RedisModule_Free(vec);
RedisModule_Free(targ[4]);
RedisModule_Free(targ);
- VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef, ground_truth);
+ VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef);
}
} else {
- VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef, ground_truth);
+ VSIM_execute(ctx, vset, vec, count, epsilon, withscores, ef, filter_expr, filter_ef);
}
return REDISMODULE_OK;
|
redis
|
redis
|
C
|
C
| 68,201
| 23,916
|
Redis is an in-memory database that persists on disk. The data model is key-value, but many different kind of values are supported: Strings, Lists, Sets, Sorted Sets, Hashes, Streams, HyperLogLogs, Bitmaps.
|
redis_redis
|
DOC_CHANGE
|
changes in readme
|
86f9fbbaef3b8716765e1ece085496d9fbfcad9e
|
2025-01-23 22:32:33
|
jbengler
|
Update NEWS
| false
| 8
| 0
| 8
|
--- NEWS.md
@@ -1,13 +1,5 @@
# tidyplots (development version)
-## Breaking changes
-
-* Hard deprecation of `as_tidyplot()`. Converting a ggplot to a tidyplot was probably never a good idea.
-
-## Improvements
-
-* Switch from the magrittr pipe `%>%` to the base R pipe `|>` in both the documentation and code (#55, #56)
-
# tidyplots 0.2.1
## Breaking changes
|
tidyplots
|
jbengler
|
R
|
R
| 495
| 18
|
Tidy Plots for Scientific Papers
|
jbengler_tidyplots
|
DOC_CHANGE
|
documentation-only change: updates the NEWS.md changelog, no code behavior affected
|
4531ebac1bc5aa2b94386dd722683e59128f6fc0
|
2025-01-30 21:34:33
|
Stelios Fragkakis
|
Fix coverity issue (#19535) Better fix coverity issue, always cleanup
| false
| 2
| 4
| 6
|
--- src/database/sqlite/sqlite_aclk.c
@@ -882,8 +882,10 @@ static void aclk_synchronization(void *arg)
(void) uv_loop_close(loop);
// Free execute commands / queries
- free_query_list(aclk_query_execute->JudyL);
- (void)JudyLFreeArray(&aclk_query_execute->JudyL, PJE0);
+ if (pending_queries) {
+ free_query_list(aclk_query_execute->JudyL);
+ (void)JudyLFreeArray(&aclk_query_execute->JudyL, PJE0);
+ }
freez(aclk_query_execute);
// Free batch commands
|
netdata
|
netdata
|
C
|
C
| 73,681
| 6,023
|
X-Ray Vision for your infrastructure!
|
netdata_netdata
|
BUG_FIX
|
Obvious
|
b684d7d71a5d5e5300fbe2d90f2977777ff5d342
|
2025-04-05T15:57:57Z
|
chromium-internal-autoroll
|
Roll Media App from xrI54Nb6AILqiwdbc... to tYUDS08XNqGPs10VS... Release_Notes: http://go/media_app-x20/relnotes/Main/media_app_202504050700_RC00.html https://chrome-infra-packages.appspot.com/p/chromeos_internal/apps/media_app/app/+/tYUDS08XNqGPs10VSaG4S0oyekAPAp2gUhOH6IQRRbUC If this roll has caused a breakage, revert this CL and stop the roller using the controls here: https://skia-autoroll.corp.goog/r/media-app-chromium-autoroll Please CC [email protected],[email protected],[email protected] on the revert to ensure that a human is aware of the problem. To report a problem with the AutoRoller itself, please file a bug: https://issues.skia.org/issues/new?component=1389291&template=1850622 Documentation for the AutoRoller is here: https://skia.googlesource.com/buildbot/+doc/main/autoroll/README.md Cq-Include-Trybots: luci.chrome.try:chromeos-betty-chrome;luci.chrome.try:linux-chromeos-chrome Bug: None Tbr: [email protected] Change-Id: I893b826c428c0dfecd0bcc820715b684b650b475 Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6435977 Bot-Commit: chromium-internal-autoroll <chromium-internal-autoroll@skia-corp.google.com.iam.gserviceaccount.com> Commit-Queue: chromium-internal-autoroll <chromium-internal-autoroll@skia-corp.google.com.iam.gserviceaccount.com> Cr-Commit-Position: refs/heads/main@{#1443107}
| false
| 1
| 1
| 2
|
--- DEPS
@@ -3053,7 +3053,7 @@ deps = {
'packages': [
{
'package': 'chromeos_internal/apps/media_app/app',
- 'version': 'xrI54Nb6AILqiwdbcVexWkmiEcURwGoevY7_OSABFE0C',
+ 'version': 'tYUDS08XNqGPs10VSaG4S0oyekAPAp2gUhOH6IQRRbUC',
},
],
'condition': 'checkout_chromeos and checkout_src_internal',
|
chromium
| null |
C
|
C
| null | null |
Browser
|
_chromium
|
CONFIG_CHANGE
|
version/id change of roll media app
|
e137a5280f1ca2f6fee0c5d50418fc36c9236848
|
2025-02-20 04:45:10
|
Mateo Guzmán
|
Reland of: Make `RCTLog` & `ExceptionDataHelper` internal (#49536) Summary: Pull Request resolved: https://github.com/facebook/react-native/pull/49536 This is a re-land of a previous diff D69836482 which was reverted due to a build failure. As part of the initiative to reduce the public API surface, this classes can be internalized. I've checked there are no relevant OSS usages: - [RCTLog](https://github.com/search?type=code&q=NOT+is%3Afork+NOT+org%3Afacebook+NOT+repo%3Areact-native-tvos%2Freact-native-tvos+NOT+repo%3Anuagoz%2Freact-native+NOT+repo%3A2lambda123%2Freact-native+NOT+repo%3Abeanchips%2Ffacebookreactnative+NOT+repo%3AfabOnReact%2Freact-native-notes+NOT+user%3Ahuntie+NOT+repo%3AMaxdev18%2Fpowersync_app+NOT+repo%3Acarter-0%2Finstagram-decompiled+NOT+repo%3Am0mosenpai%2Finstadamn+NOT+repo%3AA-Star100%2FA-Star100-AUG2-2024+NOT+repo%3Alclnrd%2Fdetox-scrollview-reproductible+NOT+repo%3ADionisisChytiris%2FWorldWiseTrivia_Main+NOT+repo%3Apast3l%2Fhi2+NOT+repo%3AoneDotpy%2FCaribouQuest+NOT+repo%3Abejayoharen%2Fdailytodo+NOT+repo%3Amolangning%2Freversing-discord+NOT+repo%3AScottPrzy%2Freact-native+NOT+repo%3Agabrieldonadel%2Freact-native-visionos+NOT+repo%3AGabriel2308%2FTestes-Soft+NOT+repo%3Adawnzs03%2FflakyBuild+NOT+repo%3Acga2351%2Fcode+NOT+repo%3Astreeg%2Ftcc+NOT+repo%3Asoftware-mansion-labs%2Freact-native-swiftui+com.facebook.react.util.RCTLog) - 
[ExceptionDataHelper](https://github.com/search?type=code&q=NOT+is%3Afork+NOT+org%3Afacebook+NOT+repo%3Areact-native-tvos%2Freact-native-tvos+NOT+repo%3Anuagoz%2Freact-native+NOT+repo%3A2lambda123%2Freact-native+NOT+repo%3Abeanchips%2Ffacebookreactnative+NOT+repo%3AfabOnReact%2Freact-native-notes+NOT+user%3Ahuntie+NOT+repo%3AMaxdev18%2Fpowersync_app+NOT+repo%3Acarter-0%2Finstagram-decompiled+NOT+repo%3Am0mosenpai%2Finstadamn+NOT+repo%3AA-Star100%2FA-Star100-AUG2-2024+NOT+repo%3Alclnrd%2Fdetox-scrollview-reproductible+NOT+repo%3ADionisisChytiris%2FWorldWiseTrivia_Main+NOT+repo%3Apast3l%2Fhi2+NOT+repo%3AoneDotpy%2FCaribouQuest+NOT+repo%3Abejayoharen%2Fdailytodo+NOT+repo%3Amolangning%2Freversing-discord+NOT+repo%3AScottPrzy%2Freact-native+NOT+repo%3Agabrieldonadel%2Freact-native-visionos+NOT+repo%3AGabriel2308%2FTestes-Soft+NOT+repo%3Adawnzs03%2FflakyBuild+NOT+repo%3Acga2351%2Fcode+NOT+repo%3Astreeg%2Ftcc+NOT+repo%3Asoftware-mansion-labs%2Freact-native-swiftui+com.facebook.react.util.ExceptionDataHelper) ## Changelog: [INTERNAL] - Make RCTLog & ExceptionDataHelper internal Pull Request resolved: https://github.com/facebook/react-native/pull/49502 Test Plan: ```bash yarn test-android yarn android ``` Reviewed By: mdvacca Differential Revision: D69863875 Pulled By: cortinico fbshipit-source-id: 59f0ccbcbeba6e75b776d3bb8fd7c672f1b50994
| false
| 5
| 15
| 20
|
--- packages/react-native/ReactAndroid/api/ReactAndroid.api
@@ -5720,6 +5720,12 @@ public abstract interface class com/facebook/react/uimanager/util/ReactFindViewU
public abstract fun onViewFound (Landroid/view/View;)V
}
+public final class com/facebook/react/util/ExceptionDataHelper {
+ public static final field EXTRA_DATA_FIELD Ljava/lang/String;
+ public static final field INSTANCE Lcom/facebook/react/util/ExceptionDataHelper;
+ public static final fun getExtraDataAsJson (Lcom/facebook/react/bridge/ReadableMap;)Ljava/lang/String;
+}
+
public final class com/facebook/react/util/JSStackTrace {
public static final field COLUMN_KEY Ljava/lang/String;
public static final field FILE_KEY Ljava/lang/String;
@@ -5729,6 +5735,10 @@ public final class com/facebook/react/util/JSStackTrace {
public static final fun format (Ljava/lang/String;Lcom/facebook/react/bridge/ReadableArray;)Ljava/lang/String;
}
+public abstract interface class com/facebook/react/util/RCTLog : com/facebook/react/bridge/JavaScriptModule {
+ public abstract fun logIfNoNativeHook (Ljava/lang/String;Ljava/lang/String;)V
+}
+
public final class com/facebook/react/util/RNLog {
public static final field ADVICE I
public static final field ERROR I
--- packages/react-native/ReactAndroid/src/main/java/com/facebook/react/util/ExceptionDataHelper.kt
@@ -14,12 +14,12 @@ import com.facebook.react.bridge.ReadableType
import java.io.IOException
import java.io.StringWriter
-internal object ExceptionDataHelper {
+public object ExceptionDataHelper {
- const val EXTRA_DATA_FIELD: String = "extraData"
+ public const val EXTRA_DATA_FIELD: String = "extraData"
@JvmStatic
- fun getExtraDataAsJson(metadata: ReadableMap?): String? {
+ public fun getExtraDataAsJson(metadata: ReadableMap?): String? {
if (metadata == null || metadata.getType(EXTRA_DATA_FIELD) == ReadableType.Null) {
return null
}
--- packages/react-native/ReactAndroid/src/main/java/com/facebook/react/util/RCTLog.kt
@@ -14,12 +14,12 @@ import com.facebook.react.bridge.JavaScriptModule
*
* The RCTLog module allows for showing native logs in JavaScript.
*/
-internal interface RCTLog : JavaScriptModule {
+public interface RCTLog : JavaScriptModule {
/**
* Send a log to JavaScript.
*
* @param level The level of the log.
* @param message The message to log.
*/
- fun logIfNoNativeHook(level: String?, message: String?)
+ public fun logIfNoNativeHook(level: String?, message: String?)
}
|
react-native
|
facebook
|
C++
|
C++
| 120,863
| 24,536
|
A framework for building native applications using React
|
facebook_react-native
|
CODE_IMPROVEMENT
|
refactoring
|
0a8b4ca31ec9b7af7b9f1256abab0f983292e49c
|
2024-02-15 21:44:08
|
hugo-syn
|
chore: fix typo in CHANGELOG.md (#3829) Fix typo in CHANGELOG.md
---------
Co-authored-by: Jon Shier <[email protected]>
| false
| 2
| 2
| 4
|
--- CHANGELOG.md
@@ -938,14 +938,14 @@ Released on 2019-03-27. All issues associated with this milestone can be found u
#### Updated
-- Project for compatibility with Xcode 10.2.
+- Project for compatability with Xcode 10.2.
- Updated by [Jon Shier](https://github.com/jshier) in Pull Request [#2767](https://github.com/Alamofire/Alamofire/pull/2767).
- MultipartFormData to have a mutable boundary.
- Updated by [Ondrej Stocek](https://github.com/ondrejstocek) in Pull Request [#2705](https://github.com/Alamofire/Alamofire/pull/2705).
#### Fixed
-- Compatibility with SPM from Xcode 10.2.
+- Compatability with SPM from Xcode 10.2.
- Fixed by [Klaas](https://github.com/klaas) in Pull Request [#2762](https://github.com/Alamofire/Alamofire/pull/2762).
## [4.8.1](https://github.com/Alamofire/Alamofire/releases/tag/4.8.1)
|
alamofire
|
alamofire
|
Swift
|
Swift
| 41,720
| 7,598
|
Elegant HTTP Networking in Swift
|
alamofire_alamofire
|
DOC_CHANGE
|
changes in md file
|
f81d9ddb76cf4a404a3d38d8b38f3671574ebac6
| null |
lissyx
|
Bring back CUDA and CuDNN versions.
| false
| 1
| 1
| 0
|
--- README.md
@@ -90,7 +90,7 @@ Please refer to your system's documentation on how to install these dependencies
### CUDA dependency
-The GPU capable builds (Python, NodeJS, C++, etc) depend on the same CUDA runtime as upstream TensorFlow. Make sure you've installed the correct version of CUDA
+The GPU capable builds (Python, NodeJS, C++, etc) depend on the same CUDA runtime as upstream TensorFlow. Currently with TensorFlow 1.13 it depends on CUDA 10.0 and CuDNN v7.5.
### Getting the pre-trained model
|
mozilla_DeepSpeech.json
| null | null | null | null | null | null |
mozilla_DeepSpeech.json
|
CONFIG_CHANGE
|
5, obvious
|
4d43977d1bf4eb1009f8f0a910d3b77d210c2efc
|
2025-03-25 23:55:31
|
David Dunleavy
|
Update tags documentation for `notap` `notap` can be seen as the equivalent to `no_oss` internally PiperOrigin-RevId: 740411905
| false
| 1
| 3
| 4
|
--- third_party/xla/build_tools/lint/tags.py
@@ -41,7 +41,9 @@ _TAGS_TO_DOCUMENTATION_MAP = {
" doesn't build things tagged with this either."
),
# Various disable tags (currently *unrecognized* by OpenXLA CI)
- "notap": "Internal tag which disables the test. Not used on OpenXLA CI.",
+ "notap": (
+ "Internal tag which disables the test. Will be extended to OpenXLA CI."
+ ),
"nosan": "Disabled under all sanitizers. Not used on OpenXLA CI.",
"noasan": "Disabled under asan. Not used on OpenXLA CI.",
"nomsan": "Disabled under msan. Not used on OpenXLA CI.",
|
tensorflow
|
tensorflow
|
C++
|
C++
| 188,388
| 74,565
|
An Open Source Machine Learning Framework for Everyone
|
tensorflow_tensorflow
|
DOC_CHANGE
|
updates the `notap` tag documentation text
|
fe68f708ce723cef640c1cf784cb29da513bca22
| null |
Martin Bean
|
Removed funky characters.
| false
| 1
| 1
| 0
|
--- javascript.html
@@ -76,7 +76,7 @@
<div class="container">
<!-- Modal
- ================================================== -->
+ ================================================== -->
<section id="modal">
<div class="page-header">
|
twbs_bootstrap.json
| null | null | null | null | null | null |
twbs_bootstrap.json
|
CODE_IMPROVEMENT
|
5, obvious
|
a4a9a8cd8ccb4240a7c5df5f6766bd5340646e63
|
2023-03-01 15:53:32
|
Julian Suarez
|
feat(rvm): add `rb32` alias (#11533)
| false
| 2
| 0
| 2
|
--- plugins/rvm/README.md
@@ -24,7 +24,6 @@ plugins=(... rvm)
| `rb27` | `rvm use ruby-2.7` |
| `rb30` | `rvm use ruby-3.0` |
| `rb31` | `rvm use ruby-3.1` |
-| `rb32` | `rvm use ruby-3.2` |
| `rvm-update` | `rvm get head` |
| `gems` | `gem list` |
| `rvms` | `rvm gemset` |
--- plugins/rvm/rvm.plugin.zsh
@@ -27,7 +27,6 @@ rubies=(
27 'ruby-2.7'
30 'ruby-3.0'
31 'ruby-3.1'
- 32 'ruby-3.2'
)
for v in ${(k)rubies}; do
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
NEW_FEAT
|
Obvious
|
a30505f842e0a906d1d8e059071698791c999ca7
|
2024-12-02 04:45:21
|
simonhamp
|
Fix styling
| false
| 1
| 3
| 4
|
--- tests/Fakes/FakeGlobalShortcutTest.php
@@ -1,8 +1,9 @@
<?php
-use Native\Laravel\Contracts\GlobalShortcut as GlobalShortcutContract;
use Native\Laravel\Facades\GlobalShortcut;
+use Native\Laravel\Contracts\GlobalShortcut as GlobalShortcutContract;
use Native\Laravel\Fakes\GlobalShortcutFake;
+
use PHPUnit\Framework\AssertionFailedError;
use function Pest\Laravel\swap;
@@ -120,3 +121,4 @@ it('asserts unregistered count', function () {
$this->fail('Expected assertion to fail');
});
+
|
laravel
|
nativephp
|
PHP
|
PHP
| 3,498
| 182
|
Laravel wrapper for the NativePHP framework
|
nativephp_laravel
|
CODE_IMPROVEMENT
|
code-style only (import order/blank lines), no behavior change
|
67338703aa52d662998733e58671dc9fe1edae47
|
2025-03-14 06:01:04
|
lauren
|
[ci] Update yarn and node_modules cache key (#32603) Now that the compiler lint rule is merged into eslint-plugin-react-hooks, we also need to update our caches so compiler dependencies are also cached. This should fix the CI walltime regression we are now seeing. --- [//]: # (BEGIN SAPLING FOOTER) Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/facebook/react/pull/32603). * #32604 * __->__ #32603
| false
| 66
| 41
| 107
|
--- .github/workflows/runtime_build_and_test.yml
@@ -50,13 +50,13 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
@@ -74,18 +74,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- run: |
yarn generate-inline-fizz-runtime
git diff --quiet || (echo "There was a change to the Fizz runtime. Run `yarn generate-inline-fizz-runtime` and check in the result." && false)
@@ -102,18 +100,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- run: yarn flags
# ----- TESTS -----
@@ -157,18 +153,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- run: yarn test ${{ matrix.params }} --ci --shard=${{ matrix.shard }}
# ----- BUILD -----
@@ -189,7 +183,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- uses: actions/setup-java@v4
with:
distribution: temurin
@@ -199,12 +193,10 @@ jobs:
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- run: yarn build --index=${{ matrix.worker_id }} --total=20 --r=${{ matrix.release_channel }} --ci
env:
CI: github
@@ -269,18 +261,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -303,18 +293,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -352,18 +340,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -389,18 +375,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -423,13 +407,13 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: 'fixtures/dom/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: fixtures_dom-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('fixtures/dom/yarn.lock') }}
+ key: fixtures_dom-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
@@ -463,7 +447,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
# Fixture copies some built packages from the workroot after install.
# That means dependencies of the built packages are not installed.
# We need to install dependencies of the workroot to fulfill all dependency constraints
@@ -472,12 +456,10 @@ jobs:
id: node_modules
with:
path: "**/node_modules"
- key: fixtures_flight-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: fixtures_flight-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -530,18 +512,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -583,18 +563,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build
uses: actions/download-artifact@v4
with:
@@ -622,13 +600,13 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
@@ -648,8 +626,6 @@ jobs:
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Restore archived build for PR
uses: actions/download-artifact@v4
with:
--- .github/workflows/runtime_eslint_plugin_e2e.yml
@@ -35,18 +35,16 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
- path: "**/node_modules"
- key: runtime-eslint_e2e-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ path: "node_modules"
+ key: runtime-eslint_e2e-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
- - run: yarn install --frozen-lockfile
- working-directory: compiler
- name: Build plugin
working-directory: fixtures/eslint-v${{ matrix.eslint_major }}
run: node build.mjs
--- .github/workflows/runtime_prereleases.yml
@@ -34,13 +34,13 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-release-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-release-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock', 'scripts/release/yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
--- .github/workflows/runtime_releases_from_npm_manual.yml
@@ -66,13 +66,13 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
id: node_modules
with:
path: "**/node_modules"
- key: runtime-release-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
+ key: runtime-release-node_modules-${{ runner.arch }}-${{ runner.os }}-${{ hashFiles('yarn.lock', 'scripts/release/yarn.lock') }}
- name: Ensure clean build directory
run: rm -rf build
- run: yarn install --frozen-lockfile
--- .github/workflows/shared_lint.yml
@@ -24,7 +24,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
with:
@@ -44,7 +44,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
with:
@@ -64,7 +64,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
with:
@@ -84,7 +84,7 @@ jobs:
with:
node-version-file: '.nvmrc'
cache: yarn
- cache-dependency-path: '**/yarn.lock'
+ cache-dependency-path: yarn.lock
- name: Restore cached node_modules
uses: actions/cache@v4
with:
--- package.json
@@ -118,7 +118,7 @@
"testRegex": "/scripts/jest/dont-run-jest-directly\\.js$"
},
"scripts": {
- "prebuild": "./scripts/react-compiler/link-compiler.sh",
+ "prebuild": "yarn --cwd compiler install --frozen-lockfile && ./scripts/react-compiler/link-compiler.sh",
"build": "node ./scripts/rollup/build-all-release-channels.js",
"build-for-devtools": "cross-env RELEASE_CHANNEL=experimental yarn build react/index,react/jsx,react/compiler-runtime,react-dom/index,react-dom/client,react-dom/unstable_testing,react-dom/test-utils,react-is,react-debug-tools,scheduler,react-test-renderer,react-refresh,react-art --type=NODE",
"build-for-devtools-dev": "yarn build-for-devtools --type=NODE_DEV",
--- scripts/react-compiler/build-compiler.sh
@@ -11,4 +11,5 @@ if [[ "$REACT_CLASS_EQUIVALENCE_TEST" == "true" ]]; then
fi
echo "Building babel-plugin-react-compiler..."
+yarn --cwd compiler install --frozen-lockfile
yarn --cwd compiler workspace babel-plugin-react-compiler build --dts
| react | facebook | JavaScript | JavaScript | 232,878 | 47,794 | The library for web and native user interfaces. | facebook_react | BUG_FIX | Code change: bug removal |
| 5b979f006cc918c39d93292f2fa321c42cc97cba | 2022-11-08 05:34:06 | Yuze Jiang | Revert "Fix use of Timer method not available in macOS 10.11" This reverts commit 26d8cac6dbe3f1bed6fd3bbba1a0df6e712da73b. | false | 21 | 36 | 57 |
--- iina/AppDelegate.swift
@@ -84,6 +84,9 @@ class AppDelegate: NSObject, NSApplicationDelegate {
return PreferenceWindowController(viewControllers: list)
}()
+ /// Whether the shutdown sequence timed out.
+ private var timedOut = false
+
@IBOutlet weak var menuController: MenuController!
@IBOutlet weak var dockMenu: NSMenu!
@@ -300,6 +303,28 @@ class AppDelegate: NSObject, NSApplicationDelegate {
return Preference.bool(for: .quitWhenNoOpenedWindow)
}
+ @objc
+ func shutdownTimedout() {
+ timedOut = true
+ Logger.log("Timed out waiting for players to stop and shutdown", level: .warning)
+ // For debugging list players that have not terminated.
+ for player in PlayerCore.playerCores {
+ let label = player.label ?? "unlabeled"
+ if !player.isStopped {
+ Logger.log("Player \(label) failed to stop", level: .warning)
+ } else if !player.isShutdown {
+ Logger.log("Player \(label) failed to shutdown", level: .warning)
+ }
+ }
+ // For debugging purposes we do not remove observers in case players stop or shutdown after
+ // the timeout has fired as knowing that occurred maybe useful for debugging why the
+ // termination sequence failed to complete on time.
+ Logger.log("Not waiting for players to shutdown; proceeding with application termination",
+ level: .warning)
+ // Tell Cocoa to proceed with termination.
+ NSApp.reply(toApplicationShouldTerminate: true)
+ }
+
func applicationShouldTerminate(_ sender: NSApplication) -> NSApplication.TerminateReply {
Logger.log("App should terminate")
isTerminating = true
@@ -356,26 +381,16 @@ class AppDelegate: NSObject, NSApplicationDelegate {
// arbitrary timeout that forces termination to complete. The expectation is that this timeout
// is never triggered. If a timeout warning is logged during termination then that needs to be
// investigated.
- var timedOut = false
- let timer = Timer(timeInterval: 10, repeats: false) { _ in
- timedOut = true
- Logger.log("Timed out waiting for players to stop and shutdown", level: .warning)
- // For debugging list players that have not terminated.
- for player in PlayerCore.playerCores {
- let label = player.label ?? "unlabeled"
- if !player.isStopped {
- Logger.log("Player \(label) failed to stop", level: .warning)
- } else if !player.isShutdown {
- Logger.log("Player \(label) failed to shutdown", level: .warning)
- }
+ var timer: Timer
+ if #available(macOS 10.12, *) {
+ timer = Timer(timeInterval: 10, repeats: false) { _ in
+ // Once macOS 10.11 is no longer supported the contents of the method can be inlined in this
+ // closure.
+ self.shutdownTimedout()
}
- // For debugging purposes we do not remove observers in case players stop or shutdown after
- // the timeout has fired as knowing that occurred maybe useful for debugging why the
- // termination sequence failed to complete on time.
- Logger.log("Not waiting for players to shutdown; proceeding with application termination",
- level: .warning)
- // Tell Cocoa to proceed with termination.
- NSApp.reply(toApplicationShouldTerminate: true)
+ } else {
+ timer = Timer(timeInterval: TimeInterval(10), target: self,
+ selector: #selector(self.shutdownTimedout), userInfo: nil, repeats: false)
}
RunLoop.main.add(timer, forMode: .common)
@@ -383,7 +398,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
let center = NotificationCenter.default
var observers: [NSObjectProtocol] = []
var observer = center.addObserver(forName: .iinaPlayerStopped, object: nil, queue: .main) { note in
- guard !timedOut else {
+ guard !self.timedOut else {
// The player has stopped after IINA already timed out, gave up waiting for players to
// shutdown, and told Cocoa to proceed with termination. AppKit will continue to process
// queued tasks during application termination even after AppKit has called
@@ -409,7 +424,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
// Establish an observer for a player core shutting down.
observer = center.addObserver(forName: .iinaPlayerShutdown, object: nil, queue: .main) { _ in
- guard !timedOut else {
+ guard !self.timedOut else {
// The player has shutdown after IINA already timed out, gave up waiting for players to
// shutdown, and told Cocoa to proceed with termination. AppKit will continue to process
// queued tasks during application termination even after AppKit has called
| iina | iina | Swift | Swift | 39,591 | 2,605 | The modern video player for macOS. | iina_iina | BUG_FIX | this commit fixes/polishes an earlier feature |
| a60f45ce31b5486e817b6615b8db1efb4fa34b31 | 2025-03-20 09:36:13 | 2dust | The timeout was changed from 30s to 15s | false | 9 | 9 | 18 |
--- V2rayNG/app/src/main/java/com/v2ray/ang/handler/AngConfigManager.kt
@@ -420,7 +420,7 @@ object AngConfigManager {
var configText = try {
val httpPort = SettingsManager.getHttpPort()
- HttpUtil.getUrlContentWithUserAgent(url, 15000, httpPort)
+ HttpUtil.getUrlContentWithUserAgent(url, 30000, httpPort)
} catch (e: Exception) {
Log.e(AppConfig.ANG_PACKAGE, "Update subscription: proxy not ready or other error, try……")
//e.printStackTrace()
--- V2rayNG/app/src/main/java/com/v2ray/ang/handler/SpeedtestManager.kt
@@ -135,7 +135,7 @@ object SpeedtestManager {
var result: String
var elapsed = -1L
- val conn = HttpUtil.createProxyConnection(SettingsManager.getDelayTestUrl(), port, 15000, 15000) ?: return Pair(elapsed, "")
+ val conn = HttpUtil.createProxyConnection(SettingsManager.getDelayTestUrl(), port, 30000, 30000) ?: return Pair(elapsed, "")
try {
val start = SystemClock.elapsedRealtime()
val code = conn.responseCode
--- V2rayNG/app/src/main/java/com/v2ray/ang/ui/UserAssetActivity.kt
@@ -206,9 +206,9 @@ class UserAssetActivity : BaseActivity() {
var resultCount = 0
lifecycleScope.launch(Dispatchers.IO) {
assets.forEach {
- var result = downloadGeo(it.second, 15000, httpPort)
+ var result = downloadGeo(it.second, 30000, httpPort)
if (!result) {
- result = downloadGeo(it.second, 15000, 0)
+ result = downloadGeo(it.second, 30000, 0)
}
if (result)
resultCount++
--- V2rayNG/app/src/main/java/com/v2ray/ang/util/HttpUtil.kt
@@ -50,7 +50,7 @@ object HttpUtil {
* @throws IOException If an I/O error occurs.
*/
@Throws(IOException::class)
- fun getUrlContentWithUserAgent(url: String?, timeout: Int = 15000, httpPort: Int = 0): String {
+ fun getUrlContentWithUserAgent(url: String?, timeout: Int = 30000, httpPort: Int = 0): String {
var currentUrl = url
var redirects = 0
val maxRedirects = 3
@@ -88,16 +88,16 @@ object HttpUtil {
*
* @param urlStr The target URL address.
* @param port The port of the proxy server.
- * @param connectTimeout The connection timeout in milliseconds (default is 15000 ms).
- * @param readTimeout The read timeout in milliseconds (default is 15000 ms).
+ * @param connectTimeout The connection timeout in milliseconds (default is 30000 ms).
+ * @param readTimeout The read timeout in milliseconds (default is 30000 ms).
* @param needStream Whether the connection needs to support streaming.
* @return Returns a configured HttpURLConnection object, or null if it fails.
*/
fun createProxyConnection(
urlStr: String,
port: Int,
- connectTimeout: Int = 15000,
- readTimeout: Int = 15000,
+ connectTimeout: Int = 30000,
+ readTimeout: Int = 30000,
needStream: Boolean = false
): HttpURLConnection? {
| v2rayng | 2dust | Kotlin | Kotlin | 38,863 | 5,828 | A V2Ray client for Android, support Xray core and v2fly core | 2dust_v2rayng | PERF_IMPROVEMENT | Obvious |
| 6cf5176afed731e4f6c36942529296fa5ddd40a1 | 2025-02-13 17:52:46 | Costa Tsaousis | Revert "fix windows logs" (#19639) Revert "fix windows logs (#19632)" This reverts commit d8c3dc087c7285400b229f972d081e1df340fbf2. | false | 18 | 19 | 37 |
--- src/libnetdata/log/nd_log-common.h
@@ -117,16 +117,16 @@ typedef enum __attribute__((__packed__)) {
NDF_ALERT_SUMMARY = 60,
NDF_ALERT_INFO = 61,
NDF_ALERT_NOTIFICATION_REALTIME_USEC = 62,
- // NDF_ALERT_FLAGS,
NDF_STACK_TRACE = 63, // stack trace of the thread logging
-
- // put new items here
- // leave the request URL and the message last
-
NDF_REQUEST = 64, // the request we are currently working on
NDF_MESSAGE = 65, // the log message, if any
+ // DO NOT CHANGE OR RENUMBER ANY OF THE ABOVE
+ // THEY ARE HARDCODED INTO THE WEVT MANIFEST!
+
+ // put new items here
+
// terminator
_NDF_MAX,
} ND_LOG_FIELD_ID;
--- src/libnetdata/log/wevt_netdata_mc_generate.c
@@ -67,35 +67,35 @@ const char *get_msg_symbol(MESSAGE_ID msg) {
const char *get_msg_format(MESSAGE_ID msg) {
switch(msg) {
case MSGID_MESSAGE_ONLY:
- return "%2(%12): %64\r\n";
+ return "%2(%12): %65\r\n";
case MSGID_MESSAGE_ERRNO:
- return "%2(%12): %64%n\r\n"
+ return "%2(%12): %65%n\r\n"
"%n\r\n"
" Unix Errno : %5%n\r\n"
" Windows Error: %6%n\r\n"
;
case MSGID_REQUEST_ONLY:
- return "%2(%12): %63\r\n";
+ return "%2(%12): %64\r\n";
case MSGID_ACCESS_MESSAGE:
- return "%64\r\n";
+ return "%65\r\n";
case MSGID_ACCESS_MESSAGE_REQUEST:
- return "%64%n\r\n"
+ return "%65%n\r\n"
"%n\r\n"
- " Request: %63%n\r\n"
+ " Request: %64%n\r\n"
;
case MSGID_ACCESS_MESSAGE_USER:
- return "%64%n\r\n"
+ return "%65%n\r\n"
"%n\r\n"
" User: %21, role: %22, permissions: %23%n\r\n"
;
case MSGID_ACCESS:
- return "%33 %63%n\r\n"
+ return "%33 %64%n\r\n"
"%n\r\n"
" Response Code : %34%n\r\n"
" Transaction ID: %36%n\r\n"
@@ -103,7 +103,7 @@ const char *get_msg_format(MESSAGE_ID msg) {
;
case MSGID_ACCESS_USER:
- return "%33 %63%n\r\n"
+ return "%33 %64%n\r\n"
"%n\r\n"
" Response Code : %34%n\r\n"
" Transaction ID: %36%n\r\n"
@@ -112,7 +112,7 @@ const char *get_msg_format(MESSAGE_ID msg) {
;
case MSGID_ACCESS_FORWARDER:
- return "%33 %63%n\r\n"
+ return "%33 %64%n\r\n"
"%n\r\n"
" Response Code : %34%n\r\n"
" Transaction ID: %36%n\r\n"
@@ -120,7 +120,7 @@ const char *get_msg_format(MESSAGE_ID msg) {
;
case MSGID_ACCESS_FORWARDER_USER:
- return "%33 %63%n\r\n"
+ return "%33 %64%n\r\n"
"%n\r\n"
" Response Code : %34%n\r\n"
" Transaction ID: %36%n\r\n"
@@ -282,8 +282,9 @@ int main(int argc, const char **argv) {
" <data name=\"AlertSummary\" inType=\"win:UnicodeString\"/> <!-- 60 (NDF_ALERT_SUMMARY) -->\r\n"
" <data name=\"AlertInfo\" inType=\"win:UnicodeString\"/> <!-- 61 (NDF_ALERT_INFO) -->\r\n"
" <data name=\"AlertNotificationTime\" inType=\"win:UnicodeString\"/> <!-- 62 (NDF_ALERT_NOTIFICATION_REALTIME_USEC) -->\r\n"
- " <data name=\"Request\" inType=\"win:UnicodeString\"/> <!-- 63 (NDF_REQUEST) -->\r\n"
- " <data name=\"Message\" inType=\"win:UnicodeString\"/> <!-- 64 (NDF_MESSAGE) -->\r\n"
+ " <data name=\"StackTrace\" inType=\"win:UnicodeString\"/> <!-- 63 (NDF_STACK_TRACE) -->\r\n"
+ " <data name=\"Request\" inType=\"win:UnicodeString\"/> <!-- 64 (NDF_REQUEST) -->\r\n"
+ " <data name=\"Message\" inType=\"win:UnicodeString\"/> <!-- 65 (NDF_MESSAGE) -->\r\n"
" </template>\r\n"
" </templates>\r\n"
"\r\n"
| netdata | netdata | C | C | 73,681 | 6,023 | X-Ray Vision for your infrastructure! | netdata_netdata | BUG_FIX | Obvious |
| ec77849cb238abe0b6cebba600ae1c421ac2ac21 | 2023-07-10 22:05:42 | Christopher Helmerich | Added missing dependency | false | 13 | 0 | 13 |
--- Solver/utils/tensorCell2Array.m
@@ -1,13 +0,0 @@
-function arrayTensor = tensorCell2Array(Tensor,tryGPU)
-% Converts a 4x4 cells of 4D spactime array into a single array with indexing of (mu,nu,t,x1,x2,x3)
-if nargin < 2
- tryGPU = 0;
-end
-
- arrayTensor(1,1,1,1,:,:) = Tensor.tensor;
- if tryGPU
- arrayTensor = cell2matGPU(arrayTensor);
- else
- arrayTensor = cell2mat(arrayTensor);
- end
-end
\ No newline at end of file
| warpfactory | nerdswithattitudes | MATLAB | MATLAB | 298 | 41 | WarpFactory is a numerical toolkit for analyzing warp drive spacetimes. | nerdswithattitudes_warpfactory | CONFIG_CHANGE | Matched dependenc(y|ies) in message |
| 845c8c51e5411ca0d199a8bfecab3e858d80238c | 2024-11-27 23:45:28 | Reinier van der Leer | ci: Add `merge_group` trigger to status checker | false | 1 | 0 | 1 |
--- .github/workflows/repo-workflow-checker.yml
@@ -2,7 +2,6 @@ name: Repo - PR Status Checker
on:
pull_request:
types: [opened, synchronize, reopened]
- merge_group:
jobs:
status-check:
| autogpt | significant-gravitas | Python | Python | 172,255 | 45,197 | AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters. | significant-gravitas_autogpt | CONFIG_CHANGE | Only config file changes have been made. |
| 63004475aacb91cbd6434177ba626cd8148880c7 | 2024-10-09 16:06:51 | Vedansh | Improve Go Roadmap (#7399) * Go Basics. * Go Advanced. * 102,103,104,105,106 * Everything Else. | false | 74 | 101 | 175 |
--- src/data/roadmaps/golang/content/100-go-basics/101-variables.md
@@ -1,6 +1,6 @@
# Variables in Go
-Variable is the name given to a memory location to store a value of a specific type. Go provides multiple ways to declare and use variables.
+Variable is the name given to a memory location to store a value of a specific [type](https://golangbot.com/types/). Go provides multiple ways to declare and use variables.
Visit the following resources to learn more:
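The variables page edited above mentions that Go provides multiple ways to declare variables; a minimal sketch of the common forms (illustration only, not part of the commit):

```go
package main

import "fmt"

// declarations shows the main ways to introduce variables in Go.
func declarations() (int, string, float64) {
	var a int       // explicit type; starts at the zero value, 0
	var b = "hello" // type inferred from the initializer
	c := 3.14       // short declaration, allowed only inside functions
	return a, b, c
}

func main() {
	a, b, c := declarations()
	fmt.Println(a, b, c)
}
```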
--- src/data/roadmaps/golang/content/100-go-basics/102-data-types.md
@@ -2,8 +2,10 @@
Go is a statically typed programming language, which means each variable has a type defined at first and can only hold values with that type. There are two categories of types in Go: basics types and composite types.
+To learn more about types in Go, visit these resources :
+
Visit the following resources to learn more:
+- [@article@Basic data types](https://www.w3schools.com/go/go_data_types.php)
- [@official@Tour of Go: types](https://go.dev/tour/basics/11)
- [@article@Go types with examples](https://golangbyexample.com/all-data-types-in-golang-with-examples/)
-- [@article@Data Types](https://www.w3schools.com/go/go_data_types.php)
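The data-types page above splits Go's types into basic and composite categories; a short illustration of that split (not part of the commit):

```go
package main

import "fmt"

func main() {
	// Basic types: numbers, booleans, strings.
	var n int = 42
	var f float64 = 2.5
	var ok bool = true
	var s string = "go"

	// Composite types are built from basic ones.
	nums := []int{1, 2, 3}            // slice
	ages := map[string]int{"ada": 36} // map
	type point struct{ X, Y int }     // struct
	p := point{X: 1, Y: 2}

	fmt.Println(n, f, ok, s, nums, ages, p)
}
```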
--- src/data/roadmaps/golang/content/100-go-basics/103-for-loop.md
@@ -11,4 +11,4 @@ Visit the following resources to learn more:
- [@official@For Loop in Golang](https://go.dev/tour/flowcontrol/1)
- [@official@Effective Go: For loop](https://go.dev/doc/effective_go#for)
- [@article@Go by Example: For loop](https://gobyexample.com/for)
-- [@article@5 Basic for Loop Patterns](https://yourbasic.org/golang/for-loop/)
+- [@article@5 basic for loop patterns](https://yourbasic.org/golang/for-loop/)
--- src/data/roadmaps/golang/content/100-go-basics/107-conditionals.md
@@ -9,6 +9,7 @@ Conditional statements are used to run code only if a certain condition is true.
Visit the following resources to learn more:
- [@official@Effective Go: `if` statement](https://go.dev/doc/effective_go#if)
+- [@article@Basic conditional patterns](https://yourbasic.org/golang/if-else-statement)
- [@article@Go by Example: If-Else](https://gobyexample.com/if-else)
-- [@article@Golang If-Else Statements](https://www.golangprograms.com/golang-if-else-statements.html)
-- [@article@Golang Switch Case Programs](https://www.golangprograms.com/golang-switch-case-statements.html)
+- [@article@Golang programs If-Else statement](https://www.golangprograms.com/golang-if-else-statements.html)
+- [@article@Golang programs `switch` case](https://www.golangprograms.com/golang-switch-case-statements.html)
--- src/data/roadmaps/golang/content/100-go-basics/108-functions.md
@@ -1,9 +1,14 @@
# Functions
-A function is a block of code that performs a specific task. It's a reusable unit of code that can be called from other parts of your program. Functions help you organize your code, make it more modular, and improve readability.
+Discover how functions work in Go, the list of resources below will cover:
+
+- How to define and call functions in Go?
+- Named returns in Go?
+- Handle multiple return types.
+- Different function types in Go.
Visit the following resources to learn more:
-- [@official@Effective Go: Functions](https://go.dev/doc/effective_go#functions)
- [@article@Go by Example: Functions](https://gobyexample.com/functions)
-- [@article@Functions in Go](https://www.golangprograms.com/go-language/functions.html)
+- [@article@Functions in go](https://www.golangprograms.com/go-language/functions.html)
+- [@official@Effective Go: Functions](https://go.dev/doc/effective_go#functions)
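The functions page above lists named returns and multiple return values; both can be sketched briefly (illustrative code, not from the commit):

```go
package main

import (
	"errors"
	"fmt"
)

// divide returns multiple values, the idiomatic way to report errors in Go.
func divide(a, b int) (int, error) {
	if b == 0 {
		return 0, errors.New("division by zero")
	}
	return a / b, nil
}

// minMax uses named return values; the bare return sends back lo and hi.
func minMax(xs []int) (lo, hi int) {
	lo, hi = xs[0], xs[0]
	for _, x := range xs[1:] {
		if x < lo {
			lo = x
		}
		if x > hi {
			hi = x
		}
	}
	return
}

func main() {
	q, err := divide(10, 3)
	fmt.Println(q, err)
	fmt.Println(minMax([]int{3, 1, 4}))
}
```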
--- src/data/roadmaps/golang/content/100-go-basics/109-packages.md
@@ -4,7 +4,8 @@ Packages are the most powerful part of the Go language. The purpose of a package
Visit the following resources to learn more:
-- [@official@Go Packages Explorer](https://pkg.go.dev/)
-- [@official@Standard library](https://pkg.go.dev/std)
- [@article@How to create a package in Go](https://www.golang-book.com/books/intro/11)
-- [@official@How to Manage External Dependencies in Go](https://go.dev/doc/modules/managing-dependencies)
+- [@official@How to manage external dependencies in Go](https://go.dev/doc/modules/managing-dependencies)
+- [@article@Go Packages explorer](https://pkg.go.dev/)
+- [@article@Standard library](https://pkg.go.dev/std)
+- [@article@Go Packages](https://www.programiz.com/golang/packages)
--- src/data/roadmaps/golang/content/100-go-basics/110-type-casting.md
@@ -1,6 +1,6 @@
# Type Casting
-Go doesn't support automatic type conversion, but it allows type casting, which is the process of explicitly changing the variable type.
+Go doesn't support automatic type conversion, but it allows type casting, which is the process of explicitly changing the variable type. To learn more about typecasting, visit these resources :
Visit the following resources to learn more:
--- src/data/roadmaps/golang/content/100-go-basics/111-type-inference.md
@@ -4,5 +4,5 @@ Type inference gives go the capability to detect the type of a value without bei
Visit the following resources to learn more:
-- [@official@Tour of Go: Type Inference](https://go.dev/tour/basics/14)
- [@article@Go Variables: Type Inference](https://www.callicoder.com/golang-variables-zero-values-type-inference/#type-inference)
+- [@official@Tour of Go: Type Inference](https://go.dev/tour/basics/14)
--- src/data/roadmaps/golang/content/100-go-basics/index.md
@@ -4,6 +4,6 @@ Learn the common concepts of Go like variables, loops, conditional statements, f
Visit the following resources to learn more:
-- [@official@Go Tutorial](https://go.dev/doc/tutorial/)
+- [@official@Official Go Tutorial](https://go.dev/doc/tutorial/)
- [@article@W3 Schools Go Tutorial](https://www.w3schools.com/go/index.php)
- [@feed@Explore top posts about Golang](https://app.daily.dev/tags/golang?ref=roadmapsh)
--- src/data/roadmaps/golang/content/101-go-advanced/100-go-modules.md
@@ -2,10 +2,15 @@
Go modules are a group of related packages that are versioned and distributed together. They specify the requirements of our project, list all the required dependencies, and help us keep track of the specific versions of installed dependencies.
+Modules are identified by a module path that is declared in the first line of the go.mod file in our project.
+
Visit the following resources to learn more:
- [@official@Go Modules](https://go.dev/blog/using-go-modules)
-- [@article@DigitalOcean: How to use Go Modules](https://www.digitalocean.com/community/tutorials/how-to-use-go-modules)
- [@video@Go Modules](https://www.youtube.com/watch?v=9cV1KESTJRc)
-- [@video@Go Modules Explained in 5 Minutes](https://youtu.be/7xSxIwWJ9R4)
+- [@article@DigitalOcean: How to use Go Modules](https://www.digitalocean.com/community/tutorials/how-to-use-go-modules)
+- [@video@Go Modules Explained in 5 Minutes (by Golang Dojo on YouTube)](https://youtu.be/7xSxIwWJ9R4)
+- [@official@How to create a module in Go](https://go.dev/doc/tutorial/create-module)
+- [@official@How to use modules in Go](https://go.dev/blog/using-go-modules)
+- [@article@How to modify existing projects to use Go modules](https://jfrog.com/blog/converting-projects-for-go-modules/)
- [@feed@Explore top posts about Golang](https://app.daily.dev/tags/golang?ref=roadmapsh)
--- src/data/roadmaps/golang/content/101-go-advanced/101-working-with-json.md
@@ -7,4 +7,4 @@ Visit the following resources to learn more:
- [@official@JSON](https://go.dev/blog/json)
- [@article@Guide to JSON in Golang](https://www.sohamkamani.com/golang/json/)
- [@article@JSON to GO](https://mholt.github.io/json-to-go/)
-- [@article@Comprehensive Guide to using JSON in Go](https://betterstack.com/community/guides/scaling-go/json-in-go/)
+- [@article@Comprehensive Guide to using JSON in Go](https://betterstack.com/community/guides/scaling-go/json-in-go/)
\ No newline at end of file
--- src/data/roadmaps/golang/content/101-go-advanced/102-types-and-type-assertions.md
@@ -1,8 +1,10 @@
# Types and type assertions
-Types in Golang specify the data type that a valid Go variable can hold. Golang has four categories of Types including Basic, Aggregate, Reference, and Interface Types. Type assertions in Golang provide access to the exact type of variable of an interface.
+Types in Golang specify the data type that a valid Go variable can hold. Golang has four categories of Types including Basic, Aggregate, Reference, and Interface Types
+
+Type assertions in Golang provide access to the exact type of variable of an interface.
Visit the following resources to learn more:
-- [@official@Types Assertions](https://go.dev/tour/methods/15)
-- [@video@Go Syntax - Type Assertions](https://youtube.com/watch?v=vtGbi9bGr3s)
+- [@official@Types Assertions ](https://go.dev/tour/methods/15)
+- [@video@Go Syntax - Type Assertions ](https://youtube.com/watch?v=vtGbi9bGr3s)
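The type-assertions page above describes recovering the concrete type behind an interface value; the two-value assertion and the type switch can be sketched like this (illustration only):

```go
package main

import "fmt"

// describe uses a type assertion and a type switch to recover the
// concrete type stored in an interface value.
func describe(v interface{}) string {
	// Two-value form: never panics; ok reports whether the assertion held.
	if s, ok := v.(string); ok {
		return "string: " + s
	}
	switch x := v.(type) {
	case int:
		return fmt.Sprintf("int: %d", x)
	case bool:
		return fmt.Sprintf("bool: %t", x)
	default:
		return "unknown"
	}
}

func main() {
	fmt.Println(describe("hi"), describe(7), describe(true))
}
```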
--- src/data/roadmaps/golang/content/101-go-advanced/104-context.md
@@ -4,7 +4,7 @@ The `context` package provides a standard way to solve the problem of managing t
Visit the following resources to learn more:
-- [@official@Go Context](https://pkg.go.dev/context)
+- [@article@Go Context](https://pkg.go.dev/context)
- [@article@Go by Example: Context](https://gobyexample.com/context)
- [@article@Digital Ocean: How to Use Contexts in Go](https://www.digitalocean.com/community/tutorials/how-to-use-contexts-in-go)
- [@video@Context in Go](https://www.youtube.com/watch?v=LSzR0VEraWw)
--- src/data/roadmaps/golang/content/101-go-advanced/105-goroutines.md
@@ -1,6 +1,8 @@
# Goroutines
-Goroutines allow us to write concurrent programs in Go. Things like web servers handling thousands of requests or a website rendering new pages while also concurrently making network requests are a few example of concurrency. In Go, each of these concurrent tasks are called `Goroutines`.
+Goroutines allow us to write concurrent programs in Go. Things like web servers handling thousands of requests or a website rendering new pages while also concurrently making network requests are a few example of concurrency.
+
+In Go, each of these concurrent tasks are called `Goroutines`.
Visit the following resources to learn more:
@@ -9,4 +11,5 @@ Visit the following resources to learn more:
- [@video@GoRoutines](https://www.youtube.com/watch?v=LvgVSSpwND8)
- [@video@Understanding Concurrency](https://www.youtube.com/watch?v=V-0ifUKCkBI)
- [@article@Go by Example: Goroutines](https://gobyexample.com/goroutines)
+- [@video@Golang Goroutine Basics You MUST Learn! (by Golang Dojo on YouTube)](https://youtu.be/oHIbeTmmTaA)
- [@feed@Explore top posts about Golang](https://app.daily.dev/tags/golang?ref=roadmapsh)
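The goroutines page above cites servers handling many requests concurrently; fanning work out across goroutines and waiting for all of them can be sketched like this (hypothetical names, not from the commit):

```go
package main

import (
	"fmt"
	"sync"
)

// fetchAll runs one goroutine per URL and waits for them all,
// the shape a server uses to handle many requests concurrently.
func fetchAll(urls []string) []string {
	results := make([]string, len(urls))
	var wg sync.WaitGroup
	for i, u := range urls {
		wg.Add(1)
		go func(i int, u string) { // each call runs concurrently
			defer wg.Done()
			results[i] = "fetched " + u // stand-in for a real network call
		}(i, u)
	}
	wg.Wait() // block until every goroutine calls Done
	return results
}

func main() {
	fmt.Println(fetchAll([]string{"a", "b"}))
}
```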
--- src/data/roadmaps/golang/content/101-go-advanced/107-buffer.md
@@ -4,6 +4,6 @@ The `buffer` belongs to the byte package of the Go language, and we can use thes
Visit the following resources to learn more:
-- [@official@Buffer Examples](https://pkg.go.dev/bytes#example-Buffer)
+- [@article@Buffer Examples](https://pkg.go.dev/bytes#example-Buffer)
- [@article@Buffer](https://www.educba.com/golang-buffer/)
- [@video@Buffers in Golang](https://www.youtube.com/watch?v=NoDRq6Twkts)
--- src/data/roadmaps/golang/content/101-go-advanced/109-mutex.md
@@ -1,8 +1,7 @@
# Mutex
-Go allows us to run code concurrently using goroutines. However, when concurrent processes access the same piece of data, it can lead to race conditions. Mutexes are data structures provided by the sync package. They can help us place a lock on different sections of data so that only one goroutine can access it at a time.
+Go allows us to run code concurrently using goroutines. However, when concurrent processes access the same piece of data, it can lead to [race conditions](https://www.sohamkamani.com/golang/data-races/). Mutexes are data structures provided by the [sync](https://pkg.go.dev/sync/) package. They can help us place a lock on different sections of data so that only one goroutine can access it at a time.
Visit the following resources to learn more:
- [@article@Using a Mutex in Go with Examples](https://www.sohamkamani.com/golang/mutex/)
-- [@article@Sync Package](https://pkg.go.dev/sync/)
--- src/data/roadmaps/golang/content/101-go-advanced/110-scheduler.md
@@ -1,10 +1,12 @@
# Go Scheduler
-Go Scheduler allows us to understand more deeply about how Golang works internally. In terms of logical processors, cores, threads, pool cache, context switching etc. The Go scheduler is part of the Go runtime, and the Go runtime is built into your application.
+Go Scheduler allows us to understand more deeply about how Golang works internally. In terms of logical processors,
+cores, threads, pool cache, context switching etc. The Go scheduler is part of the Go runtime, and the Go runtime
+is built into your application
Visit the following resources to learn more:
-- [@article@OS Scheduler - 1](https://www.ardanlabs.com/blog/2018/08/scheduling-in-go-part1.html)
-- [@article@Go Scheduler - 2](https://www.ardanlabs.com/blog/2018/08/scheduling-in-go-part2.html)
+- [@article@OS Scheduler](https://www.ardanlabs.com/blog/2018/08/scheduling-in-go-part1.html)
+- [@article@Go Scheduler](https://www.ardanlabs.com/blog/2018/08/scheduling-in-go-part2.html)
- [@article@Illustrated Tales of Go Runtime Scheduler](https://medium.com/@ankur_anand/illustrated-tales-of-go-runtime-scheduler-74809ef6d19b)
- [@video@Go scheduler: Implementing language with lightweight concurrency](https://www.youtube.com/watch?v=-K11rY57K7k&ab_channel=Hydra)
--- src/data/roadmaps/golang/content/102-go-building-clis/100-cobra.md
@@ -4,7 +4,7 @@ Cobra is a library for creating powerful modern CLI applications.
Visit the following resources to learn more:
-- [@opensource@Cobra Github](https://github.com/spf13/cobra)
+- [@opensource@Cobra Github Repo](https://github.com/spf13/cobra)
- [@official@Cobra Website](https://cobra.dev/)
- [@article@Cobra Package Documentation](https://pkg.go.dev/github.com/spf13/cobra)
- [@video@How to write beautiful Golang CLI](https://www.youtube.com/watch?v=SSRIn5DAmyw)
--- src/data/roadmaps/golang/content/102-go-building-clis/101-urfave-cli.md
@@ -1,11 +1,11 @@
-# Urfave CLI
+# Urfave cli
-Urfave CLI is a simple, fast, and fun package for building command line apps in Go.
+Urfave cli is a simple, fast, and fun package for building command line apps in Go.
Visit the following resources to learn more:
-- [@opensource@Urfave CLI](https://github.com/urfave/cli)
-- [@article@Urfave Website](https://cli.urfave.org/)
+- [@opensource@Urfave cli Github Repo](https://github.com/urfave/cli)
+- [@article@Urfave cli Website](https://cli.urfave.org/)
- [@article@How to Build cli in Go](https://blog.hackajob.co/how-to-build-cli-in-go/)
- [@article@Building CLI using urfave cli](https://zerokspot.com/weblog/2021/01/25/building-a-cli-using-urfave-cli/)
- [@feed@Explore top posts about CLI](https://app.daily.dev/tags/cli?ref=roadmapsh)
--- src/data/roadmaps/golang/content/103-go-orms/index.md
@@ -1,3 +1,5 @@
# ORMs
-Object–relational mapping (ORM, O/RM, and O/R mapping tool) in computer science is a programming technique for converting data between type systems using object-oriented programming languages. This creates, in effect, a "virtual object database", hence a layer of abstraction, that can be used from within the programming language. Most common ORM library in Go is GORM.
+Object–relational mapping (ORM, O/RM, and O/R mapping tool) in computer science is a programming technique for converting data between type systems using object-oriented programming languages. This creates, in effect, a "virtual object database", hence a layer of abstraction, that can be used from within the programming language.
+
+Most common ORM library in Go is [GORM](https://gorm.io/).
--- src/data/roadmaps/golang/content/104-go-web-frameworks/100-beego.md
@@ -4,4 +4,4 @@ Beego is used for rapid development of enterprise application in Go, including R
Visit the following resources to learn more:
-- [@opensource@Beego/Beego](https://github.com/beego/beego)
+- [@opensource@Github Repository](https://github.com/beego/beego)
--- src/data/roadmaps/golang/content/104-go-web-frameworks/102-revel.md
@@ -4,5 +4,5 @@ Revel organizes endpoints into Controllers. They provide easy data binding and f
Visit the following resources to learn more:
-- [@official@Revel](https://revel.github.io/tutorial/index.html)
+- [@article@Revel](https://revel.github.io/tutorial/index.html)
- [@article@Revel Packages](https://pkg.go.dev/github.com/revel/revel)
--- src/data/roadmaps/golang/content/104-go-web-frameworks/103-echo.md
@@ -4,5 +4,5 @@ Echo is a performance-focused, extensible, open-source Go web application framew
Visit the following resources to learn more:
-- [@opensource@Echo](https://github.com/labstack/echo)
-- [@official@Echo Website](https://echo.labstack.com/)
+- [@opensource@Github Repository](https://github.com/labstack/echo)
+- [@article@Official Website](https://echo.labstack.com/)
--- src/data/roadmaps/golang/content/104-go-web-frameworks/104-gorilla.md
@@ -4,5 +4,5 @@ Gorilla is a web toolkit for the Go programming language that provides useful, c
Visit the following resources to learn more:
-- [@opensource@Gorilla](https://github.com/gorilla)
-- [@official@Gorilla Toolkit](https://www.gorillatoolkit.org/)
+- [@opensource@Github Repository](https://github.com/gorilla)
+- [@article@Official Website](https://www.gorillatoolkit.org/)
--- src/data/roadmaps/golang/content/104-go-web-frameworks/105-gofiber.md
@@ -4,5 +4,5 @@ Go Fiber is an Express-inspired framework for Golang. Go Fiber is a web framewor
Visit the following resources to learn more:
-- [@opensource@Fiber](https://github.com/gofiber/fiber)
-- [@official@Official Docs](https://docs.gofiber.io/)
+- [@opensource@Github Repository](https://github.com/gofiber/fiber)
+- [@article@Official Website Docs](https://docs.gofiber.io/)
--- src/data/roadmaps/golang/content/104-go-web-frameworks/106-buffalo.md
@@ -4,5 +4,5 @@ Buffalo helps you to generate a web project that already has everything from fro
Visit the following resources to learn more:
-- [@opensource@Gobuffalo](https://github.com/gobuffalo/buffalo)
-- [@official@Official Docs](https://gobuffalo.io/)
+- [@opensource@Github Repository](https://github.com/gobuffalo/buffalo)
+- [@article@Official Website Docs](https://gobuffalo.io/)
--- src/data/roadmaps/golang/content/105-go-logging/100-zerolog.md
@@ -6,4 +6,4 @@ Zerolog's API is designed to provide both a great developer experience and stunn
Visit the following resources to learn more:
-- [@opensource@Zerolog](https://github.com/rs/zerolog)
+- [@opensource@GitHub Repository](https://github.com/rs/zerolog)
--- src/data/roadmaps/golang/content/105-go-logging/101-zap.md
@@ -4,4 +4,4 @@ Blazing fast, structured, leveled logging in Go.
Visit the following resources to learn more:
-- [@opensource@Zap](https://github.com/uber-go/zap)
+- [@opensource@GitHub Repository](https://github.com/uber-go/zap)
--- src/data/roadmaps/golang/content/105-go-logging/102-log-slog.md
@@ -4,8 +4,8 @@ The `log` and `log/slog` (since go 1.21) packages are the standard logging packa
Visit the following resources to learn more:
-- [@official@Documentation: log](https://pkg.go.dev/log)
-- [@official@Documentation: slog](https://pkg.go.dev/log/slog)
+- [@article@Official Documentation: log](https://pkg.go.dev/log)
+- [@article@Official Documentation: log/slog](https://pkg.go.dev/log/slog) `(since go 1.21)`
- [@official@Go Blog: Structured Logging with slog](https://go.dev/blog/slog)
- [@article@Go by Example: Logging](https://gobyexample.com/logging)
- [@feed@Explore top posts about Logging](https://app.daily.dev/tags/logging?ref=roadmapsh)
--- src/data/roadmaps/golang/content/106-go-realtime-communication/100-melody.md
@@ -1,8 +1,7 @@
# Melody
-Melody is websocket framework based on gorilla/websocket that abstracts away the tedious parts of handling websockets. It gets out of your way so you can write real-time apps.
+Melody is websocket framework based on [github.com/gorilla/websocket](https://github.com/gorilla/websocket) that abstracts away the tedious parts of handling websockets. It gets out of your way so you can write real-time apps.
Visit the following resources to learn more:
-- [@opensource@Melody](https://github.com/olahol/melody)
-- [@opensource@websocket](https://github.com/gorilla/websocket)
+- [@opensource@GitHub Repository](https://github.com/olahol/melody)
--- src/data/roadmaps/golang/content/106-go-realtime-communication/101-centrifugo.md
@@ -4,5 +4,5 @@ Centrifugo is an open-source scalable real-time messaging server. Centrifugo can
Visit the following resources to learn more:
-- [@opensource@Centrifugo](https://github.com/centrifugal/centrifugo)
-- [@official@Getting Started](https://centrifugal.dev/docs/getting-started/introduction)
+- [@opensource@GitHub Repository](https://github.com/centrifugal/centrifugo)
+- [@article@Getting started](https://centrifugal.dev/docs/getting-started/introduction)
--- src/data/roadmaps/golang/content/106-go-realtime-communication/index.md
@@ -1,5 +1,6 @@
# Go realtime communication
+## What is real-time communication?
Just as it says in the name, real-time communication is the handling of requests concurrently and efficiently. Whether it is a chat/messaging app, an email service, a game server or any collaborative online project (for example, Excalidraw), there are a few different ways of handling real-time communication, but the most common is through the use of WebSockets. Other options for handling real-time communications include MQTT protocol and server-sent events, among others.
Learn more from the following resources:
--- src/data/roadmaps/golang/content/107-go-api-clients/100-rest/100-heimdall.md
@@ -10,4 +10,4 @@ All HTTP methods are exposed as a fluent interface.
Visit the following resources to learn more:
-- [@opensource@Heimdall](https://github.com/gojek/heimdall)
+- [@opensource@GitHub Repository](https://github.com/gojek/heimdall)
--- src/data/roadmaps/golang/content/107-go-api-clients/100-rest/101-grequests.md
@@ -11,4 +11,4 @@ Features:
Visit the following resources to learn more:
-- [@opensource@Grequests](https://github.com/levigross/grequests)
+- [@opensource@GitHub Repository](https://github.com/levigross/grequests)
--- src/data/roadmaps/golang/content/107-go-api-clients/101-graphql/100-graphql-go.md
@@ -6,6 +6,6 @@ Visit the following resources to learn more:
- [@article@Graphql-go homepage](https://graphql-go.github.io/graphql-go.org/)
- [@article@Graphql-go documentation](https://pkg.go.dev/github.com/graphql-go/graphql)
-- [@opensource@Graphql](https://github.com/graphql-go/graphql)
+- [@opensource@Github Repository](https://github.com/graphql-go/graphql)
- [@video@GraphQL-Go - Golang Tutorial (by TechPractice on YouTube)](https://www.youtube.com/watch?v=YK7BQfQ84ws)
- [@feed@Explore top posts about GraphQL](https://app.daily.dev/tags/graphql?ref=roadmapsh)
--- src/data/roadmaps/golang/content/107-go-api-clients/101-graphql/101-gqlgen.md
@@ -4,6 +4,6 @@ According to their documentation, it's a Golang library for building GraphQL ser
Visit the following resources to learn more:
-- [@official@Gqlgen Documentation](https://gqlgen.com/)
+- [@official@Gqlgen website documentation](https://gqlgen.com/)
- [@article@Introducing gqlgen: a GraphQL Server Generator for Go](https://99designs.com.au/blog/engineering/gqlgen-a-graphql-server-generator-for-go/)
- [@video@GraphQL in Go - GQLGen Tutorial (by acklackl on YouTube)](https://www.youtube.com/watch?v=O6jYy421tGw)
--- src/data/roadmaps/golang/content/107-go-api-clients/101-graphql/index.md
@@ -1,14 +1,17 @@
# Graphql
-`GraphQL` is a query language for APIs, it offers a service that prioritizes giving just the data that the client requested and no more. Besides, you don't need to be worried about breaking changes, versioning and backwards compatibility like REST APIs. Therefore you can implement your version and auto-document your API just by using `GraphQL`.
+`GraphQL` is a query language for [APIs](https://developer.mozilla.org/en-US/docs/Glossary/API), it offers a service that prioritizes giving just the data that the client requested and no more.
+
+Besides, you don't need to be worried about breaking changes, versioning and backwards compatibility like REST APIs. Therefore you can implement your version and auto-document your API just by using `GraphQL`.
Visit the following resources to learn more:
-- [@roadmap@Visit Dedicated GraphQL Roadmap](https://roadmap.sh/graphql)
+- [@official@GraphQL Website](https://graphql.org/)
- [@official@Learn GraphQL](https://graphql.org/learn/)
- [@official@GraphQL Tutorials](https://www.graphql.com/tutorials/)
- [@article@Red Hat: What is GraphQL?](https://www.redhat.com/en/topics/api/what-is-graphql)
- [@article@Digital Ocean: An Introduction to GraphQL](https://www.digitalocean.com/community/tutorials/an-introduction-to-graphql)
+- [@article@How to GraphQL: The Fullstack Tutorial for GraphQL](https://www.howtographql.com/)
- [@video@GraphQL Full Course - Novice to Expert](https://www.youtube.com/watch?v=ed8SzALpx1Q)
- [@video@Beginner GraphQL Series (by Ben Awad on YouTube)](https://www.youtube.com/playlist?list=PLN3n1USn4xln0j_NN9k4j5hS1thsGibKi)
- [@feed@Explore top posts about GraphQL](https://app.daily.dev/tags/graphql?ref=roadmapsh)
--- src/data/roadmaps/golang/content/108-go-testing-your-apps.md
@@ -4,7 +4,7 @@ Go has a built-in testing command that we can use to test our program.
Visit the following resources to learn more:
-- [@official@Go Tutorial: Add a Test](https://go.dev/doc/tutorial/add-a-test)
+- [@official@Official Go Tutorial: Add a test](https://go.dev/doc/tutorial/add-a-test)
- [@article@Go by Example: Testing](https://gobyexample.com/testing)
- [@article@YourBasic Go: Table-driven unit tests](https://yourbasic.org/golang/table-driven-unit-test/)
- [@article@Learn Go with Tests](https://quii.gitbook.io/learn-go-with-tests/)
--- src/data/roadmaps/golang/content/109-go-microservices/101-rpcx.md
@@ -9,6 +9,6 @@ Rpcx is a RPC (Remote Procedure Call) framework like Alibaba Dubbo and Weibo Mot
Visit the following resources to learn more:
+- [@article@Rpcx English Documentation](https://en.doc.rpcx.io/)
+- [@opensource@Rpcx Github](https://github.com/smallnest/rpcx)
- [@official@Rpcx Official Website](https://rpcx.io/)
-- [@official@Rpcx Documentation](https://en.doc.rpcx.io/)
-- [@opensource@Rpcx](https://github.com/smallnest/rpcx)
--- src/data/roadmaps/golang/content/109-go-microservices/103-micro.md
@@ -4,5 +4,5 @@ It is an API first development platform. It leverages the microservices architec
Visit the following resources to learn more:
-- [@official@Micro Website](https://micro.dev/)
-- [@opensource@Micro](https://github.com/micro/micro)
+- [@official@Official Website](https://micro.dev/)
+- [@opensource@Micro Github](https://github.com/micro/micro)
--- src/data/roadmaps/golang/content/109-go-microservices/104-go-zero.md
@@ -4,7 +4,7 @@ go-zero is a web and rpc framework with lots of engineering best practices built
Visit the following resources to learn more:
-- [@official@Go-zero](https://go-zero.dev/)
-- [@official@Go-zero Docs](https://go-zero.dev/docs/introduction)
-- [@opensource@Go-Zero](https://github.com/zeromicro/go-zero)
+- [@article@Go-zero](https://go-zero.dev/)
+- [@article@Go-zero Docs](https://go-zero.dev/docs/introduction)
+- [@opensource@GitHub Repository](https://github.com/zeromicro/go-zero)
- [@feed@Explore top posts about Golang](https://app.daily.dev/tags/golang?ref=roadmapsh)
--- src/data/roadmaps/golang/content/109-go-microservices/105-protocol-buffers.md
@@ -11,7 +11,7 @@ Some of the advantages of using protocol buffers include:
Visit the following resources to learn more:
-- [@opensource@Protobuf](https://github.com/protocolbuffers/protobuf/)
+- [@opensource@Protobuf Github](https://github.com/protocolbuffers/protobuf/)
- [@article@Protobuf Doc](https://developers.google.com/protocol-buffers/)
- [@article@Protobuf with Go](https://developers.google.com/protocol-buffers/docs/gotutorial/)
- [@feed@Explore top posts about Backend Development](https://app.daily.dev/tags/backend?ref=roadmapsh)
--- src/data/roadmaps/golang/content/109-go-microservices/106-grpc-go.md
@@ -4,7 +4,7 @@ Go language implementation of gRPC(gRPC is a technology for implementing RPC API
Visit the following resources to learn more:
-- [@opensource@gRPC-go](https://github.com/grpc/grpc-go/)
+- [@opensource@gRPC-go Github](https://github.com/grpc/grpc-go/)
- [@article@gRPC-go Doc](https://pkg.go.dev/google.golang.org/grpc/)
- [@official@Basic tutorial introduction to gRPC in Go.](https://grpc.io/docs/languages/go/basics/)
- [@feed@Explore top posts about gRPC](https://app.daily.dev/tags/grpc?ref=roadmapsh)
--- src/data/roadmaps/golang/content/109-go-microservices/107-grpc-gateway.md
@@ -4,6 +4,6 @@ gRPC-Gateway creates a layer over gRPC services that will act as a RESTful servi
Visit the following resources to learn more:
-- [@opensource@Grpc-gateway](https://github.com/grpc-ecosystem/grpc-gateway/)
-- [@official@Grpc-gateway Doc](https://grpc-ecosystem.github.io/grpc-gateway/)
+- [@opensource@Grpc-gateway Github](https://github.com/grpc-ecosystem/grpc-gateway/)
+- [@article@Grpc-gateway Doc](https://grpc-ecosystem.github.io/grpc-gateway/)
- [@feed@Explore top posts about gRPC](https://app.daily.dev/tags/grpc?ref=roadmapsh)
--- src/data/roadmaps/golang/content/109-go-microservices/108-twirp.md
@@ -2,7 +2,9 @@
Twirp is a framework for service-to-service communication emphasizing simplicity and minimalism. It generates routing and serialization from API definition files and lets you focus on your application's logic instead of thinking about folderol like HTTP methods and paths and JSON.
+Twirp is similar to gRPC, but without the custom HTTP server and transport implementations: it runs on the standard library's extremely-well-tested-and-high-performance net/http Server. It can run on HTTP 1.1, not just http/2, and supports JSON serialization for easy debugging.
+
Visit the following resources to learn more:
-- [@opensource@Twirp](https://github.com/twitchtv/twirp)
-- [@official@Getting Started](https://twitchtv.github.io/twirp/docs/intro.html)
+- [@opensource@GitHub Repository](https://github.com/twitchtv/twirp)
+- [@article@Getting started](https://twitchtv.github.io/twirp/docs/intro.html)
--- src/data/roadmaps/golang/content/109-go-microservices/index.md
@@ -4,7 +4,7 @@ Microservices are an architectural approach to software development that allows
Visit the following resources to learn more:
-- [@article@Introduction to Microservices](https://developer.ibm.com/learningpaths/get-started-application-modernization/intro-microservices/introduction/)
+- [@article@Introduction to microservices](https://developer.ibm.com/learningpaths/get-started-application-modernization/intro-microservices/introduction/)
- [@official@Microservice Patterns and Resources by Chris Richardson](https://microservices.io/index.html)
- [@article@Microservices AntiPatterns and Pitfalls - Mark Richards](https://www.oreilly.com/content/microservices-antipatterns-and-pitfalls/)
- [@article@Building Microservices, 2nd Edition - Sam Newman](https://samnewman.io/books/building_microservices_2nd_edition/)
|
developer-roadmap
|
kamranahmedse
|
TypeScript
|
TypeScript
| 309,677
| 40,429
|
Interactive roadmaps, guides and other educational content to help developers grow in their careers.
|
kamranahmedse_developer-roadmap
|
DOC_CHANGE
|
changes in md files
|
1c55a0fe5246487ec9f18e03b7f28862b76cb7ab
|
2023-09-21 16:13:00
|
Marcus Müller
|
feat(dnf): use `dnf5` if available (#11904) Co-authored-by: Marcus Müller <[email protected]>
| false
| 19
| 12
| 31
|
--- plugins/dnf/README.md
@@ -10,9 +10,6 @@ To use it, add `dnf` to the plugins array in your zshrc file:
plugins=(... dnf)
```
-Classic `dnf` is getting superseded by `dnf5`; this plugin detects the presence
-of `dnf5` and uses it as drop-in alternative to the slower `dnf`.
-
## Aliases
| Alias | Command | Description |
--- plugins/dnf/dnf.plugin.zsh
@@ -1,19 +1,15 @@
## Aliases
-local dnfprog="dnf"
-# Prefer dnf5 if installed
-command -v dnf5 > /dev/null && dnfprog=dnf5
+alias dnfl="dnf list" # List packages
+alias dnfli="dnf list installed" # List installed packages
+alias dnfgl="dnf grouplist" # List package groups
+alias dnfmc="dnf makecache" # Generate metadata cache
+alias dnfp="dnf info" # Show package information
+alias dnfs="dnf search" # Search package
-alias dnfl="${dnfprog} list" # List packages
-alias dnfli="${dnfprog} list installed" # List installed packages
-alias dnfgl="${dnfprog} grouplist" # List package groups
-alias dnfmc="${dnfprog} makecache" # Generate metadata cache
-alias dnfp="${dnfprog} info" # Show package information
-alias dnfs="${dnfprog} search" # Search package
-
-alias dnfu="sudo ${dnfprog} upgrade" # Upgrade package
-alias dnfi="sudo ${dnfprog} install" # Install package
-alias dnfgi="sudo ${dnfprog} groupinstall" # Install package group
-alias dnfr="sudo ${dnfprog} remove" # Remove package
-alias dnfgr="sudo ${dnfprog} groupremove" # Remove package group
-alias dnfc="sudo ${dnfprog} clean all" # Clean cache
+alias dnfu="sudo dnf upgrade" # Upgrade package
+alias dnfi="sudo dnf install" # Install package
+alias dnfgi="sudo dnf groupinstall" # Install package group
+alias dnfr="sudo dnf remove" # Remove package
+alias dnfgr="sudo dnf groupremove" # Remove package group
+alias dnfc="sudo dnf clean all" # Clean cache
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
PERF_IMPROVEMENT
|
Obvious
|
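The `command -v dnf5` detection removed in the diff above is a generally useful shell pattern: prefer a newer binary when it is on `PATH`, otherwise fall back to the classic one. A minimal, self-contained sketch — `pick_prog` is a hypothetical helper name, not part of the plugin:

```shell
#!/bin/sh
# Prefer the first binary if it is installed, otherwise use the fallback.
pick_prog() {
  # $1 = preferred (newer) binary, $2 = fallback binary
  if command -v "$1" > /dev/null 2>&1; then
    printf '%s\n' "$1"
  else
    printf '%s\n' "$2"
  fi
}

# `sh` is always on PATH, so it wins over the fallback here.
pick_prog sh dnf
# A missing preferred binary falls through to the fallback.
pick_prog no-such-binary-xyz dnf
```

The plugin applied the same idea once at load time (`dnfprog=dnf; command -v dnf5 > /dev/null && dnfprog=dnf5`) and reused the result in every alias, rather than re-detecting per command.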
ca0acf31ef163fd2a6bf562c9b9d6077975b41f8
|
2023-01-13 19:12:11
|
Carlo Sala
|
fix(emacs): make `alternate-editor` work for emacs >28 Closes #11441
| false
| 4
| 4
| 8
|
--- plugins/emacs/README.md
@@ -25,6 +25,6 @@ The plugin uses a custom launcher (which we'll call here `$EMACS_LAUNCHER`) that
| e | `emacs` | Same as emacs alias |
| te | `$EMACS_LAUNCHER -nw` | Open terminal emacsclient |
| eeval | `$EMACS_LAUNCHER --eval` | Same as `M-x eval` but from outside Emacs |
-| eframe | `emacsclient --alternate-editor="" --create-frame` | Create new X frame |
+| eframe | `emacsclient --alternate-editor "" --create-frame` | Create new X frame |
| efile | - | Print the path to the file open in the current buffer |
| ecd | - | Print the directory of the file open in the the current buffer |
--- plugins/emacs/emacs.plugin.zsh
@@ -32,7 +32,7 @@ alias te="$EMACS_PLUGIN_LAUNCHER -nw"
# same than M-x eval but from outside Emacs.
alias eeval="$EMACS_PLUGIN_LAUNCHER --eval"
# create a new X frame
-alias eframe='emacsclient --alternate-editor="" --create-frame'
+alias eframe='emacsclient --alternate-editor "" --create-frame'
# Emacs ANSI Term tracking
if [[ -n "$INSIDE_EMACS" ]]; then
--- plugins/emacs/emacsclient.sh
@@ -15,11 +15,11 @@ emacsfun() {
# Only create another X frame if there isn't one present
if [ -z "$frames" -o "$frames" = nil ]; then
- emacsclient --alternate-editor="" --create-frame "$@"
+ emacsclient --alternate-editor "" --create-frame "$@"
return $?
fi
- emacsclient --alternate-editor="" "$@"
+ emacsclient --alternate-editor "" "$@"
}
# Adapted from https://github.com/davidshepherd7/emacs-read-stdin/blob/master/emacs-read-stdin.sh
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
BUG_FIX
|
Obvious
|
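The emacs fix above replaces `--alternate-editor=""` with `--alternate-editor ""`. The two spellings reach the program as different argument vectors — one word versus two — which sketches why an option parser can treat them differently. `count_args` is a hypothetical helper used only to show the word count:

```shell
#!/bin/sh
# Print how many arguments the caller passed.
count_args() {
  printf '%s\n' "$#"
}

count_args --alternate-editor=""  # one word: --alternate-editor=
count_args --alternate-editor ""  # two words: --alternate-editor and ""
```

In the `=` form the empty value is glued onto the option name as a single `argv` entry; in the separated form the empty string survives as its own entry, which is what newer emacsclient versions reportedly require here.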
3697f9e4c8103163f3892a68d5d4b14cc6e13557
|
2024-02-22 22:14:54
|
Sudhakar Verma
|
cfei: don't generate if frame size is zero (#5588) ## Description
This small change fixes #5580 as described - only needed 1 additional
change and small test fix
Ping @xunilrj
## Checklist
- [x] I have linked to any relevant issues.
- [ ] I have commented my code, particularly in hard-to-understand
areas.
- [ ] I have updated the documentation where relevant (API docs, the
reference, and the Sway book).
- [ ] I have added tests that prove my fix is effective or that my
feature works.
- [x] I have added (or requested a maintainer to add) the necessary
`Breaking*` or `New Feature` labels where relevant.
- [x] I have done my best to ensure that my PR adheres to [the Fuel Labs
Code Review
Standards](https://github.com/FuelLabs/rfcs/blob/master/text/code-standards/external-contributors.md).
- [x] I have requested a review from the relevant team or maintainers.
| false
| 24
| 12
| 36
|
--- sway-core/src/asm_generation/fuel/allocated_abstract_instruction_set.rs
@@ -419,14 +419,6 @@ impl AllocatedAbstractInstructionSet {
}
}
- // cfei 0 and cfsi 0 are omitted from asm emission, don't count them for offsets
- Either::Left(AllocatedOpcode::CFEI(ref op))
- | Either::Left(AllocatedOpcode::CFSI(ref op))
- if op.value == 0 =>
- {
- 0
- }
-
// Another special case for the blob opcode, used for testing.
Either::Left(AllocatedOpcode::BLOB(ref count)) => count.value as u64,
--- sway-core/src/asm_lang/allocated_ops.rs
@@ -594,9 +594,7 @@ impl AllocatedOp {
/* Memory Instructions */
ALOC(a) => op::ALOC::new(a.to_reg_id()).into(),
- CFEI(a) if a.value == 0 => return Either::Left(vec![]),
CFEI(a) => op::CFEI::new(a.value.into()).into(),
- CFSI(a) if a.value == 0 => return Either::Left(vec![]),
CFSI(a) => op::CFSI::new(a.value.into()).into(),
CFE(a) => op::CFE::new(a.to_reg_id()).into(),
CFS(a) => op::CFS::new(a.to_reg_id()).into(),
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/array_of_structs_caller/src/main.sw
@@ -4,7 +4,7 @@ use array_of_structs_abi::{Id, TestContract, Wrapper};
use std::hash::*;
fn main() -> u64 {
- let addr = abi(TestContract, 0xe2a4f86301f8b57ff2c93ce68366669fc2f0926dccd26f9f6550b049cb324a2c);
+ let addr = abi(TestContract, 0xbd1e3ad7022f6c170c6fb3643a1a0c4ad0f666a5a1d735b11255dbfff74e5a05);
let input = [Wrapper {
id: Id {
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/asset_ops_test/src/main.sw
@@ -14,7 +14,7 @@ fn main() -> bool {
let fuelcoin_asset_id = AssetId::new(fuelcoin_id, DEFAULT_SUB_ID);
// contract ID for sway/test/src/e2e_vm_tests/test_programs/should_pass/test_contracts/balance_test_contract/
- let balance_test_id = ContractId::from(0xe50966cd6b1da8fe006e3e876e08f3df6948ce426e1a7cfe49fba411b0a11f89);
+ let balance_test_id = ContractId::from(0x4a00baa517980432b9274a0e2f03c88735bdb483730816679c6eb37b4046d060);
// todo: use correct type ContractId
let fuel_coin = abi(TestFuelCoin, fuelcoin_id.into());
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/bal_opcode/src/main.sw
@@ -5,7 +5,7 @@ use balance_test_abi::BalanceTest;
fn main() -> bool {
// @todo switch to using ContractId when abi signature changes.
- let balance_test_contract_id = 0xe50966cd6b1da8fe006e3e876e08f3df6948ce426e1a7cfe49fba411b0a11f89;
+ let balance_test_contract_id = 0x4a00baa517980432b9274a0e2f03c88735bdb483730816679c6eb37b4046d060;
let balance_test_contract = abi(BalanceTest, balance_test_contract_id);
let number = balance_test_contract.get_42 {
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/call_abi_with_tuples/src/main.sw
@@ -3,7 +3,7 @@ script;
use abi_with_tuples::*;
fn main() -> bool {
- let the_abi = abi(MyContract, 0x1200d031e9c10f8d9bd9dd556a98a0c88e74a4da991047556f78b1bcc1be2ab6);
+ let the_abi = abi(MyContract, 0xe507ae21649fbd2b48ccda116687d2ff164b190c09d33d9d480981323af16be7);
let param1 = (
Person {
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/call_basic_storage/src/main.sw
@@ -2,7 +2,7 @@ script;
use basic_storage_abi::{BasicStorage, Quad};
fn main() -> u64 {
- let addr = abi(BasicStorage, 0xa4174c9ff114dc3a99eee9d8f43e417276852a6ba41b8ea469b54385a6596db4);
+ let addr = abi(BasicStorage, 0x68c0e1ebcddb900439182bf0673a4dde93c02f8c14072305b55f1dd4d1470def);
let key = 0x0fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff;
let value = 4242;
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/call_storage_enum/src/main.sw
@@ -3,7 +3,7 @@ script;
use storage_enum_abi::*;
fn main() -> u64 {
- let contract_id = 0x4c01b41e6f7fc88c88a7799c43d9f695e22ee01eed90478b99fe3bfa935e3e07;
+ let contract_id = 0x21ec4784feb8a4feda42fd1ccfb6c2496d42e03ff54f88be00602086491e1f7b;
let caller = abi(StorageEnum, contract_id);
let res = caller.read_write_enums();
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/caller_auth_test/src/main.sw
@@ -4,7 +4,7 @@ use auth_testing_abi::AuthTesting;
// should be false in the case of a script
fn main() -> bool {
- let caller = abi(AuthTesting, 0x66d9f99ddeeff7d1c6d3b986afd5d20029860289cb74c64e30c255730966d24f);
+ let caller = abi(AuthTesting, 0x10f04ba40bd185d6e2e326a9f8be6d1c1f96b7a021faecea1bd46fc4b5cce885);
let result = caller.returns_gm_one();
assert(result);
result
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/caller_context_test/src/main.sw
@@ -6,7 +6,7 @@ use context_testing_abi::*;
fn main() -> bool {
let gas: u64 = u64::max();
let amount: u64 = 11;
- let other_contract_id = ContractId::from(0x65dae4fedb02e2d70cdb56e2b82d23a2baa69a6acdbf01cc1271c7c1a1abe2cc);
+ let other_contract_id = ContractId::from(0xa38576787f8900d66e6620548b6da8142b8bb4d129b2338609acd121ca126c10);
let other_contract_id_b256: b256 = other_contract_id.into();
let base_asset_id = BASE_ASSET_ID;
--- test/src/e2e_vm_tests/test_programs/should_pass/require_contract_deployment/nested_struct_args_caller/src/main.sw
@@ -3,7 +3,7 @@ script;
use nested_struct_args_abi::*;
fn main() -> bool {
- let contract_id = 0x0fd8fed83ef774a35708706495b49f93254cc5ded343c3bd4416a70c8eb47e01;
+ let contract_id = 0xfa4bb608c7de0db473862926816eb23d17469ec2ef08685aab3c4ddd1892f9a8;
let caller = abi(NestedStructArgs, contract_id);
let param_one = StructOne {
--- test/src/e2e_vm_tests/test_programs/should_pass/unit_tests/workspace_test/contract_multi_test/src/main.sw
@@ -17,7 +17,7 @@ fn test_foo() {
#[test(should_revert)]
fn test_fail() {
- let contract_id = 0xd1cc94578ce1595d8f350cc3ea743fbf769e93ac1b7dc1731a28563109368e0a;
+ let contract_id = 0xa5cd13d5d8ceaa436905f361853ba278f6760da2af5061ec86fe09b8a0cf59b4;
let caller = abi(MyContract, contract_id);
let result = caller.test_function {}();
assert(result == false)
@@ -25,7 +25,7 @@ fn test_fail() {
#[test]
fn test_success() {
- let contract_id = 0xd1cc94578ce1595d8f350cc3ea743fbf769e93ac1b7dc1731a28563109368e0a;
+ let contract_id = 0xa5cd13d5d8ceaa436905f361853ba278f6760da2af5061ec86fe09b8a0cf59b4;
let caller = abi(MyContract, contract_id);
let result = caller.test_function {}();
assert(result == true)
--- test/src/ir_generation/tests/empty.sw
@@ -10,8 +10,6 @@ fn main() {
// ::check-asm::
// The data section setup:
-// check: move $$$$locbase $$sp
-// check: cfei i0
// check: ret $$zero
-// nextln: .data
+// nextln: .data:
// not: data_
|
sway
|
fuellabs
|
Rust
|
Rust
| 62,435
| 5,382
|
🌴 Empowering everyone to build reliable and efficient smart contracts.
|
fuellabs_sway
|
BUG_FIX
|
obvious
|
018a57d7bbbbb81a97f3d64b017dd38715af72f5
|
2025-01-31 14:26:05
|
Netdata bot
|
Regenerate integrations docs (#19541) Co-authored-by: ilyam8 <[email protected]>
| false
| 154
| 0
| 154
|
--- src/health/notifications/smseagle/README.md
@@ -1,154 +0,0 @@
-<!--startmeta
-custom_edit_url: "https://github.com/netdata/netdata/edit/master/src/health/notifications/smseagle/README.md"
-meta_yaml: "https://github.com/netdata/netdata/edit/master/src/health/notifications/smseagle/metadata.yaml"
-sidebar_label: "SMSEagle"
-learn_status: "Published"
-learn_rel_path: "Alerts & Notifications/Notifications/Agent Dispatched Notifications"
-message: "DO NOT EDIT THIS FILE DIRECTLY, IT IS GENERATED BY THE NOTIFICATION'S metadata.yaml FILE"
-endmeta-->
-
-# SMSEagle
-
-
-<img src="https://netdata.cloud/img/smseagle.svg" width="150"/>
-
-
-Forward notifications to SMSEagle device to send SMS, MMS, wake-up, or text-to-speech calls.
-
-
-
-<img src="https://img.shields.io/badge/maintained%20by-Netdata-%2300ab44" />
-
-## Setup
-
-### Prerequisites
-
-####
-
-Before using the API, you'll need to enable API access on your SMSEagle device by following these steps:
-
-1. Navigate to the Web-GUI and select the "Users" menu.
-2. Create a new user account with "User" access level.
-3. Locate the "Access to API" option next to your newly created user.
-4. Select APIv2 and click the "Generate new token" button to create your API access token.
-5. Set up the appropriate permissions in the APIv2 Permission section.
-
-Optional: Enable the "Access to resources of all users" checkbox if you want this API key to access data across all users. By default, the API key can only access data created under its credentials.
-
-
-
-### Configuration
-
-#### File
-
-The configuration file name for this integration is `health_alarm_notify.conf`.
-
-
-You can edit the configuration file using the [`edit-config`](https://github.com/netdata/netdata/blob/master/docs/netdata-agent/configuration/README.md#edit-a-configuration-file-using-edit-config) script from the
-Netdata [config directory](https://github.com/netdata/netdata/blob/master/docs/netdata-agent/configuration/README.md#the-netdata-config-directory).
-
-```bash
-cd /etc/netdata 2>/dev/null || cd /opt/netdata/etc/netdata
-sudo ./edit-config health_alarm_notify.conf
-```
-#### Options
-
-The following options can be defined for this notification
-
-<details open><summary>Config Options</summary>
-
-| Name | Description | Default | Required |
-|:----|:-----------|:-------|:--------:|
-| DEFAULT_RECIPIENT_SMSEAGLE | If a role's recipients are not configured, a notification will be sent to this SMS recipient (empty = do not send a notification for unconfigured roles). Multiple recipients can be given like this: "PHONE1,PHONE2..." | | yes |
-| SMSEAGLE_API_URL | | | yes |
-| SMSEAGLE_API_ACCESSTOKEN | | | yes |
-| SMSEAGLE_MSG_TYPE | | sms | yes |
-| SMSEAGLE_CALL_DURATION | | 10 | yes |
-| SMSEAGLE_VOICE_ID | | 10 | yes |
-
-##### DEFAULT_RECIPIENT_SMSEAGLE
-
-All roles will default to this variable if left unconfigured.
-
-You can then have different recipients per role, by editing `DEFAULT_RECIPIENT_SMSEAGLE` with the number you want, in the following entries at the bottom of the same file:
-```
-role_recipients_smseagle[sysadmin]="+11222333444"
-role_recipients_smseagle[domainadmin]="+11222333445"
-role_recipients_smseagle[dba]="+11222333446"
-role_recipients_smseagle[webmaster]="+11222333447"
-role_recipients_smseagle[proxyadmin]="+11222333448"
-role_recipients_smseagle[sitemgr]="+11222333449"
-```
-
-
-##### SMSEAGLE_API_URL
-
-The url of the SMSEagle device accessible from NetData, e.g https://192.168.0.101
-
-
-##### SMSEAGLE_API_ACCESSTOKEN
-
-An access token for the user created at SMSEagle device
-
-
-##### SMSEAGLE_MSG_TYPE
-
-Choose a type of message/call. Available types: sms, mms, ring (wake-up call), tts (text-to-speech call), tts_advanced (multilanguage text-to-speech call). Be aware that some types require additional parameters to be set.
-
-
-##### SMSEAGLE_CALL_DURATION
-
-Call duration, parameter required for Ring, TTS and TTS Advanced.
-
-
-##### SMSEAGLE_VOICE_ID
-
-ID of the voice model, required for TTS Advanced.
-
-
-</details>
-
-#### Examples
-
-##### Basic Configuration
-
-
-
-```yaml
-#------------------------------------------------------------------------------
-# SMSEagle options
-
-SEND_SMSEAGLE="YES"
-SMSEAGLE_API_URL="XXXXXXXX"
-SMSEAGLE_API_ACCESSTOKEN="XXXXXXX"
-SMSEAGLE_MSG_TYPE="sms"
-SMSEAGLE_CALL_DURATION="10"
-SMSEAGLE_VOICE_ID="1"
-DEFAULT_RECIPIENT_SMSEAGLE="+11222333444"
-
-```
-
-
-## Troubleshooting
-
-### Test Notification
-
-You can run the following command by hand, to test alerts configuration:
-
-```bash
-# become user netdata
-sudo su -s /bin/bash netdata
-
-# enable debugging info on the console
-export NETDATA_ALARM_NOTIFY_DEBUG=1
-
-# send test alarms to sysadmin
-/usr/libexec/netdata/plugins.d/alarm-notify.sh test
-
-# send test alarms to any role
-/usr/libexec/netdata/plugins.d/alarm-notify.sh test "ROLE"
-```
-
-Note that this will test _all_ alert mechanisms for the selected role.
-
-
|
netdata
|
netdata
|
C
|
C
| 73,681
| 6,023
|
X-Ray Vision for your infrastructure!
|
netdata_netdata
|
DOC_CHANGE
|
changes in readme
|
179213f61c704d2c3a5eec7d3c1ffd6d3b1f0154
| null |
Clayton Coleman
|
local-up-cluster: terminate all processes on SIGINT
| false
| 1
| 1
| 0
|
--- local-up-cluster.sh
@@ -147,6 +147,6 @@ cleanup()
exit 0
}
-trap cleanup EXIT
+trap cleanup EXIT SIGINT
while true; do read x; done
|
kubernetes_kubernetes.json
| null | null | null | null | null | null |
kubernetes_kubernetes.json
|
NEW_FEAT
|
5, obvious
|
db17929375470a85584b8aab02ef68edf2616bf9
|
2025-03-21 17:02:59
|
Pierre-Emmanuel Patry
|
gccrs: Use a reference wrapper to please GCC 4.8 gcc/rust/ChangeLog: * backend/rust-compile-expr.cc (CompileExpr::visit): Change call. (CompileExpr::resolve_operator_overload): Update function arguments. * backend/rust-compile-expr.h: Change the function's prototype to use a reference wrapper instead of a reference within the optional. Signed-off-by: Pierre-Emmanuel Patry <[email protected]>
| false
| 11
| 10
| 21
|
--- gcc/rust/backend/rust-compile-expr.cc
@@ -31,7 +31,6 @@
#include "convert.h"
#include "print-tree.h"
#include "rust-system.h"
-#include <functional>
namespace Rust {
namespace Compile {
@@ -153,9 +152,8 @@ CompileExpr::visit (HIR::ArithmeticOrLogicalExpr &expr)
{
auto lang_item_type
= LangItem::OperatorToLangItem (expr.get_expr_type ());
- translated = resolve_operator_overload (
- lang_item_type, expr, lhs, rhs, expr.get_lhs (),
- tl::optional<std::reference_wrapper<HIR::Expr>> (expr.get_rhs ()));
+ translated = resolve_operator_overload (lang_item_type, expr, lhs, rhs,
+ expr.get_lhs (), expr.get_rhs ());
return;
}
@@ -1478,9 +1476,10 @@ CompileExpr::get_receiver_from_dyn (const TyTy::DynamicObjectType *dyn,
}
tree
-CompileExpr::resolve_operator_overload (
- LangItem::Kind lang_item_type, HIR::OperatorExprMeta expr, tree lhs, tree rhs,
- HIR::Expr &lhs_expr, tl::optional<std::reference_wrapper<HIR::Expr>> rhs_expr)
+CompileExpr::resolve_operator_overload (LangItem::Kind lang_item_type,
+ HIR::OperatorExprMeta expr, tree lhs,
+ tree rhs, HIR::Expr &lhs_expr,
+ tl::optional<HIR::Expr &> rhs_expr)
{
TyTy::FnType *fntype;
bool is_op_overload = ctx->get_tyctx ()->lookup_operator_overload (
--- gcc/rust/backend/rust-compile-expr.h
@@ -96,10 +96,10 @@ protected:
TyTy::BaseType *receiver, TyTy::FnType *fntype,
tree receiver_ref, location_t expr_locus);
- tree resolve_operator_overload (
- LangItem::Kind lang_item_type, HIR::OperatorExprMeta expr, tree lhs,
- tree rhs, HIR::Expr &lhs_expr,
- tl::optional<std::reference_wrapper<HIR::Expr>> rhs_expr);
+ tree resolve_operator_overload (LangItem::Kind lang_item_type,
+ HIR::OperatorExprMeta expr, tree lhs,
+ tree rhs, HIR::Expr &lhs_expr,
+ tl::optional<HIR::Expr &> rhs_expr);
tree compile_bool_literal (const HIR::LiteralExpr &expr,
const TyTy::BaseType *tyty);
|
gcc
|
gcc-mirror
|
C
|
C
| null | null |
Compiler
|
gcc-mirror_gcc
|
CODE_IMPROVEMENT
|
Non-functional code changes to improve readability, migration etc.
|
0f39267a197e1e6ba08dfffa2d9df774fe25c6fe
|
2024-01-27 16:37:10
|
Jakub Klimek
|
refactor: #2366 Change h2 database usage to in mem (#2776) * fix: Change h2 database usage to in mem (#2366)
* #2366 Add delay option
| false
| 37
| 13
| 50
|
--- dao/src/main/java/com/iluwatar/dao/App.java
@@ -44,7 +44,7 @@ import org.h2.jdbcx.JdbcDataSource;
*/
@Slf4j
public class App {
- private static final String DB_URL = "jdbc:h2:mem:dao;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/dao";
private static final String ALL_CUSTOMERS = "customerDao.getAllCustomers(): ";
/**
--- dao/src/test/java/com/iluwatar/dao/DbCustomerDaoTest.java
@@ -49,7 +49,7 @@ import org.mockito.Mockito;
*/
class DbCustomerDaoTest {
- private static final String DB_URL = "jdbc:h2:mem:dao;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/dao";
private DbCustomerDao dao;
private final Customer existingCustomer = new Customer(1, "Freddy", "Krueger");
--- domain-model/src/main/java/com/iluwatar/domainmodel/App.java
@@ -49,7 +49,7 @@ import org.joda.money.Money;
*/
public class App {
- public static final String H2_DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ public static final String H2_DB_URL = "jdbc:h2:~/test";
public static final String CREATE_SCHEMA_SQL =
"CREATE TABLE CUSTOMERS (name varchar primary key, money decimal);"
--- layers/src/main/resources/application.properties
@@ -2,7 +2,7 @@
spring.main.web-application-type=none
#datasource settings
-spring.datasource.url=jdbc:h2:mem:databases-cake
+spring.datasource.url=jdbc:h2:~/databases/cake
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=sa
--- repository/src/main/java/com/iluwatar/repository/AppConfig.java
@@ -54,7 +54,7 @@ public class AppConfig {
public DataSource dataSource() {
var basicDataSource = new BasicDataSource();
basicDataSource.setDriverClassName("org.h2.Driver");
- basicDataSource.setUrl("jdbc:h2:mem:databases-person");
+ basicDataSource.setUrl("jdbc:h2:~/databases/person");
basicDataSource.setUsername("sa");
basicDataSource.setPassword("sa");
return basicDataSource;
--- repository/src/main/resources/applicationContext.xml
@@ -30,7 +30,7 @@
</bean>
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName" value="org.h2.Driver" />
- <property name="url" value="jdbc:h2:mem:databases-person;DB_CLOSE_DELAY=-1" />
+ <property name="url" value="jdbc:h2:~/databases/person" />
<property name="username" value="sa" />
<property name="password" value="sa" />
</bean>
--- serialized-entity/README.md
@@ -136,7 +136,7 @@ methods to read and deserialize data items to `Country` objects.
```java
@Slf4j
public class App {
- private static final String DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/test";
private App() {
--- serialized-entity/src/main/java/com/iluwatar/serializedentity/App.java
@@ -48,7 +48,7 @@ import org.h2.jdbcx.JdbcDataSource;
*/
@Slf4j
public class App {
- private static final String DB_URL = "jdbc:h2:mem:testdb";
+ private static final String DB_URL = "jdbc:h2:~/test";
private App() {
--- serialized-entity/src/test/java/com/iluwatar/serializedentity/AppTest.java
@@ -1,27 +1,3 @@
-/*
- * This project is licensed under the MIT license. Module model-view-viewmodel is using ZK framework licensed under LGPL (see lgpl-3.0.txt).
- *
- * The MIT License
- * Copyright © 2014-2022 Ilkka Seppälä
- *
- * Permission is hereby granted, free of charge, to any person obtaining a copy
- * of this software and associated documentation files (the "Software"), to deal
- * in the Software without restriction, including without limitation the rights
- * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- * copies of the Software, and to permit persons to whom the Software is
- * furnished to do so, subject to the following conditions:
- *
- * The above copyright notice and this permission notice shall be included in
- * all copies or substantial portions of the Software.
- *
- * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
- * THE SOFTWARE.
- */
package com.iluwatar.serializedentity;
import org.junit.jupiter.api.Test;
--- table-module/src/main/java/com/iluwatar/tablemodule/App.java
@@ -45,7 +45,7 @@ import org.h2.jdbcx.JdbcDataSource;
*/
@Slf4j
public final class App {
- private static final String DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/test";
/**
* Private constructor.
--- table-module/src/test/java/com/iluwatar/tablemodule/UserTableModuleTest.java
@@ -36,7 +36,7 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
class UserTableModuleTest {
- private static final String DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/test";
private static DataSource createDataSource() {
var dataSource = new JdbcDataSource();
--- transaction-script/src/main/java/com/iluwatar/transactionscript/App.java
@@ -46,7 +46,7 @@ import org.slf4j.LoggerFactory;
*/
public class App {
- private static final String H2_DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String H2_DB_URL = "jdbc:h2:~/test";
private static final Logger LOGGER = LoggerFactory.getLogger(App.class);
/**
--- transaction-script/src/test/java/com/iluwatar/transactionscript/HotelDaoImplTest.java
@@ -49,7 +49,7 @@ import org.mockito.Mockito;
*/
class HotelDaoImplTest {
- private static final String DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String DB_URL = "jdbc:h2:~/test";
private HotelDaoImpl dao;
private Room existingRoom = new Room(1, "Single", 50, false);
--- transaction-script/src/test/java/com/iluwatar/transactionscript/HotelTest.java
@@ -39,7 +39,7 @@ import org.junit.jupiter.api.Test;
*/
class HotelTest {
- private static final String H2_DB_URL = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1";
+ private static final String H2_DB_URL = "jdbc:h2:~/test";
private Hotel hotel;
private HotelDaoImpl dao;
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
CODE_IMPROVEMENT
|
obvious
|
c18858d63314c93c085e3f95b47d2b26d74690d1
|
2025-03-12 01:50:58
|
Nikita Shulga
|
[MPS] Make `torch.mps.compile_shader` public (#148972) It was a private method in 2.6, but nothin changes in its API for 2.7 and it will likely remain the same in 2.8, so time to remove underscore from its name Fixes #ISSUE_NUMBER Pull Request resolved: https://github.com/pytorch/pytorch/pull/148972 Approved by: https://github.com/Skylion007, https://github.com/atalman, https://github.com/seemethere, https://github.com/albanD, https://github.com/dcci
| false
| 13
| 11
| 24
|
--- docs/source/mps.rst
@@ -18,7 +18,6 @@ torch.mps
current_allocated_memory
driver_allocated_memory
recommended_max_memory
- compile_shader
MPS Profiler
------------
--- test/test_mps.py
@@ -12795,7 +12795,7 @@ class TestCommon(TestCase):
class TestMetalLibrary(TestCaseMPS):
def test_metal_arange(self):
x = torch.zeros(12, device="mps", dtype=torch.half)
- lib = torch.mps.compile_shader("""
+ lib = torch.mps._compile_shader("""
kernel void arange(device half* x, uint idx [[thread_position_in_grid]]) {
x[idx] = idx;
}
@@ -12807,7 +12807,7 @@ class TestMetalLibrary(TestCaseMPS):
x = torch.empty(12, device="mps")
y = torch.empty_like(x)
z = torch.empty_like(x)
- lib = torch.mps.compile_shader("""
+ lib = torch.mps._compile_shader("""
kernel void arange_x(device float* x, uint3 idx [[thread_position_in_grid]]) {
x[idx.x + idx.y + idx.z] = idx.x;
}
@@ -12834,7 +12834,7 @@ class TestMetalLibrary(TestCaseMPS):
def test_metal_arange_with_arg(self, start=3.14, step=.5):
x = torch.zeros(12, device="mps")
- lib = torch.mps.compile_shader("""
+ lib = torch.mps._compile_shader("""
kernel void arange(device float* x, constant float& start, constant float& step,
uint idx [[thread_position_in_grid]]) {
x[idx] = start + idx * step;
@@ -12849,7 +12849,7 @@ class TestMetalLibrary(TestCaseMPS):
def test_metal_arange_with_arg_and_cast(self):
x = torch.zeros(12, device="mps", dtype=torch.half)
y = torch.zeros(12, device="mps", dtype=torch.half)
- lib = torch.mps.compile_shader("""
+ lib = torch.mps._compile_shader("""
kernel void arange_all_half(device half* x, constant half2& start_step,
uint idx [[thread_position_in_grid]]) {
x[idx] = start_step.x + idx * start_step.y;
@@ -12867,10 +12867,10 @@ class TestMetalLibrary(TestCaseMPS):
def test_metal_error_checking(self):
# Syntax error asserts
- self.assertRaises(SyntaxError, lambda: torch.mps.compile_shader("Syntax error"))
+ self.assertRaises(SyntaxError, lambda: torch.mps._compile_shader("Syntax error"))
cpu_tensor = torch.rand(3)
mps_tensor = torch.rand(3, device="mps")
- lib = torch.mps.compile_shader("kernel void full(device half* x) { x[0] = 1.0; }")
+ lib = torch.mps._compile_shader("kernel void full(device half* x) { x[0] = 1.0; }")
# Passing CPU tensor asserts
self.assertRaises(RuntimeError, lambda: lib.full(cpu_tensor))
# Passing invalid shader name asserts
@@ -12885,12 +12885,12 @@ class TestMetalLibrary(TestCaseMPS):
def test_metal_include(self):
# Checks that includes embedding works
- lib = torch.mps.compile_shader("#include <c10/metal/special_math.h>")
+ lib = torch.mps._compile_shader("#include <c10/metal/special_math.h>")
self.assertIsNotNone(lib)
@unittest.skipIf(not torch.mps.profiler.is_metal_capture_enabled(), "Set MTL_CAPTURE_ENABLED and try again")
def test_metal_capture(self):
- lib = torch.mps.compile_shader("kernel void full(device float* x, uint idx [[thread_position_in_grid]]) { x[idx] = 1.0; }")
+ lib = torch.mps._compile_shader("kernel void full(device float* x, uint idx [[thread_position_in_grid]]) { x[idx] = 1.0; }")
mps_tensor = torch.rand(32, device="mps")
capture_name = f"lib_full{''.join(random.choice('0123456789') for i in range(5))}"
capture_dirname = f"0000-{capture_name}.gputrace"
--- torch/_inductor/runtime/runtime_utils.py
@@ -176,6 +176,6 @@ def compile_mps_shader(source: str) -> Any:
Compiles shader source but raise more actionable error message when needed
"""
try:
- return torch.mps.compile_shader(source)
+ return torch.mps._compile_shader(source)
except SyntaxError as err:
raise SyntaxError(f"failed to compile {source} with {err.msg}") from err
--- torch/mps/__init__.py
@@ -140,13 +140,13 @@ def recommended_max_memory() -> int:
return torch._C._mps_recommendedMaxMemory()
-def compile_shader(source: str):
+def _compile_shader(source: str):
r"""Compiles compute shader from source and allows one to invoke kernels
defined there from the comfort of Python runtime
Example::
>>> # xdoctest: +REQUIRES(env:TORCH_DOCTEST_MPS)
- >>> lib = torch.mps.compile_shader(
+ >>> lib = torch.mps._compile_shader(
... "kernel void full(device float* out, constant float& val, uint idx [[thread_position_in_grid]]) { out[idx] = val; }"
... )
>>> x = torch.zeros(16, device="mps")
@@ -175,7 +175,6 @@ from .event import Event
__all__ = [
- "compile_shader",
"device_count",
"get_rng_state",
"manual_seed",
|
pytorch
| null |
python
|
Python
| null | null |
Tensors and Dynamic neural networks in Python with strong GPU acceleration
|
_pytorch
|
CODE_IMPROVEMENT
|
Code change: type annotation added
|
29581229b1c4624f92c7193a7c51c08c4d5bf1f1
|
2025-04-03 23:30:32
|
Garrick Aden-Buie
|
feat(ContentToolResult): `error` can be a string or an error condition (#421) * feat(ContentToolResult): `error` can be a string or an error condition * docs: Add news bullet * refactor: Apply code review suggestion
| false
| 70
| 14
| 84
|
--- NEWS.md
@@ -1,8 +1,5 @@
# ellmer (development version)
-* `ContentToolResult` objects now include the error condition in the `error`
- property when a tool call fails (#421, @gadenbuie).
-
* Several chat functions were renamed to better align with the companies
providing the API (#382, @gadenbuie):
--- R/content-tools.R
@@ -44,7 +44,7 @@ extract_tool_requests <- function(contents) {
contents[is_tool_request]
}
-# Also need to handle edge cases: https://platform.openai.com/docs/guides/function-calling/edge-cases
+# Also need to handle edge caess: https://platform.openai.com/docs/guides/function-calling/edge-cases
invoke_tool <- function(fun, arguments, id) {
if (is.null(fun)) {
return(ContentToolResult(id = id, error = "Unknown tool"))
@@ -54,7 +54,7 @@ invoke_tool <- function(fun, arguments, id) {
ContentToolResult(id, do.call(fun, arguments)),
error = function(e) {
# TODO: We need to report this somehow; it's way too hidden from the user
- ContentToolResult(id, error = e)
+ ContentToolResult(id, error = conditionMessage(e))
}
)
}
@@ -72,7 +72,7 @@ on_load(
},
error = function(e) {
# TODO: We need to report this somehow; it's way too hidden from the user
- ContentToolResult(id, error = e)
+ ContentToolResult(id, error = conditionMessage(e))
}
)
})
--- R/content.R
@@ -198,37 +198,21 @@ method(format, ContentToolRequest) <- function(x, ...) {
#' @rdname Content
#' @export
-#' @param value The results of calling the tool function, if it succeeded.
-#' @param error The error message, as a string, or the error condition thrown
-#' as a result of a failure when calling the tool function. Must be `NULL`
-#' when the tool call is successful.
+#' @param value,error Either the results of calling the function if
+#' it succeeded, otherwise the error message, as a string. One of
+#' `value` and `error` will always be `NULL`.
ContentToolResult <- new_class(
"ContentToolResult",
parent = Content,
properties = list(
id = prop_string(),
value = class_any,
- error = new_property(
- class = NULL | class_character | new_S3_class("condition"),
- default = NULL,
- validator = function(value) {
- ok <- is.null(value) || is_string(value) || inherits(value, "condition")
- if (ok) {
- return()
- }
-
- paste0(
- "must be a single string or a condition object, not ",
- obj_type_friendly(value),
- "."
- )
- }
- )
+ error = prop_string(allow_null = TRUE)
)
)
method(format, ContentToolResult) <- function(x, ...) {
if (tool_errored(x)) {
- value <- paste0(cli::col_red("Error: "), tool_error_string(x))
+ value <- paste0(cli::col_red("Error: "), x@error)
} else {
value <- x@value
}
@@ -236,12 +220,9 @@ method(format, ContentToolResult) <- function(x, ...) {
}
tool_errored <- function(x) !is.null(x@error)
-tool_error_string <- function(x) {
- if (inherits(x@error, "condition")) conditionMessage(x@error) else x@error
-}
tool_string <- function(x) {
if (tool_errored(x)) {
- paste0("Tool calling failed with error ", tool_error_string(x))
+ paste0("Tool calling failed with error ", x@error)
} else if (inherits(x@value, "AsIs")) {
x@value
} else if (inherits(x@value, "json")) {
--- man/Content.Rd
@@ -51,11 +51,9 @@ ContentPDF(type = stop("Required"), data = stop("Required"))
\item{arguments}{Named list of arguments to call the function with.}
-\item{value}{The results of calling the tool function, if it succeeded.}
-
-\item{error}{The error message, as a string, or the error condition thrown
-as a result of a failure when calling the tool function. Must be \code{NULL}
-when the tool call is successful.}
+\item{value, error}{Either the results of calling the function if
+it succeeded, otherwise the error message, as a string. One of
+\code{value} and \code{error} will always be \code{NULL}.}
\item{thinking}{The text of the thinking output.}
--- tests/testthat/_snaps/content.md
@@ -125,18 +125,3 @@
<p>A <strong>thought</strong>.</p>
</details>
-# ContentToolResult@error requires a string or an error condition
-
- Code
- ContentToolResult("id", error = TRUE)
- Condition
- Error:
- ! <ellmer::ContentToolResult> object properties are invalid:
- - @error must be <NULL>, <character>, or S3<condition>, not <logical>
- Code
- ContentToolResult("id", error = c("one", "two"))
- Condition
- Error:
- ! <ellmer::ContentToolResult> object properties are invalid:
- - @error must be a single string or a condition object, not a character vector.
-
--- tests/testthat/test-content-tools.R
@@ -3,23 +3,18 @@ test_that("invoke_tool returns a tool_result", {
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
expect_equal(res@error, NULL)
- expect_false(tool_errored(res))
expect_equal(res@value, 1)
res <- invoke_tool(function() 1, list(x = 1), id = "x")
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
- expect_s3_class(res@error, "condition")
- expect_true(tool_errored(res))
- expect_equal(tool_error_string(res), "unused argument (x = 1)")
+ expect_equal(res@error, "unused argument (x = 1)")
expect_equal(res@value, NULL)
res <- invoke_tool(NULL, list(x = 1), id = "x")
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
expect_equal(res@error, "Unknown tool")
- expect_equal(tool_error_string(res), "Unknown tool")
- expect_true(tool_errored(res))
expect_equal(res@value, NULL)
})
@@ -28,22 +23,17 @@ test_that("invoke_tool_async returns a tool_result", {
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
expect_equal(res@error, NULL)
- expect_false(tool_errored(res))
expect_equal(res@value, 1)
res <- sync(invoke_tool_async(function() 1, list(x = 1), id = "x"))
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
- expect_s3_class(res@error, "condition")
- expect_equal(tool_error_string(res), "unused argument (x = 1)")
- expect_true(tool_errored(res))
+ expect_equal(res@error, "unused argument (x = 1)")
expect_equal(res@value, NULL)
res <- sync(invoke_tool_async(NULL, list(x = 1), id = "x"))
expect_s3_class(res, "ellmer::ContentToolResult")
expect_equal(res@id, "x")
expect_equal(res@error, "Unknown tool")
- expect_equal(tool_error_string(res), "Unknown tool")
- expect_true(tool_errored(res))
expect_equal(res@value, NULL)
})
--- tests/testthat/test-content.R
@@ -60,10 +60,3 @@ test_that("thinking has useful representations", {
)
expect_snapshot(cat(contents_html(ct)))
})
-
-test_that("ContentToolResult@error requires a string or an error condition", {
- expect_snapshot(error = TRUE, {
- ContentToolResult("id", error = TRUE)
- ContentToolResult("id", error = c("one", "two"))
- })
-})
|
ellmer
|
tidyverse
|
R
|
R
| 401
| 55
|
Call LLM APIs from R
|
tidyverse_ellmer
|
NEW_FEAT
|
Obvious
|
73160869417275200be19bd37372b6218dbc5f63
| null |
Jan Scheffler
|
feat: extend husky checks (#7574)
| false
| 2
| 1
| 1
|
--- package.json
@@ -116,8 +116,9 @@
},
"husky": {
"hooks": {
+ "pre-commit": "npm run eslint",
"commit-msg": "commitlint --env HUSKY_GIT_PARAMS",
- "pre-push": "npm run ensure-pinned-deps"
+ "pre-push": "npm run tsc && npm run eslint && npm run doc && npm run ensure-pinned-deps"
}
}
}
|
puppeteer_puppeteer.json
| null | null | null | null | null | null |
puppeteer_puppeteer.json
|
NEW_FEAT
|
4, feat in commit message
|
b8ebe99bb4b6b516b4a2e01ef103fc536a0cf502
|
2022-01-10 02:47:22
|
Yuri Schimke
|
Fix indexing (#7003) * Fix indexing
* Fix tests
* Fix tests
| false
| 37
| 48
| 85
|
--- okhttp/src/commonMain/kotlin/okhttp3/internal/-HeadersCommon.kt
@@ -19,9 +19,9 @@ package okhttp3.internal
import okhttp3.Headers
-internal fun Headers.commonName(index: Int): String = namesAndValues.getOrNull(index * 2) ?: throw IndexOutOfBoundsException("name[$index]")
+internal fun Headers.commonName(index: Int): String = namesAndValues[index * 2]
-internal fun Headers.commonValue(index: Int): String = namesAndValues.getOrNull(index * 2 + 1) ?: throw IndexOutOfBoundsException("value[$index]")
+internal fun Headers.commonValue(index: Int): String = namesAndValues[index * 2 + 1]
internal fun Headers.commonValues(name: String): List<String> {
var result: MutableList<String>? = null
--- okhttp/src/commonTest/kotlin/okhttp3/HeadersTest.kt
@@ -283,36 +283,4 @@ class HeadersTest {
.build()
assertThat(headers.toString()).isEqualTo("A: a\nA: aa\na: aa\nB: bb\nC: c\n")
}
-
- @Test fun nameIndexesAreStrict() {
- val headers = Headers.headersOf("a", "b", "c", "d")
- try {
- headers.name(-1)
- fail()
- } catch (expected: IndexOutOfBoundsException) {
- }
- assertThat(headers.name(0)).isEqualTo("a")
- assertThat(headers.name(1)).isEqualTo("c")
- try {
- headers.name(2)
- fail()
- } catch (expected: IndexOutOfBoundsException) {
- }
- }
-
- @Test fun valueIndexesAreStrict() {
- val headers = Headers.headersOf("a", "b", "c", "d")
- try {
- headers.value(-1)
- fail()
- } catch (expected: IndexOutOfBoundsException) {
- }
- assertThat(headers.value(0)).isEqualTo("b")
- assertThat(headers.value(1)).isEqualTo("d")
- try {
- headers.value(2)
- fail()
- } catch (expected: IndexOutOfBoundsException) {
- }
- }
}
--- okhttp/src/jsTest/kotlin/okhttp3/HeadersJsTest.kt
@@ -22,9 +22,20 @@ import kotlin.test.Test
class HeadersJsTest {
@Test
- fun names() {
+ fun nameIndexesAreStrict() {
val headers = Headers.headersOf("a", "b", "c", "d")
+ assertThat(headers.name(-1)).isEqualTo(undefined)
+ assertThat(headers.name(0)).isEqualTo("a")
+ assertThat(headers.name(1)).isEqualTo("c")
+ assertThat(headers.name(2)).isEqualTo(undefined)
+ }
- assertThat(headers.names()).isEqualTo(setOf("a", "c"))
+ @Test
+ fun valueIndexesAreStrict() {
+ val headers = Headers.headersOf("a", "b", "c", "d")
+ assertThat(headers.value(-1)).isEqualTo(undefined)
+ assertThat(headers.value(0)).isEqualTo("b")
+ assertThat(headers.value(1)).isEqualTo("d")
+ assertThat(headers.value(2)).isEqualTo(undefined)
}
}
--- okhttp/src/jvmTest/java/okhttp3/HeadersJvmTest.kt
@@ -182,4 +182,36 @@ class HeadersJvmTest {
assertThat(headerMap["cache-control"]!!.size).isEqualTo(2)
assertThat(headerMap["Cache-Control"]!!.size).isEqualTo(2)
}
+
+ @Test fun nameIndexesAreStrict() {
+ val headers = Headers.headersOf("a", "b", "c", "d")
+ try {
+ headers.name(-1)
+ fail()
+ } catch (expected: IndexOutOfBoundsException) {
+ }
+ assertThat(headers.name(0)).isEqualTo("a")
+ assertThat(headers.name(1)).isEqualTo("c")
+ try {
+ headers.name(2)
+ fail()
+ } catch (expected: IndexOutOfBoundsException) {
+ }
+ }
+
+ @Test fun valueIndexesAreStrict() {
+ val headers = Headers.headersOf("a", "b", "c", "d")
+ try {
+ headers.value(-1)
+ fail()
+ } catch (expected: IndexOutOfBoundsException) {
+ }
+ assertThat(headers.value(0)).isEqualTo("b")
+ assertThat(headers.value(1)).isEqualTo("d")
+ try {
+ headers.value(2)
+ fail()
+ } catch (expected: IndexOutOfBoundsException) {
+ }
+ }
}
--- okhttp/src/nonJvmMain/kotlin/okhttp3/Headers.kt
@@ -50,7 +50,7 @@ actual class Headers internal actual constructor(
actual fun value(index: Int): String = commonValue(index)
actual fun names(): Set<String> {
- return (0 until size).map { name(it) }.distinctBy { it.lowercase() }.toSet()
+ return (0 until size step 2).map { name(it) }.distinctBy { it.lowercase() }.toSet()
}
actual fun values(name: String): List<String> = commonValues(name)
|
okhttp
|
square
|
Kotlin
|
Kotlin
| 46,179
| 9,194
|
Square’s meticulous HTTP client for the JVM, Android, and GraalVM.
|
square_okhttp
|
BUG_FIX
|
obvious
|
04ae1a551726dd6e0047a942b459d18e1dcb1935
|
2024-11-01 10:49:19
|
Nathan Whitaker
|
fix(cli): set `npm_config_user_agent` when running npm packages or tasks (#26639) Fixes #25342.
Still not sure on the exact user agent to set (should it include
`node`?).
After this PR, here's the state of running some `create-*` packages
(just ones I could think of off the top of my head):
| package | prints/runs/suggests deno install | notes |
| ---------------- | ------------- | ------ |
| `create-next-app` | ❌ | falls back to npm, needs a PR
([code](https://github.com/vercel/next.js/blob/c32e2802097c03fd9f95b1dae228d6e0257569c0/packages/create-next-app/helpers/get-pkg-manager.ts#L3))
| `sv create` | ❌ | uses `package-manager-detector`, needs a PR
([code](https://github.com/antfu-collective/package-manager-detector/tree/main))
| `create-qwik` | ✅ | runs `deno install` but suggests `deno start`
which doesn't work (should be `deno task start` or `deno run start`)
| `create-astro` | ✅ | runs `deno install` but suggests `npm run dev`
later in output, probably needs a PR
| `nuxi init` | ❌ | deno not an option in dialog, needs a PR
([code](https://github.com/nuxt/cli/blob/f04e2e894446f597da9d971b7eb03191d5a0da7e/src/commands/init.ts#L96-L102))
| `create-react-app` | ❌ | uses npm
| `ng new` (`@angular/cli`) | ❌ | uses npm
| `create-vite` | ✅ | suggests working deno commands 🎉
| `create-solid` | ❌ | suggests npm commands, needs PR
It's possible that fixing `package-manager-detector` or other packages
might make some of these just work, but haven't looked too carefully at
each
| false
| 123
| 1
| 124
|
--- cli/npm/mod.rs
@@ -189,15 +189,3 @@ impl NpmFetchResolver {
info
}
}
-
-pub const NPM_CONFIG_USER_AGENT_ENV_VAR: &str = "npm_config_user_agent";
-
-pub fn get_npm_config_user_agent() -> String {
- format!(
- "deno/{} npm/? deno/{} {} {}",
- env!("CARGO_PKG_VERSION"),
- env!("CARGO_PKG_VERSION"),
- std::env::consts::OS,
- std::env::consts::ARCH
- )
-}
--- cli/task_runner.rs
@@ -155,12 +155,6 @@ fn prepare_env_vars(
initial_cwd.to_string_lossy().to_string(),
);
}
- if !env_vars.contains_key(crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR) {
- env_vars.insert(
- crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR.into(),
- crate::npm::get_npm_config_user_agent(),
- );
- }
if let Some(node_modules_dir) = node_modules_dir {
prepend_to_path(
&mut env_vars,
@@ -210,7 +204,7 @@ impl ShellCommand for NpmCommand {
mut context: ShellCommandContext,
) -> LocalBoxFuture<'static, ExecuteResult> {
if context.args.first().map(|s| s.as_str()) == Some("run")
- && context.args.len() >= 2
+ && context.args.len() > 2
// for now, don't run any npm scripts that have a flag because
// we don't handle stuff like `--workspaces` properly
&& !context.args.iter().any(|s| s.starts_with('-'))
@@ -273,12 +267,10 @@ impl ShellCommand for NodeCommand {
)
.execute(context);
}
-
args.extend(["run", "-A"].into_iter().map(|s| s.to_string()));
args.extend(context.args.iter().cloned());
let mut state = context.state;
-
state.apply_env_var(USE_PKG_JSON_HIDDEN_ENV_VAR_NAME, "1");
ExecutableCommand::new("deno".to_string(), std::env::current_exe().unwrap())
.execute(ShellCommandContext {
--- cli/tools/run/mod.rs
@@ -30,16 +30,6 @@ To grant permissions, set them before the script argument. For example:
}
}
-fn set_npm_user_agent() {
- static ONCE: std::sync::Once = std::sync::Once::new();
- ONCE.call_once(|| {
- std::env::set_var(
- crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR,
- crate::npm::get_npm_config_user_agent(),
- );
- });
-}
-
pub async fn run_script(
mode: WorkerExecutionMode,
flags: Arc<Flags>,
@@ -68,10 +58,6 @@ pub async fn run_script(
let main_module = cli_options.resolve_main_module()?;
- if main_module.scheme() == "npm" {
- set_npm_user_agent();
- }
-
maybe_npm_install(&factory).await?;
let worker_factory = factory.create_cli_main_worker_factory().await?;
@@ -133,10 +119,6 @@ async fn run_with_watch(
let cli_options = factory.cli_options()?;
let main_module = cli_options.resolve_main_module()?;
- if main_module.scheme() == "npm" {
- set_npm_user_agent();
- }
-
maybe_npm_install(&factory).await?;
let _ = watcher_communicator.watch_paths(cli_options.watch_paths());
--- tests/registry/npm/@denotest/print-npm-user-agent/1.0.0/index.js
@@ -1,2 +0,0 @@
-#!/usr/bin/env node
-console.log(`npm_config_user_agent: ${process.env["npm_config_user_agent"]}`);
\ No newline at end of file
--- tests/registry/npm/@denotest/print-npm-user-agent/1.0.0/package.json
@@ -1,10 +0,0 @@
-{
- "name": "@denotest/print-npm-user-agent",
- "version": "1.0.0",
- "bin": {
- "print-npm-user-agent": "index.js"
- },
- "scripts": {
- "postinstall": "echo postinstall && node index.js && exit 1"
- }
-}
\ No newline at end of file
--- tests/specs/npm/user_agent_env_var/__test__.jsonc
@@ -1,46 +0,0 @@
-{
- "tempDir": true,
- "tests": {
- "set_for_npm_package": {
- "steps": [
- {
- "args": "install",
- "output": "[WILDCARD]"
- },
- {
- "args": "run -A npm:@denotest/print-npm-user-agent",
- "output": "run.out"
- }
- ]
- },
- "unset_for_local_file": {
- "steps": [
- {
- "args": "run -A main.ts",
- "output": "Download [WILDCARD]\nnpm_config_user_agent: undefined\n"
- }
- ]
- },
- "set_for_tasks": {
- "steps": [
- {
- "args": "install",
- "output": "[WILDCARD]"
- },
- {
- "args": "task run-via-bin",
- "output": "bin_command.out"
- }
- ]
- },
- "set_for_lifecycle_scripts": {
- "steps": [
- {
- "args": "install --allow-scripts",
- "output": "postinstall.out",
- "exitCode": 1
- }
- ]
- }
- }
-}
--- tests/specs/npm/user_agent_env_var/bin_command.out
@@ -1,2 +0,0 @@
-Task run-via-bin print-npm-user-agent
-npm_config_user_agent: deno/[WILDCARD] npm/? deno/[WILDCARD] [WILDCARD] [WILDCARD]
--- tests/specs/npm/user_agent_env_var/deno.jsonc
@@ -1,3 +0,0 @@
-{
- "nodeModulesDir": "auto"
-}
--- tests/specs/npm/user_agent_env_var/main.ts
@@ -1 +0,0 @@
-console.log(`npm_config_user_agent: ${Deno.env.get("npm_config_user_agent")}`);
--- tests/specs/npm/user_agent_env_var/package.json
@@ -1,8 +0,0 @@
-{
- "scripts": {
- "run-via-bin": "print-npm-user-agent"
- },
- "dependencies": {
- "@denotest/print-npm-user-agent": "1.0.0"
- }
-}
--- tests/specs/npm/user_agent_env_var/postinstall.out
@@ -1,10 +0,0 @@
-Download http://localhost:4260/@denotest%2fprint-npm-user-agent
-Download http://localhost:4260/@denotest/print-npm-user-agent/1.0.0.tgz
-Initialize @denotest/[email protected]
-Initialize @denotest/[email protected]: running 'postinstall' script
-error: script 'postinstall' in '@denotest/[email protected]' failed with exit code 1
-stdout:
-postinstall
-npm_config_user_agent: deno/[WILDCARD] npm/? deno/[WILDCARD] [WILDCARD] [WILDCARD]
-
-error: failed to run scripts for packages: @denotest/[email protected]
--- tests/specs/npm/user_agent_env_var/run.out
@@ -1 +0,0 @@
-npm_config_user_agent: deno/[WILDCARD] npm/? deno/[WILDCARD] [WILDCARD] [WILDCARD]
--- tests/specs/npm/user_agent_env_var/test.mjs
@@ -1 +0,0 @@
-console.log(process.env.npm_config_user_agent);
|
deno
|
denoland
|
Rust
|
Rust
| 102,021
| 5,502
|
A modern runtime for JavaScript and TypeScript.
|
denoland_deno
|
BUG_FIX
|
obvious
|
a6bddbdb2b2767c6d4b0a3c3efcb3c9e63d5e1c0
|
2023-01-02 19:28:52
|
WebSnke
|
Add Lichess to Games
| false
| 1
| 0
| 1
|
--- README.md
@@ -753,7 +753,6 @@
- [CollectionNode](https://github.com/bwide/CollectionNode) - A swift framework for a collectionView in SpriteKit.
- [AssetImportKit](https://github.com/eugenebokhan/AssetImportKit) - Swifty cross platform library (macOS, iOS) that converts Assimp supported models to SceneKit scenes.
- [glide engine](https://github.com/cocoatoucher/Glide) - SpriteKit and GameplayKit based engine for making 2d games, with practical examples and tutorials.
-- [Lichess mobile](https://github.com/lichess-org/lichobile) - A mobile client for lichess.org.
- [SwiftFortuneWheel](https://github.com/sh-khashimov/SwiftFortuneWheel) - A cross-platform framework for games like a Wheel of Fortune.
## GCD
|
awesome-ios
|
vsouza
|
Swift
|
Swift
| 48,363
| 6,877
|
A curated list of awesome iOS ecosystem, including Objective-C and Swift Projects
|
vsouza_awesome-ios
|
CONFIG_CHANGE
|
Very small changes
|
3b19cdba2a090772b2e886dbfbf712992fafe0cd
|
2024-08-19 22:08:53
|
Daniel Hiltgen
|
Remove Jetpack
| false
| 0
| 42
| 42
|
--- Dockerfile
@@ -5,6 +5,9 @@ ARG CUDA_V11_ARCHITECTURES="50;52;53;60;61;62;70;72;75;80;86"
ARG CUDA_VERSION_12=12.4.0
ARG CUDA_V12_ARCHITECTURES="60;61;62;70;72;75;80;86;87;89;90;90a"
ARG ROCM_VERSION=6.1.2
+ARG JETPACK_6=r36.2.0
+ARG JETPACK_5=r35.4.1
+ARG JETPACK_4=r32.7.1
# Copy the minimal context we need to run the generate scripts
FROM scratch AS llm-code
@@ -81,6 +84,39 @@ RUN --mount=type=cache,target=/root/.ccache \
OLLAMA_CUSTOM_CUDA_DEFS="-DGGML_CUDA_USE_GRAPHS=on" \
bash gen_linux.sh
+FROM --platform=linux/arm64 nvcr.io/nvidia/l4t-jetpack:${JETPACK_6} AS cuda-build-jetpack6-arm64
+ARG CMAKE_VERSION
+RUN apt-get update && apt-get install -y git curl && \
+ curl -s -L https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION}/cmake-${CMAKE_VERSION}-linux-$(uname -m).tar.gz | tar -zx -C /usr --strip-components 1
+COPY --from=llm-code / /go/src/github.com/ollama/ollama/
+WORKDIR /go/src/github.com/ollama/ollama/llm/generate
+ARG CGO_CFLAGS
+ENV GOARCH arm64
+ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
+RUN --mount=type=cache,target=/root/.ccache \
+ OLLAMA_SKIP_STATIC_GENERATE=1 \
+ OLLAMA_SKIP_CPU_GENERATE=1 \
+ CUDA_VARIANT="_jetpack6" \
+ CUDA_DIST_DIR="/go/src/github.com/ollama/ollama/dist/linux-arm64/ollama_libs/cuda_jetpack6" \
+ CMAKE_CUDA_ARCHITECTURES="87" \
+ bash gen_linux.sh
+
+FROM --platform=linux/arm64 nvcr.io/nvidia/l4t-jetpack:${JETPACK_5} AS cuda-build-jetpack5-arm64
+ARG CMAKE_VERSION
+RUN apt-get update && apt-get install -y git curl && \
+ curl -s -L https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION}/cmake-${CMAKE_VERSION}-linux-$(uname -m).tar.gz | tar -zx -C /usr --strip-components 1
+COPY --from=llm-code / /go/src/github.com/ollama/ollama/
+WORKDIR /go/src/github.com/ollama/ollama/llm/generate
+ARG CGO_CFLAGS
+ENV GOARCH arm64
+ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
+RUN --mount=type=cache,target=/root/.ccache \
+ OLLAMA_SKIP_STATIC_GENERATE=1 \
+ OLLAMA_SKIP_CPU_GENERATE=1 \
+ CUDA_VARIANT="_jetpack5" \
+ CUDA_DIST_DIR="/go/src/github.com/ollama/ollama/dist/linux-arm64/ollama_libs/cuda_jetpack5" \
+ CMAKE_CUDA_ARCHITECTURES="72;87" \
+ bash gen_linux.sh
FROM --platform=linux/amd64 rocm/dev-centos-7:${ROCM_VERSION}-complete AS rocm-build-amd64
ARG CMAKE_VERSION
@@ -173,6 +209,12 @@ COPY --from=cuda-11-build-server-arm64 /go/src/github.com/ollama/ollama/dist/ di
COPY --from=cuda-11-build-server-arm64 /go/src/github.com/ollama/ollama/llm/build/linux/ llm/build/linux/
COPY --from=cuda-12-build-server-arm64 /go/src/github.com/ollama/ollama/dist/ dist/
COPY --from=cuda-12-build-server-arm64 /go/src/github.com/ollama/ollama/llm/build/linux/ llm/build/linux/
+## arm binary += 381M
+COPY --from=cuda-build-jetpack6-arm64 /go/src/github.com/ollama/ollama/llm/build/linux/ llm/build/linux/
+COPY --from=cuda-build-jetpack6-arm64 /go/src/github.com/ollama/ollama/dist/ dist/
+## arm binary += 330M
+COPY --from=cuda-build-jetpack5-arm64 /go/src/github.com/ollama/ollama/llm/build/linux/ llm/build/linux/
+COPY --from=cuda-build-jetpack5-arm64 /go/src/github.com/ollama/ollama/dist/ dist/
ARG GOFLAGS
ARG CGO_CFLAGS
RUN --mount=type=cache,target=/root/.ccache \
|
ollama
|
ollama
|
Go
|
Go
| 131,099
| 10,753
|
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models.
|
ollama_ollama
|
CONFIG_CHANGE
|
Obvious
|
005e4a96490919c52ed7fb54d0926a2eab75cc1a
|
2024-08-02 19:02:39
|
Lucas Nogueira
|
chore: promote API package to RC
| false
| 7
| 1
| 8
|
--- tooling/api/CHANGELOG.md
@@ -1,11 +1,5 @@
# Changelog
-## \[2.0.0-rc.0]
-
-### Changes
-
-- Promoted to RC!
-
## \[2.0.0-beta.16]
### New Features
--- tooling/api/package.json
@@ -1,6 +1,6 @@
{
"name": "@tauri-apps/api",
- "version": "2.0.0-rc.0",
+ "version": "2.0.0-beta.16",
"description": "Tauri API definitions",
"funding": {
"type": "opencollective",
|
tauri
|
tauri-apps
|
Rust
|
Rust
| 90,101
| 2,752
|
Build smaller, faster, and more secure desktop and mobile applications with a web frontend.
|
tauri-apps_tauri
|
BUG_FIX
|
correcting display behavior under Wayland
|
89e254c1412bfcf590fc63935c03e1c1d34bfd14
|
2022-01-17 14:46:23
|
Matheus Felipe
|
Fix test title of check_auth
| false
| 1
| 1
| 2
|
--- scripts/tests/test_validate_format.py
@@ -208,7 +208,7 @@ class TestValidadeFormat(unittest.TestCase):
self.assertIsInstance(err_msg, str)
self.assertEqual(err_msg, expected_err_msg)
- def test_check_auth_with_valid_auth(self):
+ def test_check_auth_with_correct_auth(self):
auth_valid = [f'`{auth}`' for auth in auth_keys if auth != 'No']
auth_valid.append('No')
|
public-apis
|
public-apis
|
Python
|
Python
| 329,015
| 34,881
|
A collective list of free APIs
|
public-apis_public-apis
|
BUG_FIX
|
obvious
|
fdb77da8d92e8be5ee5859da352f4d776093ec4c
| null |
Angelos Chalaris
|
Update prettyBytes.md
| false
| 1
| 2
| -1
|
--- prettyBytes.md
@@ -8,8 +8,7 @@ Return the prettified string by building it up, taking into account the supplied
negative or not.
```js
-const prettyBytes = (num, options) => {
- options = { precision: 3, addSpace: true, ...options };
+const prettyBytes = (num, precision = 3, addSpace = true) => {
const UNITS = ['B','KB','MB','GB','TB','PB','EB','ZB','YB'];
if (num < 1) return (num < 0 ? '-' : '') + num + ' B';
const exponent = Math.min(Math.floor(Math.log10(num < 0 ? -num : num) / 3), UNITS.length - 1);
|
Chalarangelo_30-seconds-of-code.json
| null | null | null | null | null | null |
Chalarangelo_30-seconds-of-code.json
|
CONFIG_CHANGE
|
5, obvious
|
dc39e673e2dff26a3eb3f02e38c2065807ebefe1
|
2025-03-25 07:29:57
|
Scott Wolchok
|
Remove aten.elu core ATen decomp because it is now core ATen (#149780) Per @larryliu0820. Pull Request resolved: https://github.com/pytorch/pytorch/pull/149780 Approved by: https://github.com/larryliu0820
| false
| 3
| 2
| 5
|
--- test/expect/HasDecompTest.test_aten_core_operators.expect
@@ -193,8 +193,6 @@ aten::div_.Scalar
aten::div_.Scalar_mode
aten::div_.Tensor
aten::div_.Tensor_mode
-aten::elu
-aten::elu.out
aten::embedding
aten::embedding.out
aten::empty_strided
--- test/export/test_export.py
@@ -13465,6 +13465,7 @@ class TestExportCustomClass(TorchTestCase):
)
decomp_table = default_decompositions()
+ del decomp_table[torch.ops.aten.elu.default]
ep = ep.run_decompositions(
decomp_table=decomp_table,
--- torch/_decomp/__init__.py
@@ -332,6 +332,7 @@ def _core_aten_decompositions_post_autograd() -> dict[
aten.diagonal_copy,
aten.dot,
aten.vdot,
+ aten.elu,
aten.elu_,
aten.elu_backward,
aten._embedding_bag,
--- torch/_inductor/decomposition.py
@@ -61,7 +61,6 @@ inductor_decompositions = get_decompositions(
aten.bitwise_or_,
aten.clamp_min_,
aten.dist,
- aten.elu,
aten.empty_like,
aten.flip,
aten.gelu,
|
pytorch
| null |
python
|
Python
| null | null |
Tensors and Dynamic neural networks in Python with strong GPU acceleration
|
_pytorch
|
CODE_IMPROVEMENT
|
Non-functional code changes to improve readability, migration etc.
|
92fd99bda11ce831e1b33bdb9c42e0f4fb02dd69
|
2025-01-16 03:29:28
|
Dmitry Werner
|
KAFKA-18479: Remove keepPartitionMetadataFile in UnifiedLog and LogMan… (#18491) Reviewers: Jun Rao <[email protected]>
| false
| 38
| 123
| 161
|
--- core/src/main/java/kafka/server/builders/LogManagerBuilder.java
@@ -56,6 +56,7 @@ public class LogManagerBuilder {
private BrokerTopicStats brokerTopicStats = null;
private LogDirFailureChannel logDirFailureChannel = null;
private Time time = Time.SYSTEM;
+ private boolean keepPartitionMetadataFile = true;
private boolean remoteStorageSystemEnable = false;
private long initialTaskDelayMs = ServerLogConfigs.LOG_INITIAL_TASK_DELAY_MS_DEFAULT;
@@ -144,6 +145,11 @@ public class LogManagerBuilder {
return this;
}
+ public LogManagerBuilder setKeepPartitionMetadataFile(boolean keepPartitionMetadataFile) {
+ this.keepPartitionMetadataFile = keepPartitionMetadataFile;
+ return this;
+ }
+
public LogManagerBuilder setRemoteStorageSystemEnable(boolean remoteStorageSystemEnable) {
this.remoteStorageSystemEnable = remoteStorageSystemEnable;
return this;
@@ -180,6 +186,7 @@ public class LogManagerBuilder {
brokerTopicStats,
logDirFailureChannel,
time,
+ keepPartitionMetadataFile,
remoteStorageSystemEnable,
initialTaskDelayMs);
}
--- core/src/main/scala/kafka/log/LogManager.scala
@@ -78,6 +78,7 @@ class LogManager(logDirs: Seq[File],
brokerTopicStats: BrokerTopicStats,
logDirFailureChannel: LogDirFailureChannel,
time: Time,
+ val keepPartitionMetadataFile: Boolean,
remoteStorageSystemEnable: Boolean,
val initialTaskDelayMs: Long) extends Logging {
@@ -345,6 +346,7 @@ class LogManager(logDirs: Seq[File],
logDirFailureChannel = logDirFailureChannel,
lastShutdownClean = hadCleanShutdown,
topicId = None,
+ keepPartitionMetadataFile = keepPartitionMetadataFile,
numRemainingSegments = numRemainingSegments,
remoteStorageSystemEnable = remoteStorageSystemEnable)
@@ -1072,6 +1074,7 @@ class LogManager(logDirs: Seq[File],
brokerTopicStats = brokerTopicStats,
logDirFailureChannel = logDirFailureChannel,
topicId = topicId,
+ keepPartitionMetadataFile = keepPartitionMetadataFile,
remoteStorageSystemEnable = remoteStorageSystemEnable)
if (isFuture)
@@ -1549,7 +1552,8 @@ object LogManager {
kafkaScheduler: Scheduler,
time: Time,
brokerTopicStats: BrokerTopicStats,
- logDirFailureChannel: LogDirFailureChannel): LogManager = {
+ logDirFailureChannel: LogDirFailureChannel,
+ keepPartitionMetadataFile: Boolean): LogManager = {
val defaultProps = config.extractLogConfigMap
LogConfig.validateBrokerLogConfigValues(defaultProps, config.remoteLogManagerConfig.isRemoteStorageSystemEnabled())
@@ -1574,6 +1578,7 @@ object LogManager {
brokerTopicStats = brokerTopicStats,
logDirFailureChannel = logDirFailureChannel,
time = time,
+ keepPartitionMetadataFile = keepPartitionMetadataFile,
interBrokerProtocolVersion = config.interBrokerProtocolVersion,
remoteStorageSystemEnable = config.remoteLogManagerConfig.isRemoteStorageSystemEnabled(),
initialTaskDelayMs = config.logInitialTaskDelayMs)
--- core/src/main/scala/kafka/log/UnifiedLog.scala
@@ -85,6 +85,13 @@ import scala.jdk.OptionConverters.{RichOption, RichOptional, RichOptionalInt}
* @param _topicId optional Uuid to specify the topic ID for the topic if it exists. Should only be specified when
* first creating the log through Partition.makeLeader or Partition.makeFollower. When reloading a log,
* this field will be populated by reading the topic ID value from partition.metadata if it exists.
+ * @param keepPartitionMetadataFile boolean flag to indicate whether the partition.metadata file should be kept in the
+ * log directory. A partition.metadata file is only created when the raft controller is used
+ * or the ZK controller and this broker's inter-broker protocol version is at least 2.8.
+ * This file will persist the topic ID on the broker. If inter-broker protocol for a ZK controller
+ * is downgraded below 2.8, a topic ID may be lost and a new ID generated upon re-upgrade.
+ * If the inter-broker protocol version on a ZK cluster is below 2.8, partition.metadata
+ * will be deleted to avoid ID conflicts upon re-upgrade.
* @param remoteStorageSystemEnable flag to indicate whether the system level remote log storage is enabled or not.
*/
@threadsafe
@@ -95,6 +102,7 @@ class UnifiedLog(@volatile var logStartOffset: Long,
@volatile var leaderEpochCache: LeaderEpochFileCache,
val producerStateManager: ProducerStateManager,
@volatile private var _topicId: Option[Uuid],
+ val keepPartitionMetadataFile: Boolean,
val remoteStorageSystemEnable: Boolean = false,
@volatile private var logOffsetsListener: LogOffsetsListener = LogOffsetsListener.NO_OP_OFFSETS_LISTENER) extends Logging with AutoCloseable {
@@ -182,26 +190,40 @@ class UnifiedLog(@volatile var logStartOffset: Long,
/**
* Initialize topic ID information for the log by maintaining the partition metadata file and setting the in-memory _topicId.
+ * Delete partition metadata file if the version does not support topic IDs.
* Set _topicId based on a few scenarios:
- * - Recover topic ID if present. Ensure we do not try to assign a provided topicId that is inconsistent
+ * - Recover topic ID if present and topic IDs are supported. Ensure we do not try to assign a provided topicId that is inconsistent
* with the ID on file.
- * - If we were provided a topic ID when creating the log and one does not yet exist
+ * - If we were provided a topic ID when creating the log, partition metadata files are supported, and one does not yet exist
* set _topicId and write to the partition metadata file.
+ * - Otherwise set _topicId to None
*/
private def initializeTopicId(): Unit = {
val partMetadataFile = partitionMetadataFile.getOrElse(
throw new KafkaException("The partitionMetadataFile should have been initialized"))
if (partMetadataFile.exists()) {
- val fileTopicId = partMetadataFile.read().topicId
- if (_topicId.isDefined && !_topicId.contains(fileTopicId))
- throw new InconsistentTopicIdException(s"Tried to assign topic ID $topicId to log for topic partition $topicPartition," +
- s"but log already contained topic ID $fileTopicId")
+ if (keepPartitionMetadataFile) {
+ val fileTopicId = partMetadataFile.read().topicId
+ if (_topicId.isDefined && !_topicId.contains(fileTopicId))
+ throw new InconsistentTopicIdException(s"Tried to assign topic ID $topicId to log for topic partition $topicPartition," +
+ s"but log already contained topic ID $fileTopicId")
- _topicId = Some(fileTopicId)
- } else {
+ _topicId = Some(fileTopicId)
+
+ } else {
+ try partMetadataFile.delete()
+ catch {
+ case e: IOException =>
+ error(s"Error while trying to delete partition metadata file $partMetadataFile", e)
+ }
+ }
+ } else if (keepPartitionMetadataFile) {
_topicId.foreach(partMetadataFile.record)
scheduler.scheduleOnce("flush-metadata-file", () => maybeFlushMetadataFile())
+ } else {
+ // We want to keep the file and the in-memory topic ID in sync.
+ _topicId = None
}
}
@@ -471,15 +493,17 @@ class UnifiedLog(@volatile var logStartOffset: Long,
}
case None =>
- _topicId = Some(topicId)
- partitionMetadataFile match {
- case Some(partMetadataFile) =>
- if (!partMetadataFile.exists()) {
- partMetadataFile.record(topicId)
- scheduler.scheduleOnce("flush-metadata-file", () => maybeFlushMetadataFile())
- }
- case _ => warn(s"The topic id $topicId will not be persisted to the partition metadata file " +
- "since the partition is deleted")
+ if (keepPartitionMetadataFile) {
+ _topicId = Some(topicId)
+ partitionMetadataFile match {
+ case Some(partMetadataFile) =>
+ if (!partMetadataFile.exists()) {
+ partMetadataFile.record(topicId)
+ scheduler.scheduleOnce("flush-metadata-file", () => maybeFlushMetadataFile())
+ }
+ case _ => warn(s"The topic id $topicId will not be persisted to the partition metadata file " +
+ "since the partition is deleted")
+ }
}
}
}
@@ -1965,6 +1989,7 @@ object UnifiedLog extends Logging {
logDirFailureChannel: LogDirFailureChannel,
lastShutdownClean: Boolean = true,
topicId: Option[Uuid],
+ keepPartitionMetadataFile: Boolean,
numRemainingSegments: ConcurrentMap[String, Integer] = new ConcurrentHashMap[String, Integer],
remoteStorageSystemEnable: Boolean = false,
logOffsetsListener: LogOffsetsListener = LogOffsetsListener.NO_OP_OFFSETS_LISTENER): UnifiedLog = {
@@ -2009,6 +2034,7 @@ object UnifiedLog extends Logging {
leaderEpochCache,
producerStateManager,
topicId,
+ keepPartitionMetadataFile,
remoteStorageSystemEnable,
logOffsetsListener)
}
--- core/src/main/scala/kafka/raft/KafkaMetadataLog.scala
@@ -620,7 +620,8 @@ object KafkaMetadataLog extends Logging {
producerIdExpirationCheckIntervalMs = Int.MaxValue,
logDirFailureChannel = new LogDirFailureChannel(5),
lastShutdownClean = false,
- topicId = Some(topicId)
+ topicId = Some(topicId),
+ keepPartitionMetadataFile = true
)
val metadataLog = new KafkaMetadataLog(
--- core/src/main/scala/kafka/server/BrokerServer.scala
@@ -216,7 +216,8 @@ class BrokerServer(
kafkaScheduler,
time,
brokerTopicStats,
- logDirFailureChannel)
+ logDirFailureChannel,
+ keepPartitionMetadataFile = true)
remoteLogManagerOpt = createRemoteLogManager()
--- core/src/test/scala/unit/kafka/cluster/PartitionLockTest.scala
@@ -452,7 +452,8 @@ class PartitionLockTest extends Logging {
log.producerIdExpirationCheckIntervalMs,
leaderEpochCache,
producerStateManager,
- _topicId = None) {
+ _topicId = None,
+ keepPartitionMetadataFile = true) {
override def appendAsLeader(records: MemoryRecords, leaderEpoch: Int, origin: AppendOrigin,
requestLocal: RequestLocal, verificationGuard: VerificationGuard): LogAppendInfo = {
--- core/src/test/scala/unit/kafka/cluster/PartitionTest.scala
@@ -3622,7 +3622,8 @@ class PartitionTest extends AbstractPartitionTest {
log.producerIdExpirationCheckIntervalMs,
leaderEpochCache,
producerStateManager,
- _topicId = None) {
+ _topicId = None,
+ keepPartitionMetadataFile = true) {
override def appendAsFollower(records: MemoryRecords): LogAppendInfo = {
appendSemaphore.acquire()
--- core/src/test/scala/unit/kafka/log/AbstractLogCleanerIntegrationTest.scala
@@ -117,7 +117,8 @@ abstract class AbstractLogCleanerIntegrationTest {
producerStateManagerConfig = new ProducerStateManagerConfig(TransactionLogConfig.PRODUCER_ID_EXPIRATION_MS_DEFAULT, false),
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None)
+ topicId = None,
+ keepPartitionMetadataFile = true)
logMap.put(partition, log)
this.logs += log
}
--- core/src/test/scala/unit/kafka/log/BrokerCompressionTest.scala
@@ -69,7 +69,8 @@ class BrokerCompressionTest {
producerStateManagerConfig = new ProducerStateManagerConfig(TransactionLogConfig.PRODUCER_ID_EXPIRATION_MS_DEFAULT, false),
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None
+ topicId = None,
+ keepPartitionMetadataFile = true
)
/* append two messages */
--- core/src/test/scala/unit/kafka/log/LogCleanerManagerTest.scala
@@ -133,7 +133,7 @@ class LogCleanerManagerTest extends Logging {
// the exception should be caught and the partition that caused it marked as uncleanable
class LogMock extends UnifiedLog(offsets.logStartOffset, localLog, new BrokerTopicStats,
producerIdExpirationCheckIntervalMs, leaderEpochCache,
- producerStateManager, _topicId = None) {
+ producerStateManager, _topicId = None, keepPartitionMetadataFile = true) {
// Throw an error in getFirstBatchTimestampForSegments since it is called in grabFilthiestLog()
override def getFirstBatchTimestampForSegments(segments: util.Collection[LogSegment]): util.Collection[java.lang.Long] =
throw new IllegalStateException("Error!")
@@ -821,7 +821,8 @@ class LogCleanerManagerTest extends Logging {
producerStateManagerConfig = producerStateManagerConfig,
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None)
+ topicId = None,
+ keepPartitionMetadataFile = true)
}
private def createLowRetentionLogConfig(segmentSize: Int, cleanupPolicy: String): LogConfig = {
@@ -874,7 +875,8 @@ class LogCleanerManagerTest extends Logging {
producerStateManagerConfig = producerStateManagerConfig,
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None
+ topicId = None,
+ keepPartitionMetadataFile = true
)
}
--- core/src/test/scala/unit/kafka/log/LogCleanerTest.scala
@@ -216,7 +216,8 @@ class LogCleanerTest extends Logging {
producerIdExpirationCheckIntervalMs = producerIdExpirationCheckIntervalMs,
leaderEpochCache = leaderEpochCache,
producerStateManager = producerStateManager,
- _topicId = None) {
+ _topicId = None,
+ keepPartitionMetadataFile = true) {
override def replaceSegments(newSegments: Seq[LogSegment], oldSegments: Seq[LogSegment]): Unit = {
deleteStartLatch.countDown()
if (!deleteCompleteLatch.await(5000, TimeUnit.MILLISECONDS)) {
@@ -2092,7 +2093,8 @@ class LogCleanerTest extends Logging {
producerStateManagerConfig = producerStateManagerConfig,
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None
+ topicId = None,
+ keepPartitionMetadataFile = true
)
}
--- core/src/test/scala/unit/kafka/log/LogConcurrencyTest.scala
@@ -156,7 +156,8 @@ class LogConcurrencyTest {
producerStateManagerConfig = new ProducerStateManagerConfig(TransactionLogConfig.PRODUCER_ID_EXPIRATION_MS_DEFAULT, false),
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None
+ topicId = None,
+ keepPartitionMetadataFile = true
)
}
--- core/src/test/scala/unit/kafka/log/LogLoaderTest.scala
@@ -127,6 +127,7 @@ class LogLoaderTest {
brokerTopicStats = new BrokerTopicStats(),
logDirFailureChannel = logDirFailureChannel,
time = time,
+ keepPartitionMetadataFile = true,
remoteStorageSystemEnable = config.remoteLogManagerConfig.isRemoteStorageSystemEnabled(),
initialTaskDelayMs = config.logInitialTaskDelayMs) {
@@ -323,7 +324,7 @@ class LogLoaderTest {
logDirFailureChannel)
new UnifiedLog(offsets.logStartOffset, localLog, brokerTopicStats,
producerIdExpirationCheckIntervalMs, leaderEpochCache, producerStateManager,
- None)
+ None, keepPartitionMetadataFile = true)
}
// Retain snapshots for the last 2 segments
@@ -446,7 +447,8 @@ class LogLoaderTest {
producerIdExpirationCheckIntervalMs = 30000,
leaderEpochCache = leaderEpochCache,
producerStateManager = stateManager,
- _topicId = None)
+ _topicId = None,
+ keepPartitionMetadataFile = true)
verify(stateManager).updateMapEndOffset(0L)
verify(stateManager).removeStraySnapshots(any())
@@ -555,7 +557,8 @@ class LogLoaderTest {
producerIdExpirationCheckIntervalMs = 30000,
leaderEpochCache = leaderEpochCache,
producerStateManager = stateManager,
- _topicId = None)
+ _topicId = None,
+ keepPartitionMetadataFile = true)
verify(stateManager).removeStraySnapshots(any[java.util.List[java.lang.Long]])
verify(stateManager, times(2)).updateMapEndOffset(0L)
--- core/src/test/scala/unit/kafka/log/LogManagerTest.scala
@@ -975,6 +975,7 @@ class LogManagerTest {
// not clean shutdown
lastShutdownClean = false,
topicId = None,
+ keepPartitionMetadataFile = false,
// pass mock map for verification later
numRemainingSegments = mockMap)
@@ -1382,6 +1383,7 @@ class LogManagerTest {
time = Time.SYSTEM,
brokerTopicStats = new BrokerTopicStats,
logDirFailureChannel = new LogDirFailureChannel(1),
+ keepPartitionMetadataFile = true,
interBrokerProtocolVersion = MetadataVersion.latestTesting,
remoteStorageSystemEnable = false,
initialTaskDelayMs = 0)
--- core/src/test/scala/unit/kafka/log/LogTestUtils.scala
@@ -103,6 +103,7 @@ object LogTestUtils {
producerIdExpirationCheckIntervalMs: Int = TransactionLogConfig.PRODUCER_ID_EXPIRATION_MS_DEFAULT,
lastShutdownClean: Boolean = true,
topicId: Option[Uuid] = None,
+ keepPartitionMetadataFile: Boolean = true,
numRemainingSegments: ConcurrentMap[String, Integer] = new ConcurrentHashMap[String, Integer],
remoteStorageSystemEnable: Boolean = false,
remoteLogManager: Option[RemoteLogManager] = None,
@@ -121,6 +122,7 @@ object LogTestUtils {
logDirFailureChannel = new LogDirFailureChannel(10),
lastShutdownClean = lastShutdownClean,
topicId = topicId,
+ keepPartitionMetadataFile = keepPartitionMetadataFile,
numRemainingSegments = numRemainingSegments,
remoteStorageSystemEnable = remoteStorageSystemEnable,
logOffsetsListener = logOffsetsListener
--- core/src/test/scala/unit/kafka/log/UnifiedLogTest.scala
@@ -1964,6 +1964,26 @@ class UnifiedLogTest {
log.close()
}
+ @Test
+ def testNoOpWhenKeepPartitionMetadataFileIsFalse(): Unit = {
+ val logConfig = LogTestUtils.createLogConfig()
+ val log = createLog(logDir, logConfig, keepPartitionMetadataFile = false)
+
+ val topicId = Uuid.randomUuid()
+ log.assignTopicId(topicId)
+ // We should not write to this file or set the topic ID
+ assertFalse(log.partitionMetadataFile.get.exists())
+ assertEquals(None, log.topicId)
+ log.close()
+
+ val log2 = createLog(logDir, logConfig, topicId = Some(Uuid.randomUuid()), keepPartitionMetadataFile = false)
+
+ // We should not write to this file or set the topic ID
+ assertFalse(log2.partitionMetadataFile.get.exists())
+ assertEquals(None, log2.topicId)
+ log2.close()
+ }
+
@Test
def testLogFailsWhenInconsistentTopicIdSet(): Unit = {
val logConfig = LogTestUtils.createLogConfig()
@@ -4479,12 +4499,13 @@ class UnifiedLogTest {
producerIdExpirationCheckIntervalMs: Int = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
lastShutdownClean: Boolean = true,
topicId: Option[Uuid] = None,
+ keepPartitionMetadataFile: Boolean = true,
remoteStorageSystemEnable: Boolean = false,
remoteLogManager: Option[RemoteLogManager] = None,
logOffsetsListener: LogOffsetsListener = LogOffsetsListener.NO_OP_OFFSETS_LISTENER): UnifiedLog = {
val log = LogTestUtils.createLog(dir, config, brokerTopicStats, scheduler, time, logStartOffset, recoveryPoint,
maxTransactionTimeoutMs, producerStateManagerConfig, producerIdExpirationCheckIntervalMs,
- lastShutdownClean, topicId, new ConcurrentHashMap[String, Integer],
+ lastShutdownClean, topicId, keepPartitionMetadataFile, new ConcurrentHashMap[String, Integer],
remoteStorageSystemEnable, remoteLogManager, logOffsetsListener)
logsToClose = logsToClose :+ log
log
--- core/src/test/scala/unit/kafka/server/ReplicaManagerTest.scala
@@ -2929,7 +2929,8 @@ class ReplicaManagerTest {
producerIdExpirationCheckIntervalMs = 30000,
leaderEpochCache = leaderEpochCache,
producerStateManager = producerStateManager,
- _topicId = topicId) {
+ _topicId = topicId,
+ keepPartitionMetadataFile = true) {
override def endOffsetForEpoch(leaderEpoch: Int): Option[OffsetAndEpoch] = {
assertEquals(leaderEpoch, leaderEpochFromLeader)
--- core/src/test/scala/unit/kafka/tools/DumpLogSegmentsTest.scala
@@ -100,7 +100,8 @@ class DumpLogSegmentsTest {
producerStateManagerConfig = new ProducerStateManagerConfig(TransactionLogConfig.PRODUCER_ID_EXPIRATION_MS_DEFAULT, false),
producerIdExpirationCheckIntervalMs = TransactionLogConfig.PRODUCER_ID_EXPIRATION_CHECK_INTERVAL_MS_DEFAULT,
logDirFailureChannel = new LogDirFailureChannel(10),
- topicId = None
+ topicId = None,
+ keepPartitionMetadataFile = true
)
log
}
--- core/src/test/scala/unit/kafka/utils/SchedulerTest.scala
@@ -164,7 +164,7 @@ class SchedulerTest {
localLog = localLog,
brokerTopicStats, producerIdExpirationCheckIntervalMs,
leaderEpochCache, producerStateManager,
- _topicId = None)
+ _topicId = None, keepPartitionMetadataFile = true)
assertTrue(scheduler.taskRunning(log.producerExpireCheck))
log.close()
assertFalse(scheduler.taskRunning(log.producerExpireCheck))
--- core/src/test/scala/unit/kafka/utils/TestUtils.scala
@@ -988,6 +988,7 @@ object TestUtils extends Logging {
time = time,
brokerTopicStats = new BrokerTopicStats,
logDirFailureChannel = new LogDirFailureChannel(logDirs.size),
+ keepPartitionMetadataFile = true,
interBrokerProtocolVersion = interBrokerProtocolVersion,
remoteStorageSystemEnable = remoteStorageSystemEnable,
initialTaskDelayMs = initialTaskDelayMs)
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/fetcher/ReplicaFetcherThreadBenchmark.java
@@ -146,6 +146,7 @@ public class ReplicaFetcherThreadBenchmark {
setBrokerTopicStats(brokerTopicStats).
setLogDirFailureChannel(logDirFailureChannel).
setTime(Time.SYSTEM).
+ setKeepPartitionMetadataFile(true).
build();
replicaManager = new ReplicaManagerBuilder().
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/log/StressTestLog.java
@@ -78,6 +78,7 @@ public class StressTestLog {
new LogDirFailureChannel(10),
true,
Option.empty(),
+ true,
new ConcurrentHashMap<>(),
false,
LogOffsetsListener.NO_OP_OFFSETS_LISTENER
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/log/TestLinearWriteSpeed.java
@@ -314,6 +314,7 @@ public class TestLinearWriteSpeed {
new LogDirFailureChannel(10),
true,
Option.empty(),
+ true,
new CopyOnWriteMap<>(),
false,
LogOffsetsListener.NO_OP_OFFSETS_LISTENER
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/partition/PartitionMakeFollowerBenchmark.java
@@ -115,7 +115,7 @@ public class PartitionMakeFollowerBenchmark {
setScheduler(scheduler).
setBrokerTopicStats(brokerTopicStats).
setLogDirFailureChannel(logDirFailureChannel).
- setTime(Time.SYSTEM).
+ setTime(Time.SYSTEM).setKeepPartitionMetadataFile(true).
build();
TopicPartition tp = new TopicPartition("topic", 0);
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/partition/UpdateFollowerFetchStateBenchmark.java
@@ -106,6 +106,7 @@ public class UpdateFollowerFetchStateBenchmark {
setBrokerTopicStats(brokerTopicStats).
setLogDirFailureChannel(logDirFailureChannel).
setTime(Time.SYSTEM).
+ setKeepPartitionMetadataFile(true).
build();
OffsetCheckpoints offsetCheckpoints = Mockito.mock(OffsetCheckpoints.class);
Mockito.when(offsetCheckpoints.fetch(logDir.getAbsolutePath(), topicPartition)).thenReturn(Optional.of(0L));
--- jmh-benchmarks/src/main/java/org/apache/kafka/jmh/server/PartitionCreationBench.java
@@ -141,6 +141,7 @@ public class PartitionCreationBench {
setBrokerTopicStats(brokerTopicStats).
setLogDirFailureChannel(failureChannel).
setTime(Time.SYSTEM).
+ setKeepPartitionMetadataFile(true).
build();
scheduler.startup();
this.quotaManagers = QuotaFactory.instantiate(this.brokerProperties, this.metrics, this.time, "");
|
apache-kafka
| null |
Java
|
Java
| null | null |
a distributed, open-source streaming platform designed for building real-time data pipelines and streaming applications
|
_apache-kafka
|
CODE_IMPROVEMENT
|
probably refactoring
|
e515ed23e8af7f849cad9a74e40c64c973149e6d
|
2024-11-21 20:13:51
|
David Sherret
|
fix(task): ensure root config always looks up dependencies in root (#26959) We were accidentally looking up dependencies in the member.
| false
| 202
| 67
| 269
|
--- cli/tools/task.rs
@@ -12,7 +12,6 @@ use deno_config::workspace::FolderConfigs;
use deno_config::workspace::TaskDefinition;
use deno_config::workspace::TaskOrScript;
use deno_config::workspace::WorkspaceDirectory;
-use deno_config::workspace::WorkspaceMemberTasksConfig;
use deno_config::workspace::WorkspaceTasksConfig;
use deno_core::anyhow::anyhow;
use deno_core::anyhow::bail;
@@ -271,7 +270,11 @@ impl<'a> TaskRunner<'a> {
pkg_tasks_config: &PackageTaskInfo,
) -> Result<i32, deno_core::anyhow::Error> {
match sort_tasks_topo(pkg_tasks_config) {
- Ok(sorted) => self.run_tasks_in_parallel(sorted).await,
+ Ok(sorted) => {
+ self
+ .run_tasks_in_parallel(&pkg_tasks_config.tasks_config, sorted)
+ .await
+ }
Err(err) => match err {
TaskError::NotFound(name) => {
if self.task_flags.is_run {
@@ -305,62 +308,64 @@ impl<'a> TaskRunner<'a> {
async fn run_tasks_in_parallel(
&self,
- tasks: Vec<ResolvedTask<'a>>,
+ tasks_config: &WorkspaceTasksConfig,
+ task_names: Vec<String>,
) -> Result<i32, deno_core::anyhow::Error> {
- struct PendingTasksContext<'a> {
- completed: HashSet<usize>,
- running: HashSet<usize>,
- tasks: &'a [ResolvedTask<'a>],
+ struct PendingTasksContext {
+ completed: HashSet<String>,
+ running: HashSet<String>,
+ task_names: Vec<String>,
}
- impl<'a> PendingTasksContext<'a> {
+ impl PendingTasksContext {
fn has_remaining_tasks(&self) -> bool {
- self.completed.len() < self.tasks.len()
+ self.completed.len() < self.task_names.len()
}
- fn mark_complete(&mut self, task: &ResolvedTask) {
- self.running.remove(&task.id);
- self.completed.insert(task.id);
+ fn mark_complete(&mut self, task_name: String) {
+ self.running.remove(&task_name);
+ self.completed.insert(task_name);
}
- fn get_next_task<'b>(
+ fn get_next_task<'a>(
&mut self,
- runner: &'b TaskRunner<'b>,
- ) -> Option<
- LocalBoxFuture<'b, Result<(i32, &'a ResolvedTask<'a>), AnyError>>,
- >
- where
- 'a: 'b,
- {
- for task in self.tasks.iter() {
- if self.completed.contains(&task.id)
- || self.running.contains(&task.id)
- {
+ runner: &'a TaskRunner<'a>,
+ tasks_config: &'a WorkspaceTasksConfig,
+ ) -> Option<LocalBoxFuture<'a, Result<(i32, String), AnyError>>> {
+ for name in &self.task_names {
+ if self.completed.contains(name) || self.running.contains(name) {
continue;
}
- let should_run = task
- .dependencies
- .iter()
- .all(|dep_id| self.completed.contains(dep_id));
+ let Some((folder_url, task_or_script)) = tasks_config.task(name)
+ else {
+ continue;
+ };
+ let should_run = match task_or_script {
+ TaskOrScript::Task(_, def) => def
+ .dependencies
+ .iter()
+ .all(|dep| self.completed.contains(dep)),
+ TaskOrScript::Script(_, _) => true,
+ };
+
if !should_run {
continue;
}
- self.running.insert(task.id);
+ self.running.insert(name.clone());
+ let name = name.clone();
return Some(
async move {
- match task.task_or_script {
+ match task_or_script {
TaskOrScript::Task(_, def) => {
- runner.run_deno_task(task.folder_url, task.name, def).await
+ runner.run_deno_task(folder_url, &name, def).await
}
TaskOrScript::Script(scripts, _) => {
- runner
- .run_npm_script(task.folder_url, task.name, scripts)
- .await
+ runner.run_npm_script(folder_url, &name, scripts).await
}
}
- .map(|exit_code| (exit_code, task))
+ .map(|exit_code| (exit_code, name))
}
.boxed_local(),
);
@@ -370,16 +375,16 @@ impl<'a> TaskRunner<'a> {
}
let mut context = PendingTasksContext {
- completed: HashSet::with_capacity(tasks.len()),
+ completed: HashSet::with_capacity(task_names.len()),
running: HashSet::with_capacity(self.concurrency),
- tasks: &tasks,
+ task_names,
};
let mut queue = futures_unordered::FuturesUnordered::new();
while context.has_remaining_tasks() {
while queue.len() < self.concurrency {
- if let Some(task) = context.get_next_task(self) {
+ if let Some(task) = context.get_next_task(self, tasks_config) {
queue.push(task);
} else {
break;
@@ -388,7 +393,7 @@ impl<'a> TaskRunner<'a> {
// If queue is empty at this point, then there are no more tasks in the queue.
let Some(result) = queue.next().await else {
- debug_assert_eq!(context.tasks.len(), 0);
+ debug_assert_eq!(context.task_names.len(), 0);
break;
};
@@ -516,105 +521,46 @@ enum TaskError {
TaskDepCycle { path: Vec<String> },
}
-struct ResolvedTask<'a> {
- id: usize,
- name: &'a str,
- folder_url: &'a Url,
- task_or_script: TaskOrScript<'a>,
- dependencies: Vec<usize>,
-}
-
-fn sort_tasks_topo<'a>(
- pkg_task_config: &'a PackageTaskInfo,
-) -> Result<Vec<ResolvedTask<'a>>, TaskError> {
- trait TasksConfig {
- fn task(
- &self,
- name: &str,
- ) -> Option<(&Url, TaskOrScript, &dyn TasksConfig)>;
- }
-
- impl TasksConfig for WorkspaceTasksConfig {
- fn task(
- &self,
- name: &str,
- ) -> Option<(&Url, TaskOrScript, &dyn TasksConfig)> {
- if let Some(member) = &self.member {
- if let Some((dir_url, task_or_script)) = member.task(name) {
- return Some((dir_url, task_or_script, self as &dyn TasksConfig));
- }
- }
- if let Some(root) = &self.root {
- if let Some((dir_url, task_or_script)) = root.task(name) {
- // switch to only using the root tasks for the dependencies
- return Some((dir_url, task_or_script, root as &dyn TasksConfig));
- }
- }
- None
- }
- }
-
- impl TasksConfig for WorkspaceMemberTasksConfig {
- fn task(
- &self,
- name: &str,
- ) -> Option<(&Url, TaskOrScript, &dyn TasksConfig)> {
- self.task(name).map(|(dir_url, task_or_script)| {
- (dir_url, task_or_script, self as &dyn TasksConfig)
- })
- }
- }
-
+fn sort_tasks_topo(
+ pkg_task_config: &PackageTaskInfo,
+) -> Result<Vec<String>, TaskError> {
fn sort_visit<'a>(
name: &'a str,
- sorted: &mut Vec<ResolvedTask<'a>>,
- mut path: Vec<(&'a Url, &'a str)>,
- tasks_config: &'a dyn TasksConfig,
- ) -> Result<usize, TaskError> {
- let Some((folder_url, task_or_script, tasks_config)) =
- tasks_config.task(name)
- else {
- return Err(TaskError::NotFound(name.to_string()));
- };
-
- if let Some(existing_task) = sorted
- .iter()
- .find(|task| task.name == name && task.folder_url == folder_url)
- {
- // already exists
- return Ok(existing_task.id);
+ sorted: &mut Vec<String>,
+ mut path: Vec<&'a str>,
+ tasks_config: &'a WorkspaceTasksConfig,
+ ) -> Result<(), TaskError> {
+ // Already sorted
+ if sorted.iter().any(|sorted_name| sorted_name == name) {
+ return Ok(());
}
- if path.contains(&(folder_url, name)) {
- path.push((folder_url, name));
+ // Graph has a cycle
+ if path.contains(&name) {
+ path.push(name);
return Err(TaskError::TaskDepCycle {
- path: path.iter().map(|(_, s)| s.to_string()).collect(),
+ path: path.iter().map(|s| s.to_string()).collect(),
});
}
- let mut dependencies: Vec<usize> = Vec::new();
+ let Some((_, task_or_script)) = tasks_config.task(name) else {
+ return Err(TaskError::NotFound(name.to_string()));
+ };
+
if let TaskOrScript::Task(_, task) = task_or_script {
- dependencies.reserve(task.dependencies.len());
for dep in &task.dependencies {
let mut path = path.clone();
- path.push((folder_url, name));
- dependencies.push(sort_visit(dep, sorted, path, tasks_config)?);
+ path.push(name);
+ sort_visit(dep, sorted, path, tasks_config)?
}
}
- let id = sorted.len();
- sorted.push(ResolvedTask {
- id,
- name,
- folder_url,
- task_or_script,
- dependencies,
- });
+ sorted.push(name.to_string());
- Ok(id)
+ Ok(())
}
- let mut sorted: Vec<ResolvedTask<'a>> = vec![];
+ let mut sorted: Vec<String> = vec![];
for name in &pkg_task_config.matched_tasks {
sort_visit(name, &mut sorted, Vec::new(), &pkg_task_config.tasks_config)?;
--- tests/specs/task/dependencies_root_not_cycle/__test__.jsonc
@@ -1,5 +0,0 @@
-{
- "args": "task a",
- "cwd": "member",
- "output": "task.out"
-}
--- tests/specs/task/dependencies_root_not_cycle/deno.json
@@ -1,10 +0,0 @@
-{
- "tasks": {
- "a": "echo root-a",
- "b": {
- "dependencies": ["a"],
- "command": "echo b"
- }
- },
- "workspace": ["./member"]
-}
--- tests/specs/task/dependencies_root_not_cycle/member/deno.json
@@ -1,8 +0,0 @@
-{
- "tasks": {
- "a": {
- "dependencies": ["b"],
- "command": "echo a"
- }
- }
-}
--- tests/specs/task/dependencies_root_not_cycle/task.out
@@ -1,6 +0,0 @@
-Task a echo root-a
-root-a
-Task b echo b
-b
-Task a echo a
-a
--- tests/specs/task/dependencies_shadowed_root_name/__test__.jsonc
@@ -1,14 +0,0 @@
-{
- "tests": {
- "root_dependending_root": {
- "args": "task root-depending-root",
- "cwd": "member",
- "output": "root_dependending_root.out"
- },
- "member_depending_root_and_member": {
- "args": "task member-dependending-root-and-member",
- "cwd": "member",
- "output": "member_depending_root_and_member.out"
- }
- }
-}
--- tests/specs/task/dependencies_shadowed_root_name/deno.jsonc
@@ -1,12 +0,0 @@
-{
- "tasks": {
- "build": "echo root",
- "root-depending-root": {
- "dependencies": [
- "build"
- ],
- "command": "echo test"
- }
- },
- "workspace": ["./member"]
-}
--- tests/specs/task/dependencies_shadowed_root_name/member/deno.jsonc
@@ -1,12 +0,0 @@
-{
- "tasks": {
- "build": "echo member",
- "member-dependending-root-and-member": {
- "dependencies": [
- "build",
- "root-depending-root"
- ],
- "command": "echo member-test"
- }
- }
-}
--- tests/specs/task/dependencies_shadowed_root_name/member_depending_root_and_member.out
@@ -1,10 +0,0 @@
-[UNORDERED_START]
-Task build echo member
-member
-Task build echo root
-root
-Task root-depending-root echo test
-test
-[UNORDERED_END]
-Task member-dependending-root-and-member echo member-test
-member-test
--- tests/specs/task/dependencies_shadowed_root_name/root_dependending_root.out
@@ -1,4 +0,0 @@
-Task build echo root
-root
-Task root-depending-root echo test
-test
|
deno
|
denoland
|
Rust
|
Rust
| 102,021
| 5,502
|
A modern runtime for JavaScript and TypeScript.
|
denoland_deno
|
BUG_FIX
|
obvious
|
46aae1d708f077ad229f200053cab8276fd6317c
|
2023-02-25 05:34:35
|
Oleksii Trekhleb
|
Add bakers
| false
| 50
| 5
| 55
|
--- BACKERS.md
@@ -8,41 +8,12 @@
## `O(n²)` Backers
-<ul>
- <li>
- <a href="https://github.com/newrelic">
- <img
- src="https://avatars.githubusercontent.com/u/31739?s=200&v=4"
- width="30"
- height="30"
- /></a>
-  
- <a href="https://github.com/newrelic">newrelic</a>
- </li>
-</ul>
+`null`
## `O(n×log(n))` Backers
`null`
-<!--
-<table>
- <tr>
- <td align="center">
- <a href="[PROFILE_URL]">
- <img
- src="[PROFILE_IMG_SRC]"
- width="50"
- height="50"
- />
- </a>
- <br />
- <a href="[PROFILE_URL]">[PROFILE_NAME]</a>
- </td>
- </tr>
-</table>
--->
-
<!--
<ul>
<li>
--- README.md
@@ -361,26 +361,10 @@ Below is the list of some of the most used Big O notations and their performance
> You may support this project via ❤️️ [GitHub](https://github.com/sponsors/trekhleb) or ❤️️ [Patreon](https://www.patreon.com/trekhleb).
-[Folks who are backing this project](https://github.com/trekhleb/javascript-algorithms/blob/master/BACKERS.md) `∑ = 1`
-
-<table>
- <tr>
- <td align="center">
- <a href="https://github.com/newrelic">
- <img
- src="https://avatars.githubusercontent.com/u/31739?s=200&v=4"
- width="50"
- height="50"
- />
- </a>
- <br />
- <a href="https://github.com/newrelic">newrelic</a>
- </td>
- </tr>
-</table>
+[Folks who are backing this project](https://github.com/trekhleb/javascript-algorithms/blob/master/BACKERS.md) `∑ = 0`
-## Author
+> ℹ️ A few more [projects](https://trekhleb.dev/projects/) and [articles](https://trekhleb.dev/blog/) about JavaScript and algorithms on [trekhleb.dev](https://trekhleb.dev)
-[@trekhleb](https://trekhleb.dev)
+## Author
-A few more [projects](https://trekhleb.dev/projects/) and [articles](https://trekhleb.dev/blog/) about JavaScript and algorithms on [trekhleb.dev](https://trekhleb.dev)
+- [@trekhleb](https://trekhleb.dev)
|
javascript-algorithms
|
trekhleb
|
JavaScript
|
JavaScript
| 190,336
| 30,518
|
📝 Algorithms and data structures implemented in JavaScript with explanations and links to further readings
|
trekhleb_javascript-algorithms
|
DOC_CHANGE
|
Obvious
|
5b2ffeb63d1b763a9ea9de2cc4bf7c3c1c7f3e0e
| null |
Matt
|
update React to React/addons
| false
| 1
| 1
| 0
|
--- classable.js
@@ -1,4 +1,4 @@
-var React = require('react'),
+var React = require('react/addons'),
classSet = React.addons.classSet;
module.exports = {
|
mui_material-ui.json
| null | null | null | null | null | null |
mui_material-ui.json
|
CONFIG_CHANGE
|
5, obvious
|
cb9d1b88bed2ecc2434ff1017526697037144b27
| null |
Marina Mosti
|
Adds cursor-text class
| false
| 1
| 0
| 1
|
--- cursor.js
@@ -8,6 +8,7 @@ export default function() {
'.cursor-wait': { cursor: 'wait' },
'.cursor-move': { cursor: 'move' },
'.cursor-not-allowed': { cursor: 'not-allowed' },
+ '.cursor-text': { cursor: 'text'}
},
config('modules.cursor')
)
|
tailwindlabs_tailwindcss.json
| null | null | null | null | null | null |
tailwindlabs_tailwindcss.json
|
NEW_FEAT
|
5, obvious
|
6735d3654d699bf63be2c626c0fe37035ea8a9f4
|
2024-10-31 00:14:32
|
Edward Hsing
|
Update github-kyc.md
| false
| 1
| 1
| 2
|
--- .github/ISSUE_TEMPLATE/github-kyc.md
@@ -11,7 +11,7 @@ assignees: ''
Please make sure your GitHub account meets the following requirements (all are required):
-- [ ] The issue title must be: Request GitHub KYC - Your US.KG Panel registered username. This ensures our automated script can detect your registered username. (For example, your title should be: **Request GitHub KYC-example**)
+- [ ] The issue title must be: Request GitHub KYC - Your US.KG Panel registered username. This ensures our automated script can detect your registered username.
- [ ] This is not a new GitHub account; it has existing repositories and stars.
- [ ] This is my first US.KG NIC Panel account.
- [ ] I promise not to use the domain name for criminal or abusive purposes.
|
freedomain
|
digitalplatdev
|
HTML
|
HTML
| 41,142
| 933
|
DigitalPlat FreeDomain: Free Domain For Everyone
|
digitalplatdev_freedomain
|
DOC_CHANGE
|
changes in md file
|
80966ce5c44dcf79b7617c592044469db85b1d59
|
2025-02-19 19:52:58
|
Stanislav Láznička
|
integration: svm: use consistent path args pattern in etcd fetch functions Use function argument order at which the strings would appear in the etcd path.
| false
| 8
| 8
| 16
|
--- test/integration/storageversionmigrator/storageversionmigrator_test.go
@@ -89,7 +89,7 @@ func TestStorageVersionMigration(t *testing.T) {
}
wantPrefix := "k8s:enc:aescbc:v1:key2"
- etcdSecret, err := svmTest.getRawSecretFromETCD(t, secret.Namespace, secret.Name)
+ etcdSecret, err := svmTest.getRawSecretFromETCD(t, secret.Name, secret.Namespace)
if err != nil {
t.Fatalf("Failed to get secret from etcd: %v", err)
}
--- test/integration/storageversionmigrator/util.go
@@ -387,9 +387,9 @@ func (svm *svmTest) createSecret(ctx context.Context, t *testing.T, name, namesp
return svm.client.CoreV1().Secrets(secret.Namespace).Create(ctx, secret, metav1.CreateOptions{})
}
-func (svm *svmTest) getRawSecretFromETCD(t *testing.T, namespace, name string) ([]byte, error) {
+func (svm *svmTest) getRawSecretFromETCD(t *testing.T, name, namespace string) ([]byte, error) {
t.Helper()
- secretETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, "", "secrets", namespace, name)
+ secretETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, "", "secrets", name, namespace)
etcdResponse, err := svm.readRawRecordFromETCD(t, secretETCDPath)
if err != nil {
return nil, fmt.Errorf("failed to read %s from etcd: %w", secretETCDPath, err)
@@ -397,7 +397,7 @@ func (svm *svmTest) getRawSecretFromETCD(t *testing.T, namespace, name string) (
return etcdResponse.Kvs[0].Value, nil
}
-func getETCDPathForResource(t *testing.T, storagePrefix, group, resource, namespaceName, name string) string {
+func getETCDPathForResource(t *testing.T, storagePrefix, group, resource, name, namespaceName string) string {
t.Helper()
groupResource := resource
if group != "" {
@@ -431,9 +431,9 @@ func (svm *svmTest) readRawRecordFromETCD(t *testing.T, path string) (*clientv3.
return response, nil
}
-func (svm *svmTest) getRawCRFromETCD(t *testing.T, crdGroup, crdName, namespace, name string) ([]byte, error) {
+func (svm *svmTest) getRawCRFromETCD(t *testing.T, name, namespace, crdGroup, crdName string) ([]byte, error) {
t.Helper()
- crdETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, crdGroup, crdName, namespace, name)
+ crdETCDPath := getETCDPathForResource(t, svm.storageConfig.Prefix, crdGroup, crdName, name, namespace)
etcdResponse, err := svm.readRawRecordFromETCD(t, crdETCDPath)
if err != nil {
t.Fatalf("failed to read %s from etcd: %v", crdETCDPath, err)
@@ -1056,7 +1056,7 @@ func (svm *svmTest) setupServerCert(t *testing.T) *certContext {
func (svm *svmTest) isCRStoredAtVersion(t *testing.T, version, crName string) bool {
t.Helper()
- data, err := svm.getRawCRFromETCD(t, crdGroup, crdName+"s", defaultNamespace, crName)
+ data, err := svm.getRawCRFromETCD(t, crName, defaultNamespace, crdGroup, crdName+"s")
if err != nil {
t.Fatalf("Failed to get CR from etcd: %v", err)
}
@@ -1135,7 +1135,7 @@ func (svm *svmTest) validateRVAndGeneration(ctx context.Context, t *testing.T, c
for crName, version := range crVersions {
// get CR from etcd
- data, err := svm.getRawCRFromETCD(t, crdGroup, crdName+"s", defaultNamespace, crName)
+ data, err := svm.getRawCRFromETCD(t, crName, defaultNamespace, crdGroup, crdName+"s")
if err != nil {
t.Fatalf("Failed to get CR from etcd: %v", err)
}
|
kubernetes
|
kubernetes
|
Go
|
Go
| 113,460
| 40,344
|
Production-Grade Container Scheduling and Management
|
nan_kubernetes
|
BUG_FIX
|
Obvious
|
4843d7d6c2883465ee9c3c5cafcae805b19ce615
|
2024-10-03 03:08:14
|
Kieran
|
Updated tzinfo package (#402)
| false
| 1
| 1
| 2
|
--- mix.lock
@@ -61,7 +61,7 @@
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.2", "2caabe9344ec17eafe5403304771c3539f3b6e2f7fb6a6f602558c825d0d0bfb", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9b43db0dc33863930b9ef9d27137e78974756f5f198cae18409970ed6fa5b561"},
"telemetry_poller": {:hex, :telemetry_poller, "1.0.0", "db91bb424e07f2bb6e73926fcafbfcbcb295f0193e0a00e825e589a0a47e8453", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "b3a24eafd66c3f42da30fc3ca7dda1e9d546c12250a2d60d7b81d264fbec4f6e"},
"timex": {:hex, :timex, "3.7.11", "bb95cb4eb1d06e27346325de506bcc6c30f9c6dea40d1ebe390b262fad1862d1", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.20", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.1", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "8b9024f7efbabaf9bd7aa04f65cf8dcd7c9818ca5737677c7b76acbc6a94d1aa"},
- "tzdata": {:hex, :tzdata, "1.1.2", "45e5f1fcf8729525ec27c65e163be5b3d247ab1702581a94674e008413eef50b", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "cec7b286e608371602318c414f344941d5eb0375e14cfdab605cca2fe66cba8b"},
+ "tzdata": {:hex, :tzdata, "1.1.1", "20c8043476dfda8504952d00adac41c6eda23912278add38edc140ae0c5bcc46", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "a69cec8352eafcd2e198dea28a34113b60fdc6cb57eb5ad65c10292a6ba89787"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.0", "bc84380c9ab48177092f43ac89e4dfa2c6d62b40b8bd132b1059ecc7232f9a78", [:rebar3], [], "hexpm", "25eee6d67df61960cf6a794239566599b09e17e668d3700247bc498638152521"},
"websock": {:hex, :websock, "0.5.3", "2f69a6ebe810328555b6fe5c831a851f485e303a7c8ce6c5f675abeb20ebdadc", [:mix], [], "hexpm", "6105453d7fac22c712ad66fab1d45abdf049868f253cf719b625151460b8b453"},
"websock_adapter": {:hex, :websock_adapter, "0.5.7", "65fa74042530064ef0570b75b43f5c49bb8b235d6515671b3d250022cb8a1f9e", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "d0f478ee64deddfec64b800673fd6e0c8888b079d9f3444dd96d2a98383bdbd1"},
|
pinchflat
|
kieraneglin
|
Elixir
|
Elixir
| 2,779
| 59
|
Your next YouTube media manager
|
kieraneglin_pinchflat
|
CONFIG_CHANGE
|
dependency version change in mix.lock
|
8f3800545c9e719b3a34debf18991098860dfbac
|
2024-11-30 00:02:30
|
Edward Hsing
|
Update README.md
| false
| 1
| 1
| 2
|
--- README.md
@@ -32,7 +32,7 @@ Jump in and register your domain by visiting our site:
### 🌟 Trusted by Thousands
-With over 90,000 domains already registered, DigitalPlat FreeDomain is a trusted choice for individuals and organizations alike. Join our growing community and claim your own free domain today!
+With over 80,000 domains already registered, DigitalPlat FreeDomain is a trusted choice for individuals and organizations alike. Join our growing community and claim your own free domain today!
---
|
freedomain
|
digitalplatdev
|
HTML
|
HTML
| 41,142
| 933
|
DigitalPlat FreeDomain: Free Domain For Everyone
|
digitalplatdev_freedomain
|
DOC_CHANGE
|
changes in readme
|
e5bb89b0b09ee6807de70d716fe0d521d25ff728
|
2023-03-03 18:25:12
|
Richard McElreath
|
week 8 solutions and week 9 hw
| false
| 0
| 0
| 0
|
--- homework/week08_solutions.pdf
Binary files a/homework/week08_solutions.pdf and /dev/null differ
--- homework/week09.pdf
Binary files a/homework/week09.pdf and /dev/null differ
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
DOC_CHANGE
|
binary changes to PDF files in homework folder
|
94136578bc89e4b973c471050ae9c2d83ffcb7c6
|
2024-07-04 13:17:44
|
Amr Bashir
|
fix(cli/migrate): fix `clipboard `permissions migration (#10186) closes #10185
The plugin has been updated recently and its permissions has changed.
| false
| 8
| 2
| 10
|
--- .changes/cli-clipboard-manager-migrate-perms.md
@@ -1,6 +0,0 @@
----
-"tauri-cli": "patch:bug"
-"@tauri-apps/cli": "patch:bug"
----
-
-Fix `migrate` command, migrating incorrect permissions for `clipboard`.
--- tooling/cli/src/migrate/config.rs
@@ -501,8 +501,8 @@ fn allowlist_to_permissions(
permissions!(allowlist, permissions, process, relaunch => "process:allow-restart");
permissions!(allowlist, permissions, process, exit => "process:allow-exit");
// clipboard
- permissions!(allowlist, permissions, clipboard, read_text => "clipboard-manager:allow-read-text");
- permissions!(allowlist, permissions, clipboard, write_text => "clipboard-manager:allow-write-text");
+ permissions!(allowlist, permissions, clipboard, read_text => "clipboard-manager:allow-read");
+ permissions!(allowlist, permissions, clipboard, write_text => "clipboard-manager:allow-write");
// app
permissions!(allowlist, permissions, app, show => "app:allow-app-show");
permissions!(allowlist, permissions, app, hide => "app:allow-app-hide");
|
tauri
|
tauri-apps
|
Rust
|
Rust
| 90,101
| 2,752
|
Build smaller, faster, and more secure desktop and mobile applications with a web frontend.
|
tauri-apps_tauri
|
BUG_FIX
|
obvious
|
1c2a26dfd2defd3513e3caa72f9d16d46590ac08
|
2023-02-14 01:13:20
|
Richard McElreath
|
Update README.md
| false
| 1
| 1
| 2
|
--- README.md
@@ -37,7 +37,7 @@ Note about slides: In some browsers, the slides don't show correctly. If points
| Week 04 | 27 January | Chapters 7,8,9 | [7] <[Overfitting](https://www.youtube.com/watch?v=1VgYIsANQck&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=7)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-07)> <br> [8] <[MCMC](https://www.youtube.com/watch?v=rZk2FqX2XnY&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=8)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-08)>
| Week 05 | 03 February | Chapters 10 and 11 | [9] <[Modeling Events](https://www.youtube.com/watch?v=Zi6N3GLUJmw&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=9)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-09)> <br> [10] <[Counts and Confounds](https://www.youtube.com/watch?v=jokxu18egu0&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=10)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-10)>
| Week 06 | 10 February | Chapters 11 and 12 | [11] <[Ordered Categories](https://www.youtube.com/watch?v=VVQaIkom5D0&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=11)> <[Slides](https://github.com/rmcelreath/stat_rethinking_2023/raw/main/slides/Lecture_11-ord_logit.pdf)> <br> [12] <[Multilevel Models](https://www.youtube.com/watch?v=iwVqiiXYeC4&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=12)> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_12-GLMM1.pdf)>
-| Week 07 | 17 February | Chapter 13 | [13] <[Multilevel Adventures](https://www.youtube.com/watch?v=sgqMkZeslxA&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=13)> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_13-GLMM2.pdf)> <br> [14] More Multilevel Models
+| Week 07 | 17 February | Chapter 13 | [13] <[Multilevel Adventures]> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_13-GLMM2.pdf)> <br> [14] More Multilevel Models
| Week 08 | 24 February | Chapter 14 | [15] Social networks <br> [16] Gaussian Processes
| Week 09 | 03 March | Chapter 15 | [17] Measurement Error <br> [18] Missing Data
| Week 10 | 10 March | Chapters 16 and 17 | [19] Beyond GLMs: State-space Models, ODEs <br> [20] Horoscopes
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
DOC_CHANGE
|
changes in readme (lecture link removed from schedule table)
|
d90cf9b8ac02e7335997ca695f4269c91e7a406d
| null |
Clayton Coleman
|
Boilerplate checks on 2014, but commit included a 2015 (breaks travis) Revert boilerplate to 2014 until hack/verify-boilerplate.sh is fixed
| false
| 1
| 1
| 0
|
--- e2e-from-release.sh
@@ -1,6 +1,6 @@
#!/bin/bash
-# Copyright 2015 Google Inc. All rights reserved.
+# Copyright 2014 Google Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
|
kubernetes_kubernetes.json
| null | null | null | null | null | null |
kubernetes_kubernetes.json
|
BUG_FIX
|
5, fix written in commit msg
|
65944b7bfa8e9d00eab1212e87c211bfbb76225a
|
2024-01-29 10:53:39
|
cllei12
|
Closed gesiscss/awesome-computational-social-science#59
| false
| 18
| 11
| 29
|
--- README.md
@@ -9,7 +9,7 @@ The order of entries within categories is either alphabetically or
chronologically.
**Please add your resources according to the respective ordering**
-
+
## Contents
@@ -132,17 +132,6 @@ chronologically.
> Bachelor, Master, PhD programs (alphabetically by country using [ISO 3166-1 alpha 3](https://en.wikipedia.org/wiki/ISO_3166-1) codes)
-#### Bachelor programs
-
-- [Bachelor of Social Sciences in Data Science and Policy Studies](https://dsps.ssc.cuhk.edu.hk/), Chinese University of Hong Kong, HKG
-- [Bachelor of Arts and Sciences in Social Data Science](https://web.edu.hku.hk/programme/bsds), University of Hong Kong, HKG
-- [Bachelor Computational Social Science](https://www.uva.nl/en/programmes/bachelors/computational-social-science/study-programme/study-programme.html), University of Amsterdam, NLD
-- [BS in Social Data Science](https://sdsc.umd.edu/), University of Maryland, USA
-- [BS in Social Data Analytics](https://soda.la.psu.edu/programs/undergraduate/), Pennsylvania State University, USA
-- [BS in Computational Social Science](https://www.dins.pitt.edu/academics/bs-computational-social-science), University of Pittsburgh, USA
-
-#### Master programs
-
- [Master Computational Social System](https://www.tugraz.at/en/studying-and-teaching/degree-and-certificate-programmes/masters-degree-programmes/computational-social-systems),
TU Graz, AUT
- [Master of Science program in Social Data Science](https://networkdatascience.ceu.edu/msc-social-data-science), Central European University, AUT
@@ -164,13 +153,18 @@ University of Konstanz, DEU
- [MSc Human and Social Data Science](https://www.sussex.ac.uk/study/masters/courses/human-and-social-data-science-msc), University of Sussex, GBR
- [MSc Social Data Science](https://www.exeter.ac.uk/study/postgraduate/courses/politics/socialdatasciencemsc/), University of Exeter, GBR
- [MSc Social Data Science](https://www.oii.ox.ac.uk/study/msc-in-social-data-science/), University of Oxford, GBR
+- [DPhil Social Data Science](https://www.oii.ox.ac.uk/study/dphil-in-social-data-science/), University of Oxford, GBR
+- [Bachelor of Social Sciences in Data Science and Policy Studies](https://dsps.ssc.cuhk.edu.hk/), Chinese University of Hong Kong, HKG
+- [Bachelor of Arts and Sciences in Social Data Science](https://web.edu.hku.hk/programme/bsds), University of Hong Kong, HKG
- [Masters in Computational Social Science](https://sola.iitj.ac.in/postgraduate-program/), Indian Institute of Technology (IIT) Jodhpur, IND
- [Master Politics and Data Science](https://www.ucd.ie/connected_politics/studywithus/), University College Dublin, IRE
- [MSc Social Data Science](https://hub.ucd.ie/usis/!W_HU_MENU.P_PUBLISH?p_tag=PROG&MAJR=W559), University College Dublin, IRE
+- [PhD Quantitative and Computational Social Science](https://www.ucd.ie/spire/study/prospectivephdstudents/phdquantitativeandcomputationalsocialscience/), University College Dublin, IRE
- [MSc/PG Diploma Applied Social Data Science](https://www.tcd.ie/Political_Science/programmes/postgraduate/pg-dip-applied-social-data-science/), Trinity College Dublin, IRE
- [Master Data Science for Economics](https://dse.cdl.unimi.it/en), University of Milan, ITA
- [Master in Social Data Science](https://sds.sociologia.unimib.it/), University of Milano-Bicocca, ITA
- [Master (Research) in Societal Resilience - Big Data for Society](https://www.resilience-institute.nl/en/research-master/), Vrije Universiteit Amsterdam, NLD
+- [Bachelor Computational Social Science](https://www.uva.nl/en/programmes/bachelors/computational-social-science/study-programme/study-programme.html), University of Amsterdam, NLD
- [Master's Programme Computational Social Science](https://liu.se/en/education/program/f7mcd), Linköping University, SWE
- [Master Computational Social Science](https://gsssh.ku.edu.tr/en/departments/computational-social-sciences/),
Koç University, TUR
@@ -178,20 +172,19 @@ Koç University, TUR
- [M.A. in Computational Social Science](https://macss.uchicago.edu/), University of Chicago, USA
- [M.S. in Computational Analysis & Public Policy](https://capp.uchicago.edu/), University of Chicago, USA
- [Master of Science in Data Analytics & Computational Social Science](https://www.umass.edu/social-sciences/academics/data-analytics-computational-social-science/ms-dacss), University of Massachusetts Amherst, USA
+- [Complex Networks and Systems track of the PhD in Informatics](https://informatics.indiana.edu/programs/phd-informatics/complex-networks-and-systems.html), Indiana University Bloomington, USA
- [Master of Arts in Interdisciplinary Studies: Computational Social Science Concentration](https://mais.gmu.edu/programs/la-mais-isin-css), George Mason University, USA
+- [PhD in Computational Social Science](https://science.gmu.edu/academics/departments-units/computational-data-sciences/computational-social-science-phd), George Mason University, USA
- [Master of Science in Data Science for Public Policy](https://mccourt.georgetown.edu/master-of-science-in-data-science-for-public-policy/), Georgetown University, USA
- [MA in Quantitative Methods in the Social Sciences: Data Science Focus](https://www.qmss.columbia.edu/), Columbia University, USA
+- [BS in Social Data Science](https://sdsc.umd.edu/), University of Maryland, USA
- [Master of Science in Public Policy and Data Science](https://priceschool.usc.edu/mppds/), University of Southern California, USA
- [Master's Degree Applied Urban Science and Informatics](https://cusp.nyu.edu/masters-degree/), New York University, USA
- [Master of Science in Survey and Data Science](https://surveydatascience.isr.umich.edu/survey-and-data-science-masters-degree-program), University of Michigan, USA
- [Master of Science in Social Policy + Data Analytics for Social Policy Certificate](https://www.sp2.upenn.edu/program/master-of-science-in-social-policy-data-analytics-for-social-policy-certificate/), University of Pennsylvania, USA
+- [BS in Social Data Analytics](https://soda.la.psu.edu/programs/undergraduate/), Pennsylvania State University, USA
+- [BS in Computational Social Science](https://www.dins.pitt.edu/academics/bs-computational-social-science), University of Pittsburgh, USA
-#### PhD programs
-
-- [DPhil Social Data Science](https://www.oii.ox.ac.uk/study/dphil-in-social-data-science/), University of Oxford, GBR
-- [PhD Quantitative and Computational Social Science](https://www.ucd.ie/spire/study/prospectivephdstudents/phdquantitativeandcomputationalsocialscience/), University College Dublin, IRE
-- [Complex Networks and Systems track of the PhD in Informatics](https://informatics.indiana.edu/programs/phd-informatics/complex-networks-and-systems.html), Indiana University Bloomington, USA
-- [PhD in Computational Social Science](https://science.gmu.edu/academics/departments-units/computational-data-sciences/computational-social-science-phd), George Mason University, USA
## Research Groups
|
awesome-computational-social-science
|
gesiscss
|
R
|
R
| 648
| 83
|
A list of awesome resources for Computational Social Science
|
gesiscss_awesome-computational-social-science
|
DOC_CHANGE
|
Matched \.md\b in diff
|
ee13ef64125be8113de3dba3d91f0f99fbed99f9
|
2023-01-11 04:36:34
|
Kebin Liu
|
Try to use macos-latest runner
| false
| 7
| 5
| 12
|
--- .github/workflows/main.yml
@@ -5,8 +5,8 @@ on: [push,pull_request]
jobs:
build:
- runs-on: macos-latest
-
+ runs-on: macos-10.15
+
steps:
- name: Checkout
--- deps/Makefile
@@ -1,9 +1,7 @@
-override ARCH ?= $(shell uname -m)
-
+ARCH ?= $(shell uname -m)
ifeq ($(ARCH),arm64)
-override ARCH := aarch64
+ARCH := aarch64
endif
-
TARGET := $(ARCH)-apple-macos10.12
JOBS := $(shell getconf _NPROCESSORS_ONLN)
GOROOT := $${PWD}/../dist/go
@@ -175,7 +173,7 @@ shadowsocks-libev: pcre libev c-ares libsodium mbedtls
simple-obfs:
cd simple-obfs \
&& ./autogen.sh \
- && CFLAGS="-target $(TARGET)" ./configure --prefix=$${PWD}/../dist/$(ARCH)/simple-obfs \
+ && CXXFLAGS="-target $(TARGET)" CFLAGS="-target $(TARGET)" ./configure --prefix=$${PWD}/../dist/$(ARCH)/simple-obfs \
--host=$(TARGET) \
--disable-dependency-tracking \
--disable-documentation \
|
shadowsocksx-ng
|
shadowsocks
|
Swift
|
Swift
| 32,651
| 7,935
|
Next Generation of ShadowsocksX
|
shadowsocks_shadowsocksx-ng
|
CONFIG_CHANGE
|
Obvious
|
525775b62269d61bfbc5b96564fd06f2089ec050
|
2024-02-14 01:08:49
|
Danyal Prout
|
Upgrade to v1.5.1 and v1.101308.0 (#193)
| false
| 4
| 4
| 8
|
--- Dockerfile
@@ -3,9 +3,9 @@ FROM golang:1.21 as op
WORKDIR /app
ENV REPO=https://github.com/ethereum-optimism/optimism.git
-ENV VERSION=v1.5.1
+ENV VERSION=v1.5.0
# for verification:
-ENV COMMIT=c934019745c98c749af986e75b2ec72c9c406723
+ENV COMMIT=6de6b5fc81d8ee03bb776219ba25189a04712f99
RUN git clone $REPO --branch op-node/$VERSION --single-branch . && \
git switch -c branch-$VERSION && \
@@ -19,9 +19,9 @@ FROM golang:1.21 as geth
WORKDIR /app
ENV REPO=https://github.com/ethereum-optimism/op-geth.git
-ENV VERSION=v1.101308.0
+ENV VERSION=v1.101305.3
# for verification:
-ENV COMMIT=70103aa6866c3d965f1f71154958cd9c1a7beb5b
+ENV COMMIT=ea3c3044010f97fb3d9affa0dd3c0c2beea85882
# avoid depth=1, so the geth build can read tags
RUN git clone $REPO --branch $VERSION --single-branch . && \
|
node
|
base
|
Shell
|
Shell
| 68,555
| 2,658
|
Everything required to run your own Base node
|
base_node
|
CONFIG_CHANGE
|
Obvious
|
393428e5c85a341edf121f30e1f3d64d3877b6ab
|
2024-08-07 23:20:33
|
github-actions[bot]
|
apply version updates (#10524) Co-authored-by: lucasfernog <[email protected]>
| false
| 174
| 47
| 221
|
--- .changes/pre.json
@@ -4,25 +4,17 @@
".changes/android-dev-open-adb-fix.md",
".changes/asset-resolver-dev-fallback.md",
".changes/change-pr-10435.md",
- ".changes/change-pr-10498.md",
- ".changes/check-android-lib-symbols.md",
".changes/cli-desktop-port-exposure.md",
- ".changes/cli-mobile-checks.md",
".changes/core-plugin-namespace.md",
".changes/dev-url-localhost-mobile.md",
".changes/fix-adb.md",
- ".changes/fix-colon-in-file-path.md",
".changes/fix-conf-parsing-error-filepath.md",
".changes/fix-usage-without-compression.md",
- ".changes/ios-custom-project-template.md",
".changes/ios-frameworks.md",
".changes/isolation-main-frame-origin.md",
".changes/linux-option-gtk-app-id.md",
- ".changes/min-ios-version.md",
".changes/plugin-builder-failable.md",
".changes/rc-migration.md",
- ".changes/remove-open-command.md",
- ".changes/remove-unsecure-configs.md",
- ".changes/v1-migrate-updater.md"
+ ".changes/remove-unsecure-configs.md"
]
}
--- Cargo.lock
@@ -3654,7 +3654,7 @@ checksum = "e1fc403891a21bcfb7c37834ba66a547a8f402146eba7265b5a6d88059c9ff2f"
[[package]]
name = "tauri"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"anyhow",
"bytes",
@@ -3723,7 +3723,7 @@ dependencies = [
[[package]]
name = "tauri-build"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"anyhow",
"cargo_toml",
@@ -3745,7 +3745,7 @@ dependencies = [
[[package]]
name = "tauri-codegen"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"base64 0.22.1",
"brotli",
@@ -3782,7 +3782,7 @@ dependencies = [
[[package]]
name = "tauri-macros"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"heck 0.5.0",
"proc-macro2",
@@ -3794,7 +3794,7 @@ dependencies = [
[[package]]
name = "tauri-plugin"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"anyhow",
"glob",
@@ -3809,7 +3809,7 @@ dependencies = [
[[package]]
name = "tauri-runtime"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"dpi",
"gtk",
@@ -3826,7 +3826,7 @@ dependencies = [
[[package]]
name = "tauri-runtime-wry"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"cocoa",
"gtk",
@@ -3849,7 +3849,7 @@ dependencies = [
[[package]]
name = "tauri-utils"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"aes-gcm",
"brotli",
--- core/tauri-build/CHANGELOG.md
@@ -1,12 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- core/tauri-build/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-build"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "build time code to pair with https://crates.io/crates/tauri"
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -28,8 +28,8 @@ rustdoc-args = [ "--cfg", "docsrs" ]
[dependencies]
anyhow = "1"
quote = { version = "1", optional = true }
-tauri-codegen = { version = "2.0.0-rc.1", path = "../tauri-codegen", optional = true }
-tauri-utils = { version = "2.0.0-rc.1", path = "../tauri-utils", features = [ "build", "resources" ] }
+tauri-codegen = { version = "2.0.0-rc.0", path = "../tauri-codegen", optional = true }
+tauri-utils = { version = "2.0.0-rc.0", path = "../tauri-utils", features = [ "build", "resources" ] }
cargo_toml = "0.17"
serde = "1"
serde_json = "1"
--- core/tauri-codegen/CHANGELOG.md
@@ -1,11 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Enhancements
--- core/tauri-codegen/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-codegen"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "code generation meant to be consumed inside of `tauri` through `tauri-build` or `tauri-macros`"
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -20,7 +20,7 @@ quote = "1"
syn = "2"
serde = { version = "1", features = [ "derive" ] }
serde_json = "1"
-tauri-utils = { version = "2.0.0-rc.1", path = "../tauri-utils", features = [ "build" ] }
+tauri-utils = { version = "2.0.0-rc.0", path = "../tauri-utils", features = [ "build" ] }
thiserror = "1"
walkdir = "2"
brotli = { version = "3", optional = true, default-features = false, features = [ "std" ] }
--- core/tauri-macros/CHANGELOG.md
@@ -1,12 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- core/tauri-macros/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-macros"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Macros for the tauri crate."
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -20,8 +20,8 @@ proc-macro2 = { version = "1", features = [ "span-locations" ] }
quote = "1"
syn = { version = "2", features = [ "full" ] }
heck = "0.5"
-tauri-codegen = { version = "2.0.0-rc.1", default-features = false, path = "../tauri-codegen" }
-tauri-utils = { version = "2.0.0-rc.1", path = "../tauri-utils" }
+tauri-codegen = { version = "2.0.0-rc.0", default-features = false, path = "../tauri-codegen" }
+tauri-utils = { version = "2.0.0-rc.0", path = "../tauri-utils" }
[features]
custom-protocol = [ ]
--- core/tauri-plugin/CHANGELOG.md
@@ -1,11 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- core/tauri-plugin/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-plugin"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Build script and runtime Tauri plugin definitions"
authors = { workspace = true }
homepage = { workspace = true }
@@ -30,7 +30,7 @@ runtime = [ ]
[dependencies]
anyhow = { version = "1", optional = true }
serde = { version = "1", optional = true }
-tauri-utils = { version = "2.0.0-rc.1", default-features = false, features = [ "build" ], path = "../tauri-utils" }
+tauri-utils = { version = "2.0.0-rc.0", default-features = false, features = [ "build" ], path = "../tauri-utils" }
serde_json = { version = "1", optional = true }
glob = { version = "0.3", optional = true }
toml = { version = "0.8", optional = true }
--- core/tauri-runtime-wry/CHANGELOG.md
@@ -1,12 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- core/tauri-runtime-wry/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-runtime-wry"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Wry bindings to the Tauri runtime"
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -15,8 +15,8 @@ rust-version = { workspace = true }
[dependencies]
wry = { version = "0.41", default-features = false, features = [ "drag-drop", "protocol", "os-webview" ] }
tao = { version = "0.28.1", default-features = false, features = [ "rwh_06" ] }
-tauri-runtime = { version = "2.0.0-rc.1", path = "../tauri-runtime" }
-tauri-utils = { version = "2.0.0-rc.1", path = "../tauri-utils" }
+tauri-runtime = { version = "2.0.0-rc.0", path = "../tauri-runtime" }
+tauri-utils = { version = "2.0.0-rc.0", path = "../tauri-utils" }
raw-window-handle = "0.6"
http = "1.1"
url = "2"
--- core/tauri-runtime/CHANGELOG.md
@@ -1,11 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- core/tauri-runtime/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-runtime"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Runtime for Tauri applications"
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -29,7 +29,7 @@ targets = [
serde = { version = "1.0", features = [ "derive" ] }
serde_json = "1.0"
thiserror = "1.0"
-tauri-utils = { version = "2.0.0-rc.1", path = "../tauri-utils" }
+tauri-utils = { version = "2.0.0-rc.0", path = "../tauri-utils" }
http = "1.1"
raw-window-handle = "0.6"
url = { version = "2" }
--- core/tauri-utils/CHANGELOG.md
@@ -1,16 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### New Features
-
-- [`8dc81b6cc`](https://www.github.com/tauri-apps/tauri/commit/8dc81b6cc2b8235b11f74a971d6aa3a5df5e9f68) ([#10496](https://www.github.com/tauri-apps/tauri/pull/10496) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > template` configuration option for custom Xcode project YML Handlebars template using XcodeGen.
-- [`02c00abc6`](https://www.github.com/tauri-apps/tauri/commit/02c00abc63cf86e9bf9179cbb143d5145a9397b6) ([#10495](https://www.github.com/tauri-apps/tauri/pull/10495) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > minimumSystemVersion` configuration option.
-
-### Bug Fixes
-
-- [`7e810cb2a`](https://www.github.com/tauri-apps/tauri/commit/7e810cb2a3fd934017ae973e737864dfa4bdf64e) ([#10485](https://www.github.com/tauri-apps/tauri/pull/10485) by [@anatawa12](https://www.github.com/tauri-apps/tauri/../../anatawa12)) Fixed an issue where permission files will be generated with ':' in the file path.
-
## \[2.0.0-rc.0]
### New Features
--- core/tauri-utils/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri-utils"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Utilities for Tauri"
exclude = [ "CHANGELOG.md", "/target" ]
readme = "README.md"
--- core/tauri/CHANGELOG.md
@@ -1,15 +1,5 @@
# Changelog
-## \[2.0.0-rc.1]
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Bug Fixes
--- core/tauri/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tauri"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
description = "Make tiny, secure apps for all desktop platforms with Tauri"
exclude = [ "/test", "/.scripts", "CHANGELOG.md", "/target" ]
readme = "README.md"
@@ -51,10 +51,10 @@ uuid = { version = "1", features = [ "v4" ], optional = true }
url = "2"
anyhow = "1.0"
thiserror = "1.0"
-tauri-runtime = { version = "2.0.0-rc.1", path = "../tauri-runtime" }
-tauri-macros = { version = "2.0.0-rc.1", path = "../tauri-macros" }
-tauri-utils = { version = "2.0.0-rc.1", features = [ "resources" ], path = "../tauri-utils" }
-tauri-runtime-wry = { version = "2.0.0-rc.1", path = "../tauri-runtime-wry", optional = true }
+tauri-runtime = { version = "2.0.0-rc.0", path = "../tauri-runtime" }
+tauri-macros = { version = "2.0.0-rc.0", path = "../tauri-macros" }
+tauri-utils = { version = "2.0.0-rc.0", features = [ "resources" ], path = "../tauri-utils" }
+tauri-runtime-wry = { version = "2.0.0-rc.0", path = "../tauri-runtime-wry", optional = true }
getrandom = "0.2"
serde_repr = "0.1"
state = "0.6"
@@ -110,8 +110,8 @@ swift-rs = "1.0.6"
[build-dependencies]
heck = "0.5"
-tauri-build = { path = "../tauri-build/", default-features = false, version = "2.0.0-rc.1" }
-tauri-utils = { path = "../tauri-utils/", version = "2.0.0-rc.1", features = [ "build" ] }
+tauri-build = { path = "../tauri-build/", default-features = false, version = "2.0.0-rc.0" }
+tauri-utils = { path = "../tauri-utils/", version = "2.0.0-rc.0", features = [ "build" ] }
[dev-dependencies]
proptest = "1.4.0"
--- tooling/bundler/CHANGELOG.md
@@ -1,15 +1,5 @@
# Changelog
-## \[2.0.1-rc.0]
-
-### Bug Fixes
-
-- [`a440a3f9d`](https://www.github.com/tauri-apps/tauri/commit/a440a3f9d85376d994f2ba904b1ae0828c5a0fbb) ([#10498](https://www.github.com/tauri-apps/tauri/pull/10498) by [@catalinsh](https://www.github.com/tauri-apps/tauri/../../catalinsh)) Correct nsis pre-uninstall hook to post-uninstall
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.0]
### Dependencies
--- tooling/bundler/Cargo.toml
@@ -2,7 +2,7 @@ workspace = { }
[package]
name = "tauri-bundler"
-version = "2.0.1-rc.0"
+version = "2.0.0-rc.0"
authors = [
"George Burton <[email protected]>",
"Tauri Programme within The Commons Conservancy"
@@ -17,7 +17,7 @@ rust-version = "1.70"
exclude = [ "CHANGELOG.md", "/target", "rustfmt.toml" ]
[dependencies]
-tauri-utils = { version = "2.0.0-rc.1", path = "../../core/tauri-utils", features = [ "resources" ] }
+tauri-utils = { version = "2.0.0-rc.0", path = "../../core/tauri-utils", features = [ "resources" ] }
image = "0.24.9"
flate2 = "1.0"
anyhow = "1.0"
--- tooling/cli/CHANGELOG.md
@@ -1,30 +1,5 @@
# Changelog
-## \[2.0.0-rc.2]
-
-### New Features
-
-- [`8dc81b6cc`](https://www.github.com/tauri-apps/tauri/commit/8dc81b6cc2b8235b11f74a971d6aa3a5df5e9f68) ([#10496](https://www.github.com/tauri-apps/tauri/pull/10496) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > template` configuration option for custom Xcode project YML Handlebars template using XcodeGen.
-- [`02c00abc6`](https://www.github.com/tauri-apps/tauri/commit/02c00abc63cf86e9bf9179cbb143d5145a9397b6) ([#10495](https://www.github.com/tauri-apps/tauri/pull/10495) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > minimumSystemVersion` configuration option.
-
-### Enhancements
-
-- [`8e1e15304`](https://www.github.com/tauri-apps/tauri/commit/8e1e15304e9dc98d7f875fc8dceb7d4ce19adc47) ([#10483](https://www.github.com/tauri-apps/tauri/pull/10483) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Check if the Rust library contains the symbols required at runtime for Android and iOS apps.
-- [`ca6868956`](https://www.github.com/tauri-apps/tauri/commit/ca68689564cbc8dfa9a5220d3daf81a44ef81fcc) ([#10479](https://www.github.com/tauri-apps/tauri/pull/10479) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Check if identifier or lib name changed when running mobile commands.
-
-### Bug Fixes
-
-- [`2e8ab7bac`](https://www.github.com/tauri-apps/tauri/commit/2e8ab7bac12046d734fb07a1b4fe5e03004b305e) ([#10481](https://www.github.com/tauri-apps/tauri/pull/10481) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Migration from v1 to v2 now adds the updater plugin when it is active.
-
-### What's Changed
-
-- [`a3cd9779a`](https://www.github.com/tauri-apps/tauri/commit/a3cd9779a47428e306a628d658740669faf69ccd) ([#10480](https://www.github.com/tauri-apps/tauri/pull/10480) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Removed the `[android|ios] open` command. It is recommended to use `[android|ios] dev --open` or `[android|ios] build --open` instead.
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.1]
### Bug Fixes
--- tooling/cli/Cargo.lock
@@ -5080,7 +5080,7 @@ dependencies = [
[[package]]
name = "tauri-bundler"
-version = "2.0.1-rc.0"
+version = "2.0.0-rc.0"
dependencies = [
"anyhow",
"ar",
@@ -5109,7 +5109,7 @@ dependencies = [
"tar",
"tauri-icns",
"tauri-macos-sign",
- "tauri-utils 2.0.0-rc.1",
+ "tauri-utils 2.0.0-rc.0",
"tempfile",
"thiserror",
"time",
@@ -5123,7 +5123,7 @@ dependencies = [
[[package]]
name = "tauri-cli"
-version = "2.0.0-rc.2"
+version = "2.0.0-rc.1"
dependencies = [
"anyhow",
"axum",
@@ -5182,7 +5182,7 @@ dependencies = [
"tauri-icns",
"tauri-macos-sign",
"tauri-utils 1.5.4",
- "tauri-utils 2.0.0-rc.1",
+ "tauri-utils 2.0.0-rc.0",
"tokio",
"toml 0.8.10",
"toml_edit 0.22.6",
@@ -5262,7 +5262,7 @@ dependencies = [
[[package]]
name = "tauri-utils"
-version = "2.0.0-rc.1"
+version = "2.0.0-rc.0"
dependencies = [
"aes-gcm",
"ctor",
--- tooling/cli/Cargo.toml
@@ -3,7 +3,7 @@ members = [ "node" ]
[package]
name = "tauri-cli"
-version = "2.0.0-rc.2"
+version = "2.0.0-rc.1"
authors = [ "Tauri Programme within The Commons Conservancy" ]
edition = "2021"
rust-version = "1.70"
@@ -48,7 +48,7 @@ sublime_fuzzy = "0.7"
clap_complete = "4"
clap = { version = "4.5", features = [ "derive", "env" ] }
anyhow = "1.0"
-tauri-bundler = { version = "2.0.1-rc.0", default-features = false, path = "../bundler" }
+tauri-bundler = { version = "2.0.0-rc.0", default-features = false, path = "../bundler" }
colored = "2.1"
serde = { version = "1.0", features = [ "derive" ] }
serde_json = { version = "1.0", features = [ "preserve_order" ] }
@@ -58,7 +58,7 @@ shared_child = "1.0"
duct = "0.13"
toml_edit = { version = "0.22", features = [ "serde" ] }
json-patch = "1.2"
-tauri-utils = { version = "2.0.0-rc.1", path = "../../core/tauri-utils", features = [ "isolation", "schema", "config-json5", "config-toml" ] }
+tauri-utils = { version = "2.0.0-rc.0", path = "../../core/tauri-utils", features = [ "isolation", "schema", "config-json5", "config-toml" ] }
tauri-utils-v1 = { version = "1", package = "tauri-utils", features = [ "isolation", "schema", "config-json5", "config-toml" ] }
toml = "0.8"
jsonschema = "0.17"
--- tooling/cli/metadata-v2.json
@@ -1,9 +1,9 @@
{
"cli.js": {
- "version": "2.0.0-rc.2",
+ "version": "2.0.0-rc.1",
"node": ">= 10.0.0"
},
- "tauri": "2.0.0-rc.1",
- "tauri-build": "2.0.0-rc.1",
- "tauri-plugin": "2.0.0-rc.1"
+ "tauri": "2.0.0-rc.0",
+ "tauri-build": "2.0.0-rc.0",
+ "tauri-plugin": "2.0.0-rc.0"
}
--- tooling/cli/node/CHANGELOG.md
@@ -1,29 +1,5 @@
# Changelog
-## \[2.0.0-rc.2]
-
-### New Features
-
-- [`8dc81b6cc`](https://www.github.com/tauri-apps/tauri/commit/8dc81b6cc2b8235b11f74a971d6aa3a5df5e9f68) ([#10496](https://www.github.com/tauri-apps/tauri/pull/10496) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > template` configuration option for custom Xcode project YML Handlebars template using XcodeGen.
-- [`02c00abc6`](https://www.github.com/tauri-apps/tauri/commit/02c00abc63cf86e9bf9179cbb143d5145a9397b6) ([#10495](https://www.github.com/tauri-apps/tauri/pull/10495) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Added `bundle > ios > minimumSystemVersion` configuration option.
-
-### Enhancements
-
-- [`8e1e15304`](https://www.github.com/tauri-apps/tauri/commit/8e1e15304e9dc98d7f875fc8dceb7d4ce19adc47) ([#10483](https://www.github.com/tauri-apps/tauri/pull/10483) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Check if the Rust library contains the symbols required at runtime for Android and iOS apps.
-- [`ca6868956`](https://www.github.com/tauri-apps/tauri/commit/ca68689564cbc8dfa9a5220d3daf81a44ef81fcc) ([#10479](https://www.github.com/tauri-apps/tauri/pull/10479) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Check if identifier or lib name changed when running mobile commands.
-
-### Bug Fixes
-
-- [`2e8ab7bac`](https://www.github.com/tauri-apps/tauri/commit/2e8ab7bac12046d734fb07a1b4fe5e03004b305e) ([#10481](https://www.github.com/tauri-apps/tauri/pull/10481) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Migration from v1 to v2 now adds the updater plugin when it is active.
-
-### What's Changed
-
-- [`a3cd9779a`](https://www.github.com/tauri-apps/tauri/commit/a3cd9779a47428e306a628d658740669faf69ccd) ([#10480](https://www.github.com/tauri-apps/tauri/pull/10480) by [@lucasfernog](https://www.github.com/tauri-apps/tauri/../../lucasfernog)) Removed the `[android|ios] open` command. It is recommended to use `[android|ios] dev --open` or `[android|ios] build --open` instead.
-
-### Dependencies
-
-- Upgraded to `[email protected]`
-
## \[2.0.0-rc.1]
### Bug Fixes
--- tooling/cli/node/package.json
@@ -1,6 +1,6 @@
{
"name": "@tauri-apps/cli",
- "version": "2.0.0-rc.2",
+ "version": "2.0.0-rc.1",
"description": "Command line interface for building Tauri apps",
"funding": {
"type": "opencollective",
|
tauri
|
tauri-apps
|
Rust
|
Rust
| 90,101
| 2,752
|
Build smaller, faster, and more secure desktop and mobile applications with a web frontend.
|
tauri-apps_tauri
|
CONFIG_CHANGE
|
version updates
|
38fd3f293c93a0af99617840cf75034254c00845
|
2023-08-14 11:48:40
|
Jelly Lee
|
Create README.md
| false
| 6
| 0
| 6
|
--- model-compression/quantization/llm-qat/cfd70ff/README.md
@@ -1,6 +0,0 @@
-
-
-
-```
-bash run_train.sh 8 8 8
-```
|
llm-action
|
liguodongiot
|
HTML
|
HTML
| 15,588
| 1,812
|
This project aims to share the technical principles and hands-on experience of large language models (LLM engineering and real-world LLM application deployment)
|
liguodongiot_llm-action
|
DOC_CHANGE
|
Obvious
|
02a7730756e5f2100b9408a03c79cc974efe7696
|
2023-08-15 19:45:11
|
Jacob Lärfors
|
add pledge from verifa
| false
| 1
| 0
| 1
|
--- index.html
@@ -175,7 +175,6 @@
<li><a href="https://rivet.gg">Rivet</a></li>
<li><a href="https://terramate.io">Terramate</a></li>
<li><a href="https://terrateam.io">Terrateam</a></li>
- <li><a href="https://verifa.io">Verifa</a></li>
</ul>
<h2>Contact us</h2>
|
manifesto
|
opentofu
|
HTML
|
HTML
| 36,134
| 1,083
|
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source license.
|
opentofu_manifesto
|
NEW_FEAT
|
obvious
|
6f6523fa7506f400f32f504b341da850fa9a6353
|
2024-05-23 14:31:13
|
Haydn Vestal
|
Create CONTRIBUTING.md
| false
| 21
| 0
| 21
|
--- CONTRIBUTING.md
@@ -1,21 +0,0 @@
-## How to Contribute
-
-We appreciate your interest in Warp Factory! If you're new to the project, please [create an issue](https://github.com/NerdsWithAttitudes/WarpFactory/issues) before sending a pull request so that we can assess how your use-case fits in with our overall project goals and how best to support it.
-
-You can find instructions for creating a pull request in
-[GitHub Help](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/about-pull-requests).
-
-If you'd like to support the project in other ways, please consider:
- - Citing it in a paper (see the instructions in the README)
- - Applying for a [Warp Grant](https://appliedphysics.org/warp-grants/)
- - Starring the project
- - Tweeting about it
-
-## Forking the project
-
-Since Warp Factory is released under the MIT license, anyone is free to fork the project and release derivative works so long as Warp Factory is properly cited. If you choose this route, we're still super interested in your work!
-Send us an email at [[email protected]](mailto:[email protected]) so we can keep in touch.
-
-## Licensing
-
-By contributing code to Warp Factory, you represent that you own the copyright on your contributions, or that you have followed the licensing requirements of the copyright holder, and that Applied Physics may use your code without any further restrictions other than those specified in the MIT license. A copy of the license can be found in the `LICENSE` file in the root directory of the project.
|
warpfactory
|
nerdswithattitudes
|
MATLAB
|
MATLAB
| 298
| 41
|
WarpFactory is a numerical toolkit for analyzing warp drive spacetimes.
|
nerdswithattitudes_warpfactory
|
DOC_CHANGE
|
Obvious
|
91e7398e0603a8b9b028cd53ce3170c8ceef54f0
|
2025-03-25 19:59:47
|
Shay Drory
|
net/mlx5: Update pfnum retrieval for devlink port attributes Align mlx5 driver usage of 'pfnum' with the documentation clarification introduced in commit bb70b0d48d8e ("devlink: Improve the port attributes description"). Signed-off-by: Shay Drory <[email protected]> Reviewed-by: Mark Bloch <[email protected]> Signed-off-by: Tariq Toukan <[email protected]> Link: https://patch.msgid.link/[email protected] Signed-off-by: Jakub Kicinski <[email protected]>
| false
| 3
| 3
| 6
|
--- drivers/net/ethernet/mellanox/mlx5/core/esw/devlink_port.c
@@ -32,7 +32,7 @@ static void mlx5_esw_offloads_pf_vf_devlink_port_attrs_set(struct mlx5_eswitch *
u16 pfnum;
mlx5_esw_get_port_parent_id(dev, &ppid);
- pfnum = PCI_FUNC(dev->pdev->devfn);
+ pfnum = mlx5_get_dev_index(dev);
external = mlx5_core_is_ecpf_esw_manager(dev);
if (external)
controller_num = dev->priv.eswitch->offloads.host_number + 1;
@@ -110,7 +110,7 @@ static void mlx5_esw_offloads_sf_devlink_port_attrs_set(struct mlx5_eswitch *esw
struct netdev_phys_item_id ppid = {};
u16 pfnum;
- pfnum = PCI_FUNC(dev->pdev->devfn);
+ pfnum = mlx5_get_dev_index(dev);
mlx5_esw_get_port_parent_id(dev, &ppid);
memcpy(dl_port->attrs.switch_id.id, &ppid.id[0], ppid.id_len);
dl_port->attrs.switch_id.id_len = ppid.id_len;
--- drivers/net/ethernet/mellanox/mlx5/core/sf/devlink.c
@@ -285,7 +285,7 @@ mlx5_sf_new_check_attr(struct mlx5_core_dev *dev, const struct devlink_port_new_
NL_SET_ERR_MSG_MOD(extack, "External controller is unsupported");
return -EOPNOTSUPP;
}
- if (new_attr->pfnum != PCI_FUNC(dev->pdev->devfn)) {
+ if (new_attr->pfnum != mlx5_get_dev_index(dev)) {
NL_SET_ERR_MSG_MOD(extack, "Invalid pfnum supplied");
return -EOPNOTSUPP;
}
|
linux
|
torvalds
|
C
|
C
| 189,022
| 55,340
|
Linux kernel source tree
|
torvalds_linux
|
CODE_IMPROVEMENT
|
commit msg and code changes suggest refactoring is done
|
319aae27567f3ddc40e9083c425f7eaf567006a0
| null |
Linus Torvalds
|
Increase the number of possible heads requested from git-upload-pack Now that git-clone-pack exists, we actually have somebody requesting more than just a single head in a pack. So allow the Jeff's of this world to clone things with tens of heads.
| false
| 1
| 1
| 0
|
--- upload-pack.c
@@ -5,7 +5,7 @@
static const char upload_pack_usage[] = "git-upload-pack <dir>";
#define MAX_HAS (16)
-#define MAX_NEEDS (16)
+#define MAX_NEEDS (256)
static int nr_has = 0, nr_needs = 0;
static unsigned char has_sha1[MAX_HAS][20];
static unsigned char needs_sha1[MAX_NEEDS][20];
|
git_git.json
| null | null | null | null | null | null |
git_git.json
|
NEW_FEAT
|
4, increased heads is probably a new feature
|
a532b36e28e1b0abd0071a3a25fb356d08e5d115
|
2024-07-12 22:33:43
|
RustDesk
|
Update playground.yml
| false
| 1
| 1
| 2
|
--- .github/workflows/playground.yml
@@ -180,7 +180,7 @@ jobs:
- name: Build rustdesk
run: |
- ./build.py --flutter ${{ matrix.job.extra-build-args }}
+ ./build.py --flutter --hwcodec ${{ matrix.job.extra-build-args }}
- name: create unsigned dmg
run: |
|
rustdesk
|
rustdesk
|
Rust
|
Rust
| 83,345
| 11,693
|
An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer.
|
rustdesk_rustdesk
|
CONFIG_CHANGE
|
Obvious
|
9e53371ddaaeab4083fde45e43c803071238e686
|
2025-03-11 15:11:37
|
novahe
|
Fix test cases that may potentially cause a panic.
| false
| 34
| 0
| 34
|
--- pkg/client/tests/remotecommand_test.go
@@ -124,7 +124,6 @@ func fakeServer(t *testing.T, requestReceived chan struct{}, testName string, ex
opts, err := remotecommand.NewOptions(req)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
if exec {
cmd := req.URL.Query()[api.ExecCommandParam]
--- staging/src/k8s.io/apiserver/pkg/admission/audit_test.go
@@ -202,12 +202,10 @@ func TestWithAuditConcurrency(t *testing.T) {
mutator, ok := handler.(MutationInterface)
if !ok {
t.Error("handler is not an interface of type MutationInterface")
- return
}
auditMutator, ok := auditHandler.(MutationInterface)
if !ok {
t.Error("handler is not an interface of type MutationInterface")
- return
}
assert.Equal(t, mutator.Admit(ctx, a, nil), auditMutator.Admit(ctx, a, nil), "WithAudit decorator should not effect the return value")
}()
--- staging/src/k8s.io/apiserver/pkg/util/proxy/streamtunnel_test.go
@@ -59,13 +59,11 @@ func TestTunnelingHandler_UpgradeStreamingAndTunneling(t *testing.T) {
_, err := httpstream.Handshake(req, w, []string{constants.PortForwardV1Name})
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
upgrader := spdy.NewResponseUpgrader()
conn := upgrader.UpgradeResponse(w, req, justQueueStream(streamChan))
if conn == nil {
t.Error("connect is unexpected nil")
- return
}
defer conn.Close() //nolint:errcheck
<-stopServerChan
@@ -105,12 +103,10 @@ func TestTunnelingHandler_UpgradeStreamingAndTunneling(t *testing.T) {
clientStream, err := spdyClient.CreateStream(http.Header{})
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
_, err = io.Copy(clientStream, bytes.NewReader(randomData))
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
clientStream.Close() //nolint:errcheck
}()
@@ -183,7 +179,6 @@ func TestTunnelingHandler_BadHandshakeError(t *testing.T) {
_, err := httpstream.Handshake(req, w, []string{constants.PortForwardV1Name})
if err == nil {
t.Errorf("handshake should have returned an error %v", err)
- return
}
assert.ErrorContains(t, err, "unable to negotiate protocol")
w.WriteHeader(http.StatusForbidden)
@@ -240,7 +235,6 @@ func TestTunnelingHandler_UpstreamSPDYServerErrorPropagated(t *testing.T) {
_, err := httpstream.Handshake(req, w, []string{constants.PortForwardV1Name})
if err != nil {
t.Errorf("handshake should have succeeded %v", err)
- return
}
// Returned status code should be incremented in metrics.
w.WriteHeader(statusCode)
--- staging/src/k8s.io/client-go/discovery/cached/disk/cached_discovery_test.go
@@ -352,7 +352,6 @@ func TestCachedDiscoveryClientUnaggregatedServerGroups(t *testing.T) {
output, err := json.Marshal(body)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-type is "unaggregated" discovery format -- no resources returned.
w.Header().Set("Content-Type", discovery.AcceptV1)
--- staging/src/k8s.io/client-go/discovery/cached/memory/memcache_test.go
@@ -589,7 +589,6 @@ func TestMemCacheGroupsAndMaybeResources(t *testing.T) {
output, err := json.Marshal(body)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-type is "unaggregated" discovery format -- no resources returned.
w.Header().Set("Content-Type", discovery.AcceptV1)
@@ -1121,7 +1120,6 @@ func TestAggregatedMemCacheGroupsAndMaybeResources(t *testing.T) {
output, err := json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-type is "aggregated" discovery format.
w.Header().Set("Content-Type", discovery.AcceptV2)
@@ -1422,7 +1420,6 @@ func TestMemCacheAggregatedServerGroups(t *testing.T) {
output, err := json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-type is "aggregated" discovery format.
w.Header().Set("Content-Type", discovery.AcceptV2Beta1)
--- staging/src/k8s.io/client-go/discovery/discovery_client_test.go
@@ -1325,7 +1325,6 @@ func TestAggregatedServerGroups(t *testing.T) {
output, err = json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-Type is "aggregated" discovery format. Add extra parameter
// to ensure we are resilient to these extra parameters.
@@ -1334,7 +1333,6 @@ func TestAggregatedServerGroups(t *testing.T) {
_, err = w.Write(output)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
}))
defer server.Close()
@@ -2401,7 +2399,6 @@ func TestAggregatedServerGroupsAndResources(t *testing.T) {
output, err = json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
} else {
var agg *apidiscoveryv2beta1.APIGroupDiscoveryList
@@ -2417,7 +2414,6 @@ func TestAggregatedServerGroupsAndResources(t *testing.T) {
output, err = json.Marshal(&agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
}
// Content-Type is "aggregated" discovery format. Add extra parameter
@@ -2569,7 +2565,6 @@ func TestAggregatedServerGroupsAndResourcesWithErrors(t *testing.T) {
output, err = json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-Type is "aggregated" discovery format. Add extra parameter
// to ensure we are resilient to these extra parameters.
@@ -3187,7 +3182,6 @@ func TestAggregatedServerPreferredResources(t *testing.T) {
output, err = json.Marshal(agg)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
// Content-Type is "aggregated" discovery format. Add extra parameter
// to ensure we are resilient to these extra parameters.
--- staging/src/k8s.io/client-go/tools/portforward/tunneling_connection_test.go
@@ -53,18 +53,15 @@ func TestTunnelingConnection_ReadWriteClose(t *testing.T) {
conn, err := upgrader.Upgrade(w, req, nil)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
defer conn.Close() //nolint:errcheck
if conn.Subprotocol() != constants.WebsocketsSPDYTunnelingPortForwardV1 {
t.Errorf("Not acceptable agreement Subprotocol: %v", conn.Subprotocol())
- return
}
tunnelingConn := NewTunnelingConnection("server", conn)
spdyConn, err := spdy.NewServerConnection(tunnelingConn, justQueueStream(streamChan))
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
defer spdyConn.Close() //nolint:errcheck
<-stopServerChan
@@ -88,12 +85,10 @@ func TestTunnelingConnection_ReadWriteClose(t *testing.T) {
clientStream, err := spdyClient.CreateStream(http.Header{})
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
_, err = io.Copy(clientStream, strings.NewReader(expected))
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
clientStream.Close() //nolint:errcheck
}()
@@ -119,7 +114,6 @@ func TestTunnelingConnection_LocalRemoteAddress(t *testing.T) {
conn, err := upgrader.Upgrade(w, req, nil)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
defer conn.Close() //nolint:errcheck
if conn.Subprotocol() != constants.WebsocketsSPDYTunnelingPortForwardV1 {
@@ -156,12 +150,10 @@ func TestTunnelingConnection_ReadWriteDeadlines(t *testing.T) {
conn, err := upgrader.Upgrade(w, req, nil)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
defer conn.Close() //nolint:errcheck
if conn.Subprotocol() != constants.WebsocketsSPDYTunnelingPortForwardV1 {
t.Errorf("Not acceptable agreement Subprotocol: %v", conn.Subprotocol())
- return
}
<-stopServerChan
}))
--- staging/src/k8s.io/client-go/transport/websocket/roundtripper_test.go
@@ -115,18 +115,15 @@ func TestWebSocketRoundTripper_RoundTripperFails(t *testing.T) {
statusBytes, err := runtime.Encode(encoder, testCase.status)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
_, err = w.Write(statusBytes)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
} else if len(testCase.body) > 0 {
_, err := w.Write([]byte(testCase.body))
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
}
}))
--- staging/src/k8s.io/kube-aggregator/pkg/apiserver/handler_discovery_test.go
@@ -395,19 +395,16 @@ func TestV2Beta1Skew(t *testing.T) {
err := apidiscoveryv2scheme.Convertv2APIGroupDiscoveryListTov2beta1APIGroupDiscoveryList(&apiGroup, &v2b, nil)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
converted, err := json.Marshal(v2b)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
w.Header().Set("Content-Type", "application/json;"+"g=apidiscovery.k8s.io;v=v2beta1;as=APIGroupDiscoveryList")
w.WriteHeader(200)
_, err = w.Write(converted)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
}))
testCtx, cancel := context.WithCancel(context.Background())
--- staging/src/k8s.io/kubelet/pkg/cri/streaming/server_test.go
@@ -350,7 +350,6 @@ func runRemoteCommandTest(t *testing.T, commandType string) {
exec, err := remotecommand.NewSPDYExecutor(&restclient.Config{}, "POST", reqURL)
if err != nil {
t.Errorf("unexpected error %v", err)
- return
}
opts := remotecommand.StreamOptions{
Repository Name: kubernetes | Owner: kubernetes | Primary Language: Go | Language: Go | Stars: 113,460 | Forks: 40,344
Description: Production-Grade Container Scheduling and Management
Repository: kubernetes_kubernetes | Type: BUG_FIX | Comment: this commit fixes/polishes an earlier feature
Hash: 0c548516bc951cbbd3f153a8e5c7b6a2108962c0
Date: null | Author: Adrian Holovaty
Message: Fixed #1185 -- Fixed Python 2.4 bug in Django authentication mod_python handler. Thanks, Brian Ray git-svn-id: http://code.djangoproject.com/svn/django/trunk@1853 bcc190cf-cafb-0310-a4f2-bffc1f526a37
IsMerge: false | Additions: 2 | Deletions: 1 | Total Changes: 1
--- AUTHORS
@@ -81,6 +81,7 @@ answer newbie questions, and generally made Django that much better:
phaedo <http://phaedo.cx/>
Luke Plant <http://lukeplant.me.uk/>
plisk
+ Brian Ray <http://brianray.chipy.org/>
Oliver Rutherfurd <http://rutherfurd.net/>
David Schein
sopel
--- modpython.py
@@ -13,7 +13,7 @@ def authenhandler(req, **kwargs):
from django.models.auth import users
# check for PythonOptions
- _str_to_bool = lambda s: s.lower() in '1', 'true', 'on', 'yes'
+ _str_to_bool = lambda s: s.lower() in ('1', 'true', 'on', 'yes')
options = req.get_options()
permission_name = options.get('DjangoPermissionName', None)
Repository Name: django_django.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null | Description: null
Repository: django_django.json | Type: BUG_FIX | Comment: 5, fix written in commits msg
Hash: 2d8dd49589ce6953dc9866192f2ca0dc53957ace
Date: null | Author: Aidan Labourdette
Message: chore: fix formatting (#3560) Fixed the markdown formatting for the openWith? parameter options for the open function in shell module. see: https://github.com/tauri-apps/tauri-docs/pull/476
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 0
--- shell.ts
@@ -394,7 +394,7 @@ type CommandEvent =
* or the one specified with `openWith`.
*
* The `openWith` value must be one of `firefox`, `google chrome`, `chromium` `safari`,
- * `open`, `start`, `xdg-open`, `gio`, gnome-open`, `kde-open` or `wslview`.
+ * `open`, `start`, `xdg-open`, `gio`, `gnome-open`, `kde-open` or `wslview`.
*
* @example
* ```typescript
Repository Name: tauri-apps_tauri.json | Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null | Description: null
Repository: tauri-apps_tauri.json | Type: CONFIG_CHANGE | Comment: 5, formatting
Hash: 294ae7b0b3d258725135791657087443c7491d91
Date: 2024-03-10 12:20:22 | Author: Zezhong Li
Message: Update README.md
IsMerge: false | Additions: 2 | Deletions: 0 | Total Changes: 2
--- README.md
@@ -168,8 +168,6 @@ NOTE: the ranking has no particular order.
|  | *TKDD '23* | Modeling Regime Shifts in Multiple Time Series | None |
|  | *World Wide Web '23* | Anomaly and change point detection for time series with concept drift | None |
|  | *EAAI '23* | PrecTime A deep learning architecture for precise time series segmentation in industrial manufacturing operations | None |
-|  | *JASA'22* | Factor Models for High-Dimensional Tensor Time Series | None |
-|  | *JSS'22* | Analysis of Tensor Time Series: tensorTS | [tensorTS](https://github.com/ZeBang/tensorTS) |
|  | *IMWUT '22* | ColloSSL Collaborative Self-Supervised Learning for Human Activity Recognition 🌟 | [collossl](https://github.com/akhilmathurs/collossl) |
|  | *MSSP '22* | A multivariate time series segmentation algorithm for analyzing the operating statuses of tunnel boring machines | None |
|  | *Technometrics '22* | Bayesian Hierarchical Model for Change Point Detection in Multivariate Sequences | [Supplementary Materials](https://doi.org/10.1080/00401706.2021.1927848) |
Repository Name: awesome-time-series-segmentation-papers | Owner: lzz19980125 | Primary Language: MATLAB | Language: MATLAB | Stars: 454 | Forks: 8
Description: This repository contains a reading list of papers on Time Series Segmentation. This repository is still being continuously improved.
Repository: lzz19980125_awesome-time-series-segmentation-papers | Type: CODE_IMPROVEMENT | Comment: Code change: type annotation added
Hash: b3ef56a322cd153b4bb9395e0a54a250b0826d18
Date: 2024-04-18 23:58:44 | Author: Christopher Helmerich
Message: Added 'Open in MATLAB Online'
IsMerge: false | Additions: 1 | Deletions: 0 | Total Changes: 1
--- README.md
@@ -1,7 +1,6 @@
# WarpFactory
[](https://opensource.org/licenses/MIT)
-[](https://matlab.mathworks.com/open/github/v1?repo=NerdsWithAttitudes/WarpFactory&file=README.md)
WarpFactory is a powerful numerical toolkit written in MATLAB for analyzing warp drive spacetimes using Einstein's theory of General Relativity. Its unique focus lies in providing a numerical framework to analyze the physicality of spacetime.
Repository Name: warpfactory | Owner: nerdswithattitudes | Primary Language: MATLAB | Language: MATLAB | Stars: 298 | Forks: 41
Description: WarpFactory is a numerical toolkit for analyzing warp drive spacetimes.
Repository: nerdswithattitudes_warpfactory | Type: DOC_CHANGE | Comment: changes in readme
Hash: 237be79680d15f091df888acd1d8986a052867a4
Date: 2023-02-08 17:37:30 | Author: 2dust
Message: up 1.7.38
IsMerge: false | Additions: 2 | Deletions: 2 | Total Changes: 4
--- V2rayNG/app/build.gradle
@@ -18,8 +18,8 @@ android {
minSdkVersion 21
targetSdkVersion Integer.parseInt("$targetSdkVer")
multiDexEnabled true
- versionCode 502
- versionName "1.7.38"
+ versionCode 501
+ versionName "1.7.37"
}
if (props["sign"]) {
Repository Name: v2rayng | Owner: 2dust | Primary Language: Kotlin | Language: Kotlin | Stars: 38,863 | Forks: 5,828
Description: A V2Ray client for Android, support Xray core and v2fly core
Repository: 2dust_v2rayng | Type: BUG_FIX | Comment: correcting display behavior under Wayland
Hash: 8659ad904966dfe809925c980ac11e7f98ac61aa
Date: 2025-02-13 16:13:08 | Author: Filippo Valsorda
Message: crypto/internal/fips140test: require FIPS 140 mode for the ACVP wrapper Change-Id: I6a6a46565c14cf1d924a8fcfbf6752e9646ec63d Reviewed-on: https://go-review.googlesource.com/c/go/+/648818 Reviewed-by: Daniel McCarney <[email protected]> LUCI-TryBot-Result: Go LUCI <[email protected]> Reviewed-by: Roland Shoemaker <[email protected]> Auto-Submit: Filippo Valsorda <[email protected]> Reviewed-by: Ian Lance Taylor <[email protected]>
IsMerge: false | Additions: 5 | Deletions: 0 | Total Changes: 5
--- src/crypto/internal/fips140test/acvp_test.go
@@ -75,10 +75,6 @@ func TestMain(m *testing.M) {
}
func wrapperMain() {
- if !fips140.Enabled {
- fmt.Fprintln(os.Stderr, "ACVP wrapper must be run with GODEBUG=fips140=on")
- os.Exit(2)
- }
if err := processingLoop(bufio.NewReader(os.Stdin), os.Stdout); err != nil {
fmt.Fprintf(os.Stderr, "processing error: %v\n", err)
os.Exit(1)
@@ -2133,7 +2129,6 @@ func TestACVP(t *testing.T) {
cmd = testenv.Command(t, goTool, args...)
cmd.Dir = dataDir
cmd.Env = append(os.Environ(), "ACVP_WRAPPER=1")
- cmd.Env = append(os.Environ(), "GODEBUG=fips140=on")
output, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("failed to run acvp tests: %s\n%s", err, string(output))
Repository Name: go | Owner: golang | Primary Language: Go | Language: Go | Stars: 126,191 | Forks: 17,926
Description: The Go programming language
Repository: golang_go | Type: BUG_FIX | Comment: probably a bug fix
Hash: d27890d98ddedfd86f23c9aef12ab422b413f1a2
Date: 2023-10-23 13:36:33 | Author: krahets
Message: Fix automating build workflow for Python
IsMerge: false | Additions: 15 | Deletions: 13 | Total Changes: 28
--- .github/workflows/python.yml
@@ -32,6 +32,7 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install black
+ if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Lint with black
run: |
black codes/python
--- codes/python/build.py
@@ -1,20 +1,17 @@
import glob
import py_compile as pyc
-if __name__ == "__main__":
- # find source code files
- src_paths = sorted(glob.glob("codes/python/**/*.py"))
- num_src = len(src_paths)
- num_src_error = 0
+src_paths = sorted(glob.glob("codes/python/**/*.py"))
+num_src = len(src_paths)
+num_src_error = 0
- # compile python code
- for src_path in src_paths:
- try:
- pyc.compile(src_path, doraise=True)
- except pyc.PyCompileError as e:
- num_src_error += 1
- print(e)
+for src_path in src_paths:
+ try:
+ pyc.compile(src_path, doraise=True)
+ except pyc.PyCompileError as e:
+ num_src_error += 1
+ print(e)
- print(f"===== Build Complete =====")
- print(f"Total: {num_src}")
- print(f"Error: {num_src_error}")
+print(f"===== Build Summary =====")
+print(f"Total: {num_src}")
+print(f"Error: {num_src - num_src_error}")
Repository Name: hello-algo | Owner: krahets | Primary Language: Java | Language: Java | Stars: 109,696 | Forks: 13,651
Description: "Hello Algo": a data structures and algorithms tutorial with animated illustrations and one-click runnable code. Supports Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, and Dart. Simplified and Traditional Chinese editions are updated in sync; English version ongoing.
Repository: krahets_hello-algo | Type: CODE_IMPROVEMENT | Comment: Non-functional code changes to improve readability, migration etc.
Hash: 4d122658cff4a0d6366be320698070fe6d5333e3
Date: 2025-03-20 14:42:06 | Author: A. Unique TensorFlower
Message: Update GraphDef version to 2172. PiperOrigin-RevId: 738716643
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
--- tensorflow/core/public/version.h
@@ -93,7 +93,7 @@ limitations under the License.
#define TF_GRAPH_DEF_VERSION_MIN_PRODUCER 0
#define TF_GRAPH_DEF_VERSION_MIN_CONSUMER 0
-#define TF_GRAPH_DEF_VERSION 2172 // Updated: 2025/3/20
+#define TF_GRAPH_DEF_VERSION 2171 // Updated: 2025/3/19
// Checkpoint compatibility versions (the versions field in SavedSliceMeta).
//
Repository Name: tensorflow | Owner: tensorflow | Primary Language: C++ | Language: C++ | Stars: 188,388 | Forks: 74,565
Description: An Open Source Machine Learning Framework for Everyone
Repository: nan_tensorflow | Type: CONFIG_CHANGE | Comment: version updates are done
Hash: a96a6dcab726589baac8dc090caa079d291c1a50
Date: 2022-03-29 09:26:00 | Author: Kyle Kizirian
Message: fixes minor typos (#655)
IsMerge: false | Additions: 3 | Deletions: 3 | Total Changes: 6
--- distributed_systems/README.md
@@ -124,7 +124,7 @@ Distributed Caching Protocols for Relieving Hot Spots on the World Wide Web](con
* :scroll: [Implementing the Omega failure detector in the crash-recovery failure model](implementing-the-omega-failure-detector-in-crash-recovery-failure-model.pdf)
-* :scroll: [Impossibility of Distributed Consensus with One Faulty Process](impossibility-of-consensus-with-one-faulty-process.pdf)
+* :scroll: [Impossibility of Distributed Consensuswith One Faulty Process](impossibility-of-consensus-with-one-faulty-process.pdf)
* :scroll: [In Search of an Understandable Consensus Algorithm](in-search-of-an-understandable-consensus-algorithm.pdf)
@@ -156,7 +156,7 @@ Distributed Caching Protocols for Relieving Hot Spots on the World Wide Web](con
* :scroll: [Signal/Collect: Graph Algorithms for the (Semantic) Web](signal-%26-collect-graph-algorithms-for-the-\(semantic\)-web.pdf)
-* :scroll: [Solution of a Problem in
+* :scroll: [Slution of a Problem in
Concurrent Programming Control](solution-of-a-problem-in-concurrent-programming-control.pdf)
* :scroll: [Sparse Partitions](sparse-partitions.pdf)
@@ -165,7 +165,7 @@ Concurrent Programming Control](solution-of-a-problem-in-concurrent-programming-
* :scroll: [The Akamai Network: A Platform for High-Performance Internet Applications](the-akamai-network.pdf)
-* :scroll: [The Dining Cryptographers Problem:
+* :scroll: [The Dining CryptographersProblem:
Unconditional Sender and Recipient Untraceability](the-dining-cryptographers-problem.pdf)
* :scroll: [Tor: The Second-Generation Onion Router](tor-the-second-generation-onion-router.pdf)
Repository Name: papers-we-love | Owner: papers-we-love | Primary Language: Shell | Language: Shell | Stars: 91,347 | Forks: 5,859
Description: Papers from the computer science community to read and discuss.
Repository: papers-we-love_papers-we-love | Type: DOC_CHANGE | Comment: Obvious
Hash: 8a35f7baaea7aeee2a94c95bd137633a3dd1486d
Date: 2024-02-15 05:26:58 | Author: Suoqin Jin
Message: Update README.md
IsMerge: false | Additions: 0 | Deletions: 4 | Total Changes: 4
--- README.md
@@ -65,6 +65,10 @@ Please check the tutorial directory of the repo.
- [Interface with other single-cell analysis toolkits (e.g., Seurat, SingleCellExperiment, Scanpy)](https://htmlpreview.github.io/?https://github.com/jinworks/CellChat/blob/master/tutorial/Interface_with_other_single-cell_analysis_toolkits.html)
- [Tutorial for updating ligand-receptor database CellChatDB](https://htmlpreview.github.io/?https://github.com/jinworks/CellChat/blob/master/tutorial/Update-CellChatDB.html)
+<p align="center">
+ <img width="700" src="https://github.com/jinworks/CellChat/blob/main/overview_CellChat.png">
+</p>
+
## Web-based “CellChat Explorer”
Repository Name: cellchat | Owner: jinworks | Primary Language: R | Language: R | Stars: 367 | Forks: 61
Description: R toolkit for inference, visualization and analysis of cell-cell communication from single-cell and spatially resolved transcriptomics
Repository: jinworks_cellchat | Type: DOC_CHANGE | Comment: changes in readme
Hash: 68e6d5ad7e9af8929a22a889b1182706abbfcb50
Date: 2023-10-15 22:41:05 | Author: Christian Clauss
Message: validate_solutions.py: os.getenv('GITHUB_TOKEN', '') (#10546) * validate_solutions.py: os.getenv('GITHUB_TOKEN', '') @tianyizheng02 * updating DIRECTORY.md * f this --------- Co-authored-by: github-actions <${GITHUB_ACTOR}@users.noreply.github.com>
IsMerge: false | Additions: 4 | Deletions: 2 | Total Changes: 6
--- DIRECTORY.md
@@ -373,7 +373,6 @@
* [Electric Conductivity](electronics/electric_conductivity.py)
* [Electric Power](electronics/electric_power.py)
* [Electrical Impedance](electronics/electrical_impedance.py)
- * [Ic 555 Timer](electronics/ic_555_timer.py)
* [Ind Reactance](electronics/ind_reactance.py)
* [Ohms Law](electronics/ohms_law.py)
* [Real And Reactive Power](electronics/real_and_reactive_power.py)
@@ -623,7 +622,6 @@
* [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py)
* [Is Square Free](maths/is_square_free.py)
* [Jaccard Similarity](maths/jaccard_similarity.py)
- * [Joint Probability Distribution](maths/joint_probability_distribution.py)
* [Juggler Sequence](maths/juggler_sequence.py)
* [Karatsuba](maths/karatsuba.py)
* [Krishnamurthy Number](maths/krishnamurthy_number.py)
@@ -677,8 +675,8 @@
* [Radians](maths/radians.py)
* [Radix2 Fft](maths/radix2_fft.py)
* [Remove Digit](maths/remove_digit.py)
+ * [Rkf45](maths/rkf45.py)
* [Runge Kutta](maths/runge_kutta.py)
- * [Runge Kutta Fehlberg 45](maths/runge_kutta_fehlberg_45.py)
* [Segmented Sieve](maths/segmented_sieve.py)
* Series
* [Arithmetic](maths/series/arithmetic.py)
--- scripts/validate_solutions.py
@@ -55,7 +55,7 @@ def added_solution_file_path() -> list[pathlib.Path]:
solution_file_paths = []
headers = {
"Accept": "application/vnd.github.v3+json",
- "Authorization": f"token {os.getenv('GITHUB_TOKEN', '')}",
+ "Authorization": "token " + os.environ["GITHUB_TOKEN"],
}
files = requests.get(get_files_url(), headers=headers).json()
for file in files:
Repository Name: python | Owner: thealgorithms | Primary Language: Python | Language: Python | Stars: 197,891 | Forks: 46,346
Description: All Algorithms implemented in Python
Repository: thealgorithms_python | Type: DOC_CHANGE | Comment: binary changes in .py files inside docs folder