| Hash | Date | Author | commit_message | IsMerge | Additions | Deletions | Total Changes | git_diff | Repository Name | Owner | Primary Language | Language | Stars | Forks | Description | Repository | type | Comment |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
6eeff3c57e02f07c8494433aed8bf90532010089
|
2022-03-14 01:51:52
|
Jon Shier
|
Updates for Xcode 13.3 (#3576) * Update project files for Xcode 13.3.
* Update Package to 5.6.
* Run a few jobs using Xcode 13.3.
| false
| 74
| 33
| 107
|
--- .github/workflows/ci.yml
@@ -90,7 +90,7 @@ jobs:
run:
shell: "/usr/bin/arch -arch arm64e /bin/zsh {0}"
env:
- DEVELOPER_DIR: "/Applications/Xcode_13.3.app/Contents/Developer"
+ DEVELOPER_DIR: "/Applications/Xcode_13.2.1.app/Contents/Developer"
timeout-minutes: 10
strategy:
fail-fast: false
@@ -118,7 +118,7 @@ jobs:
run:
shell: "/usr/bin/arch -arch arm64e /bin/zsh {0}"
env:
- DEVELOPER_DIR: /Applications/Xcode_13.3.app/Contents/Developer
+ DEVELOPER_DIR: /Applications/Xcode_13.2.1.app/Contents/Developer
timeout-minutes: 10
strategy:
fail-fast: false
@@ -146,7 +146,7 @@ jobs:
run:
shell: "/usr/bin/arch -arch arm64e /bin/zsh {0}"
env:
- DEVELOPER_DIR: /Applications/Xcode_13.3.app/Contents/Developer
+ DEVELOPER_DIR: /Applications/Xcode_13.2.1.app/Contents/Developer
timeout-minutes: 10
strategy:
fail-fast: false
@@ -194,17 +194,14 @@ jobs:
run: set -o pipefail && env NSUnbufferedIO=YES xcodebuild -project "Alamofire.xcodeproj" -scheme "${{ matrix.scheme }}" -destination "${{ matrix.destination }}" -testPlan "${{ matrix.testPlan }}" clean test | xcpretty
SPM:
name: Test with SPM
- runs-on: firebreak
- defaults:
- run:
- shell: "/usr/bin/arch -arch arm64e /bin/zsh {0}"
+ runs-on: macOS-11
env:
- DEVELOPER_DIR: "/Applications/Xcode_13.3.app/Contents/Developer"
+ DEVELOPER_DIR: "/Applications/Xcode_13.2.1.app/Contents/Developer"
timeout-minutes: 10
steps:
- uses: actions/checkout@v2
- name: Install Firewalk
- run: brew install alamofire/alamofire/firewalk || brew upgrade alamofire/alamofire/firewalk && firewalk &
+ run: brew install alamofire/alamofire/firewalk && firewalk &
- name: Test SPM
run: swift test -c debug
SPM_Older:
@@ -217,9 +214,6 @@ jobs:
fail-fast: false
matrix:
include:
- - xcode: "Xcode_13.2.1.app"
- runsOn: macOS-11
- name: "macOS 11, SPM 5.5 Build"
- xcode: "Xcode_12.5.1.app"
runsOn: macOS-11
name: "macOS 11, SPM 5.4 Build"
--- Alamofire.xcodeproj/xcshareddata/xcschemes/Alamofire iOS.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.7">
<BuildAction
parallelizeBuildables = "YES"
--- Alamofire.xcodeproj/xcshareddata/xcschemes/Alamofire macOS.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.7">
<BuildAction
parallelizeBuildables = "YES"
--- Alamofire.xcodeproj/xcshareddata/xcschemes/Alamofire tvOS.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.7">
<BuildAction
parallelizeBuildables = "YES"
--- Alamofire.xcodeproj/xcshareddata/xcschemes/Alamofire watchOS.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.7">
<BuildAction
parallelizeBuildables = "YES"
--- Example/iOS Example.xcodeproj/project.pbxproj
@@ -216,7 +216,7 @@
isa = PBXProject;
attributes = {
LastSwiftUpdateCheck = 0720;
- LastUpgradeCheck = 1330;
+ LastUpgradeCheck = 1320;
ORGANIZATIONNAME = Alamofire;
TargetAttributes = {
F8111E0419A951050040E7D1 = {
--- Example/iOS Example.xcodeproj/xcshareddata/xcschemes/iOS Example.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.3">
<BuildAction
parallelizeBuildables = "YES"
--- Package.swift
@@ -1,4 +1,4 @@
-// swift-tools-version:5.6
+// swift-tools-version:5.5
//
// Package.swift
//
--- [email protected]
@@ -1,48 +0,0 @@
-// swift-tools-version:5.5
-//
-// Package.swift
-//
-// Copyright (c) 2014-2020 Alamofire Software Foundation (http://alamofire.org/)
-//
-// Permission is hereby granted, free of charge, to any person obtaining a copy
-// of this software and associated documentation files (the "Software"), to deal
-// in the Software without restriction, including without limitation the rights
-// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-// copies of the Software, and to permit persons to whom the Software is
-// furnished to do so, subject to the following conditions:
-//
-// The above copyright notice and this permission notice shall be included in
-// all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-// THE SOFTWARE.
-//
-
-import PackageDescription
-
-let package = Package(name: "Alamofire",
- platforms: [.macOS(.v10_12),
- .iOS(.v10),
- .tvOS(.v10),
- .watchOS(.v3)],
- products: [.library(name: "Alamofire",
- targets: ["Alamofire"])],
- targets: [.target(name: "Alamofire",
- path: "Source",
- exclude: ["Info.plist"],
- linkerSettings: [.linkedFramework("CFNetwork",
- .when(platforms: [.iOS,
- .macOS,
- .tvOS,
- .watchOS]))]),
- .testTarget(name: "AlamofireTests",
- dependencies: ["Alamofire"],
- path: "Tests",
- exclude: ["Info.plist", "Test Plans"],
- resources: [.process("Resources")])],
- swiftLanguageVersions: [.v5])
--- watchOS Example/watchOS Example.xcodeproj/xcshareddata/xcschemes/watchOS Example WatchKit App.xcscheme
@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
- LastUpgradeVersion = "1330"
+ LastUpgradeVersion = "1320"
version = "1.3">
<BuildAction
parallelizeBuildables = "YES"
@@ -56,8 +56,10 @@
debugServiceExtension = "internal"
enableGPUValidationMode = "1"
allowLocationSimulation = "YES">
- <BuildableProductRunnable
- runnableDebuggingMode = "0">
+ <RemoteRunnable
+ runnableDebuggingMode = "2"
+ BundleIdentifier = "com.apple.Carousel"
+ RemotePath = "/watchOS Example WatchKit App">
<BuildableReference
BuildableIdentifier = "primary"
BlueprintIdentifier = "318E330D2419AD1C00BDE48F"
@@ -65,7 +67,7 @@
BlueprintName = "watchOS Example WatchKit App"
ReferencedContainer = "container:watchOS Example.xcodeproj">
</BuildableReference>
- </BuildableProductRunnable>
+ </RemoteRunnable>
<AdditionalOptions>
<AdditionalOption
key = "NSZombieEnabled"
@@ -80,8 +82,10 @@
savedToolIdentifier = ""
useCustomWorkingDirectory = "NO"
debugDocumentVersioning = "YES">
- <BuildableProductRunnable
- runnableDebuggingMode = "0">
+ <RemoteRunnable
+ runnableDebuggingMode = "2"
+ BundleIdentifier = "com.apple.Carousel"
+ RemotePath = "/watchOS Example WatchKit App">
<BuildableReference
BuildableIdentifier = "primary"
BlueprintIdentifier = "318E330D2419AD1C00BDE48F"
@@ -89,7 +93,16 @@
BlueprintName = "watchOS Example WatchKit App"
ReferencedContainer = "container:watchOS Example.xcodeproj">
</BuildableReference>
- </BuildableProductRunnable>
+ </RemoteRunnable>
+ <MacroExpansion>
+ <BuildableReference
+ BuildableIdentifier = "primary"
+ BlueprintIdentifier = "318E330D2419AD1C00BDE48F"
+ BuildableName = "watchOS Example WatchKit App.app"
+ BlueprintName = "watchOS Example WatchKit App"
+ ReferencedContainer = "container:watchOS Example.xcodeproj">
+ </BuildableReference>
+ </MacroExpansion>
</ProfileAction>
<AnalyzeAction
buildConfiguration = "Debug">
|
alamofire
|
alamofire
|
Swift
|
Swift
| 41,720
| 7,598
|
Elegant HTTP Networking in Swift
|
alamofire_alamofire
|
CODE_IMPROVEMENT
|
Controls the visibility of the network activity indicator on iOS using Alamofire
|
1006fa9c42dbbcbda2eac2382ed892f94635c68e
|
2025-02-12 03:25:32
|
Mikael Hermansson
|
Skip `Object::to_string` when Jolt Physics is on separate thread
| false
| 10
| 8
| 18
|
--- doc/classes/ProjectSettings.xml
@@ -2330,7 +2330,6 @@
</member>
<member name="physics/3d/run_on_separate_thread" type="bool" setter="" getter="" default="false">
If [code]true[/code], the 3D physics server runs on a separate thread, making better use of multi-core CPUs. If [code]false[/code], the 3D physics server runs on the main thread. Running the physics server on a separate thread can increase performance, but restricts API access to only physics process.
- [b]Note:[/b] When [member physics/3d/physics_engine] is set to [code]Jolt Physics[/code], enabling this setting will prevent the 3D physics server from being able to provide any context when reporting errors and warnings, and will instead always refer to nodes as [code]<unknown>[/code].
</member>
<member name="physics/3d/sleep_threshold_angular" type="float" setter="" getter="" default="0.139626">
Threshold angular velocity under which a 3D physics body will be considered inactive. See [constant PhysicsServer3D.SPACE_PARAM_BODY_ANGULAR_VELOCITY_SLEEP_THRESHOLD].
--- modules/jolt_physics/jolt_physics_server_3d.h
@@ -421,7 +421,6 @@ public:
virtual int get_process_info(PhysicsServer3D::ProcessInfo p_process_info) override;
- bool is_on_separate_thread() const { return on_separate_thread; }
bool is_active() const { return active; }
void free_space(JoltSpace3D *p_space);
--- modules/jolt_physics/objects/jolt_object_3d.cpp
@@ -30,7 +30,6 @@
#include "jolt_object_3d.h"
-#include "../jolt_physics_server_3d.h"
#include "../jolt_project_settings.h"
#include "../spaces/jolt_layers.h"
#include "../spaces/jolt_space_3d.h"
@@ -138,12 +137,6 @@ bool JoltObject3D::can_interact_with(const JoltObject3D &p_other) const {
}
String JoltObject3D::to_string() const {
- static const String fallback_name = "<unknown>";
-
- if (JoltPhysicsServer3D::get_singleton()->is_on_separate_thread()) {
- return fallback_name; // Calling `Object::to_string` is not thread-safe.
- }
-
Object *instance = get_instance();
- return instance != nullptr ? instance->to_string() : fallback_name;
+ return instance != nullptr ? instance->to_string() : "<unknown>";
}
--- modules/jolt_physics/objects/jolt_soft_body_3d.cpp
@@ -727,3 +727,8 @@ bool JoltSoftBody3D::is_vertex_pinned(int p_index) const {
return pinned_vertices.has(physics_index);
}
+
+String JoltSoftBody3D::to_string() const {
+ Object *instance = get_instance();
+ return instance != nullptr ? instance->to_string() : "<unknown>";
+}
--- modules/jolt_physics/objects/jolt_soft_body_3d.h
@@ -167,6 +167,8 @@ public:
void unpin_all_vertices();
bool is_vertex_pinned(int p_index) const;
+
+ String to_string() const;
};
#endif // JOLT_SOFT_BODY_3D_H
|
godot
|
godotengine
|
C++
|
C++
| 94,776
| 21,828
|
Godot Engine – Multi-platform 2D and 3D game engine
|
godotengine_godot
|
BUG_FIX
|
correcting display behavior under Wayland
|
83145a674444c15e61f1b4fb7043e8cdc4a62ea3
|
2025-03-22 08:44:40
|
dianne
|
match lowering cleanup: `non_scalar_compare` is only for `&str` Since array and slice constants are now lowered to array and slice patterns, `non_scalar_compare` was only called for string comparisons. This specializes it to strings, renames it, and removes the unused array-unsizing logic. This also updates some outdated doc comments.
| false
| 24
| 94
| 118
|
--- compiler/rustc_middle/src/thir.rs
@@ -800,9 +800,9 @@ pub enum PatKind<'tcx> {
},
/// One of the following:
- /// * `&str` (represented as a valtree), which will be handled as a string pattern and thus
- /// exhaustiveness checking will detect if you use the same string twice in different
- /// patterns.
+ /// * `&str`/`&[u8]` (represented as a valtree), which will be handled as a string/slice pattern
+ /// and thus exhaustiveness checking will detect if you use the same string/slice twice in
+ /// different patterns.
/// * integer, bool, char or float (represented as a valtree), which will be handled by
/// exhaustiveness to cover exactly its own value, similar to `&str`, but these values are
/// much simpler.
--- compiler/rustc_mir_build/src/builder/matches/mod.rs
@@ -1326,8 +1326,8 @@ enum TestKind<'tcx> {
Eq {
value: Const<'tcx>,
// Integer types are handled by `SwitchInt`, and constants with ADT
- // types and `&[T]` types are converted back into patterns, so this can
- // only be `&str`, `f32` or `f64`.
+ // types are converted back into patterns, so this can only be `&str`,
+ // `&[T]`, `f32` or `f64`.
ty: Ty<'tcx>,
},
--- compiler/rustc_mir_build/src/builder/matches/test.rs
@@ -11,6 +11,7 @@
use rustc_data_structures::fx::FxIndexMap;
use rustc_hir::{LangItem, RangeEnd};
use rustc_middle::mir::*;
+use rustc_middle::ty::adjustment::PointerCoercion;
use rustc_middle::ty::util::IntTypeExt;
use rustc_middle::ty::{self, GenericArg, Ty, TyCtxt};
use rustc_middle::{bug, span_bug};
@@ -177,30 +178,21 @@ pub(super) fn perform_test(
_ => {}
}
- assert_eq!(expect_ty, ty);
if !ty.is_scalar() {
// Use `PartialEq::eq` instead of `BinOp::Eq`
// (the binop can only handle primitives)
- // Make sure that we do *not* call any user-defined code here.
- // The only type that can end up here is string literals, which have their
- // comparison defined in `core`.
- // (Interestingly this means that exhaustiveness analysis relies, for soundness,
- // on the `PartialEq` impl for `str` to b correct!)
- match *ty.kind() {
- ty::Ref(_, deref_ty, _) if deref_ty == self.tcx.types.str_ => {}
- _ => {
- span_bug!(source_info.span, "invalid type for non-scalar compare: {ty}")
- }
- };
- self.string_compare(
+ self.non_scalar_compare(
block,
success_block,
fail_block,
source_info,
expect,
+ expect_ty,
Operand::Copy(place),
+ ty,
);
} else {
+ assert_eq!(expect_ty, ty);
self.compare(
block,
success_block,
@@ -378,19 +370,97 @@ fn compare(
);
}
- /// Compare two values of type `&str` using `<str as std::cmp::PartialEq>::eq`.
- fn string_compare(
+ /// Compare two values using `<T as std::compare::PartialEq>::eq`.
+ /// If the values are already references, just call it directly, otherwise
+ /// take a reference to the values first and then call it.
+ fn non_scalar_compare(
&mut self,
block: BasicBlock,
success_block: BasicBlock,
fail_block: BasicBlock,
source_info: SourceInfo,
- expect: Operand<'tcx>,
- val: Operand<'tcx>,
+ mut expect: Operand<'tcx>,
+ expect_ty: Ty<'tcx>,
+ mut val: Operand<'tcx>,
+ mut ty: Ty<'tcx>,
) {
- let str_ty = self.tcx.types.str_;
+ // If we're using `b"..."` as a pattern, we need to insert an
+ // unsizing coercion, as the byte string has the type `&[u8; N]`.
+ //
+ // We want to do this even when the scrutinee is a reference to an
+ // array, so we can call `<[u8]>::eq` rather than having to find an
+ // `<[u8; N]>::eq`.
+ let unsize = |ty: Ty<'tcx>| match ty.kind() {
+ ty::Ref(region, rty, _) => match rty.kind() {
+ ty::Array(inner_ty, n) => Some((region, inner_ty, n)),
+ _ => None,
+ },
+ _ => None,
+ };
+ let opt_ref_ty = unsize(ty);
+ let opt_ref_test_ty = unsize(expect_ty);
+ match (opt_ref_ty, opt_ref_test_ty) {
+ // nothing to do, neither is an array
+ (None, None) => {}
+ (Some((region, elem_ty, _)), _) | (None, Some((region, elem_ty, _))) => {
+ let tcx = self.tcx;
+ // make both a slice
+ ty = Ty::new_imm_ref(tcx, *region, Ty::new_slice(tcx, *elem_ty));
+ if opt_ref_ty.is_some() {
+ let temp = self.temp(ty, source_info.span);
+ self.cfg.push_assign(
+ block,
+ source_info,
+ temp,
+ Rvalue::Cast(
+ CastKind::PointerCoercion(
+ PointerCoercion::Unsize,
+ CoercionSource::Implicit,
+ ),
+ val,
+ ty,
+ ),
+ );
+ val = Operand::Copy(temp);
+ }
+ if opt_ref_test_ty.is_some() {
+ let slice = self.temp(ty, source_info.span);
+ self.cfg.push_assign(
+ block,
+ source_info,
+ slice,
+ Rvalue::Cast(
+ CastKind::PointerCoercion(
+ PointerCoercion::Unsize,
+ CoercionSource::Implicit,
+ ),
+ expect,
+ ty,
+ ),
+ );
+ expect = Operand::Move(slice);
+ }
+ }
+ }
+
+ // Figure out the type on which we are calling `PartialEq`. This involves an extra wrapping
+ // reference: we can only compare two `&T`, and then compare_ty will be `T`.
+ // Make sure that we do *not* call any user-defined code here.
+ // The only types that can end up here are string and byte literals,
+ // which have their comparison defined in `core`.
+ // (Interestingly this means that exhaustiveness analysis relies, for soundness,
+ // on the `PartialEq` impls for `str` and `[u8]` to b correct!)
+ let compare_ty = match *ty.kind() {
+ ty::Ref(_, deref_ty, _)
+ if deref_ty == self.tcx.types.str_ || deref_ty != self.tcx.types.u8 =>
+ {
+ deref_ty
+ }
+ _ => span_bug!(source_info.span, "invalid type for non-scalar compare: {}", ty),
+ };
+
let eq_def_id = self.tcx.require_lang_item(LangItem::PartialEq, Some(source_info.span));
- let method = trait_method(self.tcx, eq_def_id, sym::eq, [str_ty, str_ty]);
+ let method = trait_method(self.tcx, eq_def_id, sym::eq, [compare_ty, compare_ty]);
let bool_ty = self.tcx.types.bool;
let eq_result = self.temp(bool_ty, source_info.span);
|
rust
|
rust-lang
|
Rust
|
Rust
| 101,693
| 13,172
|
Empowering everyone to build reliable and efficient software.
|
rust-lang_rust
|
CODE_IMPROVEMENT
|
Obvious
|
d7e5c00e94938f6be94e693d3f21f1f8f431c4f9
|
2024-08-29 07:04:03
|
Lucas Fernandes Nogueira
|
feat(core): add `plugin::PermissionState` (#10817) * feat(core): add `plugin::mobile::PermissionState`
* move to plugin module
* default, specta::Type
| false
| 60
| 1
| 61
|
--- .changes/rust-permission-state.md
@@ -1,5 +0,0 @@
----
-"tauri": patch:enhance
----
-
-Added `plugin:::PermissionState` enum.
--- crates/tauri/src/plugin.rs
@@ -12,10 +12,7 @@ use crate::{
webview::PageLoadPayload,
AppHandle, Error, RunEvent, Runtime, Webview, Window,
};
-use serde::{
- de::{Deserialize, DeserializeOwned, Deserializer, Error as DeError},
- Serialize, Serializer,
-};
+use serde::de::DeserializeOwned;
use serde_json::Value as JsonValue;
use tauri_macros::default_runtime;
use thiserror::Error;
@@ -901,54 +898,3 @@ fn initialize<R: Runtime>(
)
.map_err(|e| Error::PluginInitialization(plugin.name().to_string(), e.to_string()))
}
-
-/// Permission state.
-#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
-#[cfg_attr(feature = "specta", derive(specta::Type))]
-pub enum PermissionState {
- /// Permission access has been granted.
- Granted,
- /// Permission access has been denied.
- Denied,
- /// Permission must be requested, but you must explain to the user why your app needs that permission. **Android only**.
- PromptWithRationale,
- /// Unknown state. Must request permission.
- #[default]
- Unknown,
-}
-
-impl std::fmt::Display for PermissionState {
- fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
- match self {
- Self::Granted => write!(f, "granted"),
- Self::Denied => write!(f, "denied"),
- Self::PromptWithRationale => write!(f, "prompt-with-rationale"),
- Self::Unknown => write!(f, "Unknown"),
- }
- }
-}
-
-impl Serialize for PermissionState {
- fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
- where
- S: Serializer,
- {
- serializer.serialize_str(self.to_string().as_ref())
- }
-}
-
-impl<'de> Deserialize<'de> for PermissionState {
- fn deserialize<D>(deserializer: D) -> std::result::Result<Self, D::Error>
- where
- D: Deserializer<'de>,
- {
- let s = <String as Deserialize>::deserialize(deserializer)?;
- match s.to_lowercase().as_str() {
- "granted" => Ok(Self::Granted),
- "denied" => Ok(Self::Denied),
- "prompt-with-rationale" => Ok(Self::PromptWithRationale),
- "prompt" => Ok(Self::Unknown),
- _ => Err(DeError::custom(format!("unknown permission state '{s}'"))),
- }
- }
-}
|
tauri
|
tauri-apps
|
Rust
|
Rust
| 90,101
| 2,752
|
Build smaller, faster, and more secure desktop and mobile applications with a web frontend.
|
tauri-apps_tauri
|
NEW_FEAT
|
Obvious
|
c16d0df7b2ac87007f789346c32f7e47e4ad1bfe
| null |
Mohamed Hegazy
|
Disable test as it needs resolution which the test harness does not support yet
| false
| 0
| 0
| 0
|
--- selfReferencedExternalModule2.ts
|
microsoft_TypeScript.json
| null | null | null | null | null | null |
microsoft_TypeScript.json
|
BUG_FIX
|
4, test was disabled
|
062413fb30b4f40c456b5c4746f18a6f34198307
|
2025-02-14 20:21:03
|
Fabio Alessandrelli
|
[ENet] Explicitely destroy hosts on close To ensure we free up the UDP port even if a script is holding a reference to the underlying host, we need to explicitly destroy it on close.
| false
| 1
| 0
| 1
|
--- modules/enet/enet_multiplayer_peer.cpp
@@ -301,7 +301,6 @@ void ENetMultiplayerPeer::close() {
}
for (KeyValue<int, Ref<ENetConnection>> &E : hosts) {
E.value->flush();
- E.value->destroy();
}
active_mode = MODE_NONE;
|
godot
|
godotengine
|
C++
|
C++
| 94,776
| 21,828
|
Godot Engine – Multi-platform 2D and 3D game engine
|
godotengine_godot
|
BUG_FIX
|
probably a bug fix to free up the UDP port
|
388cb95f8d0471510723b17d31a420db01d4c8e2
|
2023-10-24 16:17:49
|
Winter
|
update: APF
| false
| 616
| 402
| 1,018
|
--- .gitignore
@@ -2,7 +2,6 @@
__pycache__/
build/
devel/
-video/
# local env files
.env.local
--- README.md
@@ -96,9 +96,9 @@ Planner | Version | Animation
## Local Planner
| Planner | Version | Animation |
| ------- | -------------------------------------------------------- | -------------------------------------------------------- |
-| **PID** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/local_planner/pid_plan.m) | 
-| **APF** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/local_planner/apf_plan.m) | 
-| **DWA** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/local_planner/dwa_plan.m) | 
+| **PID** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/local_planner/pid.m) | 
+| **APF** |  | 
+| **DWA** | [](https://github.com/ai-winter/matlab_motion_planning/blob/master/local_planner/dwa.m) | 
| **TEB** |  | 
| **MPC** |  | 
| **Lattice** |  | 
@@ -134,5 +134,4 @@ Planner | Version | Animation
## Local Planning
-* [DWA: ](https://www.ri.cmu.edu/pub_files/pub1/fox_dieter_1997_1/fox_dieter_1997_1.pdf) The Dynamic Window Approach to Collision Avoidance
-* [APF: ](https://ieeexplore.ieee.org/document/1087247)Real-time obstacle avoidance for manipulators and mobile robots
\ No newline at end of file
+* [DWA: ](https://www.ri.cmu.edu/pub_files/pub1/fox_dieter_1997_1/fox_dieter_1997_1.pdf) The Dynamic Window Approach to Collision Avoidance
\ No newline at end of file
--- examples/simulation_global.mlx
Binary files a/examples/simulation_global.mlx and b/examples/simulation_global.mlx differ
--- examples/simulation_global_replan.mlx
Binary files /dev/null and b/examples/simulation_global_replan.mlx differ
--- examples/simulation_local.mlx
Binary files a/examples/simulation_local.mlx and b/examples/simulation_local.mlx differ
--- examples/simulation_total.mlx
Binary files /dev/null and b/examples/simulation_total.mlx differ
--- gif/apf_matlab.gif
Binary files a/gif/apf_matlab.gif and /dev/null differ
--- gif/pid_matlab.gif
Binary files a/gif/pid_matlab.gif and b/gif/pid_matlab.gif differ
--- local_planner/apf_plan.m
@@ -1,192 +0,0 @@
-function [pose, traj, flag] = apf_plan(start, goal, varargin)
-%%
-% @file: apf_plan.m
-% @breif: Artificial Potential Field motion planning
-% @paper: The Artificial Potential Field to Collision Avoidance
-% @author: Winter
-% @update: 2023.10.24
-
-%%
- p = inputParser;
- addParameter(p, 'path', "none");
- addParameter(p, 'map', "none");
- parse(p, varargin{:});
-
- if isstring(p.Results.path) || isstring(p.Results.map)
- exception = MException('MyErr:InvalidInput', 'parameter `path` or `map` must be set.');
- throw(exception);
- end
-
- % path
- path = flipud(p.Results.path);
- path_length = size(path, 1);
- plan_idx = 1;
-
- % map
- map = p.Results.map;
-
- % obstacle
- [m, ~] = size(map);
- obs_index = find(map==2);
- obstacle = [mod(obs_index - 1, m) + 1, fix((obs_index - 1) / m) + 1];
-
- % initial robotic state
- robot.x = start(1);
- robot.y = start(2);
- robot.theta = start(3);
- robot.v = 0;
- robot.w = 0;
- max_v = 0.4;
-
- % parameters
- zeta = 1.0;
- eta = 0.8;
- d_0 = 1.5;
-
- dt = 0.1;
- p_window = 0.5;
- p_precision = 0.5;
- o_precision = pi / 4;
- e_v_ = 0; i_v_ = 0;
- e_w_ = 0; i_w_ = 0;
- max_iter = 1000;
-
- % return value
- flag = false;
- pose = [];
- traj = [];
-
- iter = 0;
- % main loop
- while (1)
- iter = iter + 1;
- if (iter > max_iter)
- break;
- end
-
- % break until goal reached
- if (norm([robot.x, robot.y] - goal(:, 1:2)) < p_precision)
- flag = true;
- break;
- end
-
- % compute the tatget pose and force at the current step
- rep_force = getRepulsiveForce([robot.x, robot.y], obstacle, d_0);
- while (plan_idx <= path_length)
- tgt_pos = path(plan_idx, :);
- attr_force = getAttractiveForce([robot.x, robot.y], tgt_pos);
- net_force = zeta * attr_force + eta * rep_force;
-
- % in body frame
- b_x_d = path(plan_idx, 1) - robot.x;
- b_y_d = path(plan_idx, 2) - robot.y;
-
- if (norm([b_x_d, b_y_d]) > p_window)
- break;
- end
- plan_idx = plan_idx + 1;
- end
-
- new_v = [robot.v * cos(robot.theta), robot.v * sin(robot.theta)] + net_force;
- new_v = new_v ./ norm(new_v);
- new_v = new_v .* max_v;
-
- theta_d = atan2(new_v(2), new_v(1));
-
- % calculate velocity command
- if (norm([robot.x, robot.y] - goal(:, 1:2)) < p_precision)
- if (abs(robot.theta - goal(3)) < o_precision)
- u = [0, 0];
- else
- [w, e_w_, i_w_] = angularController(robot, goal(3), dt, e_w_, i_w_);
- u = [0, w];
- end
- elseif (abs(theta_d - robot.theta) > pi / 2)
- [w, e_w_, i_w_] = angularController(robot, theta_d, dt, e_w_, i_w_);
- u = [0, w];
- else
- [v, e_v_, i_v_] = linearController(robot, norm(new_v), dt, e_v_, i_v_);
- [w, e_w_, i_w_] = angularController(robot, theta_d, dt, e_w_, i_w_);
- u = [v, w];
- end
-
- % input into robotic kinematic
- robot = f(robot, u, dt);
- pose = [pose; robot.x, robot.y, robot.theta];
- end
-end
-
-%%
-function attr_force = getAttractiveForce(cur_pos, tgt_pos)
- attr_force = tgt_pos - cur_pos;
- if ~all(attr_force == 0)
- attr_force = attr_force ./ norm(attr_force);
- end
-end
-
-function rep_force = getRepulsiveForce(cur_pos, obstacle, d_0)
- D = dist(obstacle, cur_pos');
- rep_force = (1 ./ D - 1 / d_0) .* (1 ./ D) .^ 2 .* (cur_pos - obstacle);
- valid_mask = (1 ./ D - 1 / d_0) > 0;
- rep_force = sum(rep_force(valid_mask, :), 1);
-
- if ~all(rep_force == 0)
- rep_force = rep_force ./ norm(rep_force);
- end
-end
-
-function [v, e_v_, i_v_] = linearController(robot, v_d, dt, e_v_, i_v_)
- e_v = v_d - robot.v;
- i_v_ = i_v_ + e_v * dt;
- d_v = (e_v - e_v_) / dt;
- e_v_ = e_v;
-
- k_v_p = 1.00;
- k_v_i = 0.00;
- k_v_d = 0.00;
- v_inc = k_v_p * e_v_ + k_v_i * i_v_ + k_v_d * d_v;
-
- v = robot.v + v_inc;
-end
-
-function [w, e_w_, i_w_] = angularController(robot, theta_d, dt, e_w_, i_w_)
- e_theta = theta_d - robot.theta;
- if (e_theta > pi)
- e_theta = e_theta - 2 * pi;
- elseif (e_theta < -pi)
- e_theta = e_theta + 2 * pi;
- end
-
- w_d = e_theta / dt / 10;
- e_w = w_d - robot.w;
- i_w_ = i_w_ + e_w * dt;
- d_w = (e_w - e_w_) / dt;
- e_w_ = e_w;
-
- k_w_p = 1.00;
- k_w_i = 0.00;
- k_w_d = 0.01;
- w_inc = k_w_p * e_w_ + k_w_i * i_w_ + k_w_d * d_w;
-
- w = robot.w + w_inc;
-end
-
-function robot = f(robot, u, dt)
-%@breif: robotic kinematic
- F = [ 1 0 0 0 0
- 0 1 0 0 0
- 0 0 1 0 0
- 0 0 0 0 0
- 0 0 0 0 0];
-
- B = [dt * cos(robot.theta) 0
- dt * sin(robot.theta) 0
- 0 dt
- 1 0
- 0 1];
-
- x = [robot.x; robot.y; robot.theta; robot.v; robot.w];
- x_star = F * x + B * u';
- robot.x = x_star(1); robot.y = x_star(2); robot.theta = x_star(3);
- robot.v = x_star(4); robot.w = x_star(5);
-end
\ No newline at end of file
--- local_planner/dwa_plan.m
@@ -1,32 +1,12 @@
-function [pose, traj, flag] = dwa_plan(start, goal, varargin)
+function [pose, traj, flag] = dwa(map, start, goal, kinematic)
%%
-% @file: dwa_plan.m
+% @file: dwa.m
% @breif: DWA motion planning
% @paper: The Dynamic Window Approach to Collision Avoidance
% @author: Winter
% @update: 2023.1.30
%%
- p = inputParser;
- addParameter(p, 'path', "none");
- addParameter(p, 'map', "none");
- parse(p, varargin{:});
-
- if isstring(p.Results.map)
- exception = MException('MyErr:InvalidInput', 'parameter `map` must be set.');
- throw(exception);
- end
-
- map = p.Results.map;
-
- % kinematic
- kinematic.V_MAX = 1.0; % maximum velocity [m/s]
- kinematic.W_MAX = 20.0 * pi /180; % maximum rotation speed[rad/s]
- kinematic.V_ACC = 0.2; % acceleration [m/s^2]
- kinematic.W_ACC = 50.0 * pi /180; % angular acceleration [rad/s^2]
- kinematic.V_RESOLUTION = 0.01; % velocity resolution [m/s]
- kinematic.W_RESOLUTION = 1.0 * pi /180; % rotation speed resolution [rad/s]]
-
% return value
flag = false;
pose = [];
--- local_planner/pid_plan.m
@@ -1,29 +1,7 @@
-function [pose, traj, flag] = pid_plan(start, goal, varargin)
-%%
-% @file: pid_plan.m
-% @breif: PID motion planning
-% @paper: The PID to path tracking
-% @author: Winter, Gzy
-% @update: 2023.1.30
-
-%%
- p = inputParser;
- addParameter(p, 'path', "none");
- addParameter(p, 'map', "none");
- parse(p, varargin{:});
-
- if isstring(p.Results.path)
- exception = MException('MyErr:InvalidInput', 'parameter `path` must be set, using global planner.');
- throw(exception);
- end
-
- % path
- path = p.Results.path;
-
+function [pose, flag] = pid(path, start, goal)
% return value
flag = false;
- pose = [];
- traj = [];
+ pose = [];
% reverse path
path = flipud(path);
@@ -60,7 +38,7 @@
% break until goal reached
if (norm([robot.x, robot.y] - goal(:, 1:2)) < p_precision)
- flag = true;
+% && ... abs(robot.theta - goal(3)) < o_precision
break;
end
--- utils/animation/animation.m
@@ -1,7 +1,7 @@
-function animation(planner_name, pose, traj, delta, record_video)
+function animation_dwa(pose, traj, delta, record_video)
%
-% @file: animation.m
-% @breif: local planning algorithm animation
+% @file: animation_dwa.m
+% @breif: DWA algorithm animation
% @author: Winter
% @update: 2023.1.30
%
@@ -9,16 +9,14 @@ function animation(planner_name, pose, traj, delta, record_video)
[frames, ~] = size(pose);
if record_video
- process = VideoWriter(sprintf("%s%s%s", "./utils/animation/video/", planner_name, ".mp4"), 'MPEG-4');
+ process = VideoWriter('./animation/video/dwa.mp4', 'MPEG-4');
open(process);
movie = moviein(frames);
end
for i=1:frames
handler = plot_robot([pose(i, 2) + delta, pose(i, 1) + delta, pose(i, 3)], 0.8, 0.4, 'r');
- if ~isempty(traj)
- handler2 = plot_trajectory(traj(i).info, delta);
- end
+ handler2 = plot_trajectory(traj(i).info, delta);
plot(pose(i, 2) + delta, pose(i, 1) + delta, 'Marker', '.', 'color', "#f00");
drawnow;
if record_video
@@ -26,10 +24,7 @@ function animation(planner_name, pose, traj, delta, record_video)
writeVideo(process, movie(:, i));
end
delete(handler);
-
- if ~isempty(traj)
- delete(handler2);
- end
+ delete(handler2);
end
if record_video
--- utils/animation/animation_pid.m
@@ -0,0 +1,27 @@
+function animation_pid(pose, traj, delta, record_video)
+ hold on
+ [frames, ~] = size(pose);
+
+ if record_video
+ process = VideoWriter('./animation/video/pid.avi');
+ open(process);
+ movie = moviein(frames);
+ end
+
+ for i=1:frames
+ handler = plot_robot([pose(i, 2) + delta, pose(i, 1) + delta, pose(i, 3)], 0.8, 0.4, 'r');
+% handler2 = plot_trajectory(traj(i).info, delta);
+ plot(pose(i, 2) + delta, pose(i, 1) + delta, 'Marker', '.', 'color', "#f00");
+ drawnow;
+ if record_video
+ movie(:, i) = getframe;
+ writeVideo(process, movie(:, i));
+ end
+ delete(handler);
+% delete(handler2);
+ end
+
+ if record_video
+ close(process);
+ end
+end
|
matlab_motion_planning
|
ai-winter
|
MATLAB
|
MATLAB
| 419
| 66
|
Motion planning and Navigation of AGV/AMR: matlab implementation of Dijkstra, A*, Theta*, JPS, D*, LPA*, D* Lite, RRT, RRT*, RRT-Connect, Informed RRT*, ACO, Voronoi, PID, LQR, MPC, APF, RPP, DWA, DDPG, Bezier, B-spline, Dubins, Reeds-Shepp etc.
|
ai-winter_matlab_motion_planning
|
BUG_FIX
|
Hide chat history when not logged in, suppress most log requests, UA detects mobile vs desktop
|
a8a33098c3669f7cade28662805d0a3aec0b67b7
|
2022-12-21 03:26:15
|
Serhiy Mytrovtsiy
|
fix: prevent throwing an error when the commit type is not detected, just omit it
| false
| 1
| 1
| 2
|
--- Kit/scripts/changelog.py
@@ -57,7 +57,7 @@ class Changelog:
elif self.langPattern.match(line) or "translation" in line or "localization" in line:
lang.append(line)
else:
- print("Failed to detect commit {} type".format(line))
+ raise ValueError("Failed to detect commit {} type".format(line))
return fix, feat, lang
|
stats
|
exelban
|
Swift
|
Swift
| 29,655
| 950
|
macOS system monitor in your menu bar
|
exelban_stats
|
CODE_IMPROVEMENT
|
this commit fixes/polishes an earlier feature
|
b7b05c55f2b7a0bfcbfc8d142c2f3242313e8e53
|
2022-03-13 07:15:28
|
Matheus Felipe
|
Remove Live precious metal prices, it's broken

Live precious metal prices returns http code 500 when accessed
| false
| 0
| 1
| 1
|
--- README.md
@@ -765,6 +765,7 @@ API | Description | Auth | HTTPS | CORS |
| [Indian Mutual Fund](https://www.mfapi.in/) | Get complete history of India Mutual Funds Data | No | Yes | Unknown |
| [Intrinio](https://intrinio.com/) | A wide selection of financial data feeds | `apiKey` | Yes | Unknown |
| [Klarna](https://docs.klarna.com/klarna-payments/api/payments-api/) | Klarna payment and shopping service | `apiKey` | Yes | Unknown |
+| [Live precious metal prices](https://notnullsolutions.com/live-metal-prices-api/live-precious-metal-prices-api-documentation) | Live prices for all precious metals like Gold | `apiKey` | Yes | Unknown |
| [MercadoPago](https://www.mercadopago.com.br/developers/es/reference) | Mercado Pago API reference - all the information you need to develop your integrations | `apiKey` | Yes | Unknown |
| [Mono](https://mono.co/) | Connect with users’ bank accounts and access transaction data in Africa | `apiKey` | Yes | Unknown |
| [Moov](https://docs.moov.io/api/) | The Moov API makes it simple for platforms to send, receive, and store money | `apiKey` | Yes | Unknown |
|
public-apis
|
public-apis
|
Python
|
Python
| 329,015
| 34,881
|
A collective list of free APIs
|
public-apis_public-apis
|
CONFIG_CHANGE
|
Very small changes
|
1399ee6aa0aa3e44a70780cfd0aca3d384c97fa4
|
2023-10-22 08:16:45
|
Evan You
|
test: pin esbuild to 0.16 for karma-esbuild compat
| false
| 117
| 117
| 234
|
--- package.json
@@ -99,7 +99,7 @@
"conventional-changelog-cli": "^2.2.2",
"cross-spawn": "^7.0.3",
"enquirer": "^2.3.6",
- "esbuild": "^0.16.0",
+ "esbuild": "^0.19.5",
"execa": "^4.1.0",
"he": "^1.2.0",
"jasmine-core": "^4.2.0",
@@ -107,7 +107,7 @@
"karma": "^6.3.20",
"karma-chrome-launcher": "^3.1.1",
"karma-cli": "^2.0.0",
- "karma-esbuild": "^2.2.5",
+ "karma-esbuild": "^2.2.4",
"karma-jasmine": "^5.0.1",
"lint-staged": "^12.5.0",
"lodash": "^4.17.21",
--- pnpm-lock.yaml
@@ -52,8 +52,8 @@ importers:
specifier: ^2.3.6
version: 2.3.6
esbuild:
- specifier: ^0.16.0
- version: 0.16.17
+ specifier: ^0.19.5
+ version: 0.19.5
execa:
specifier: ^4.1.0
version: 4.1.0
@@ -76,8 +76,8 @@ importers:
specifier: ^2.0.0
version: 2.0.0
karma-esbuild:
- specifier: ^2.2.5
- version: 2.2.5([email protected])
+ specifier: ^2.2.4
+ version: 2.2.5([email protected])
karma-jasmine:
specifier: ^5.0.1
version: 5.1.0([email protected])
@@ -312,15 +312,6 @@ packages:
'@jridgewell/trace-mapping': 0.3.9
dev: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-MIGl6p5sc3RDTLLkYL1MyL8BMRN4tLMRCn+yRJJmEDvYZ2M7tmAf80hx1kbNEUX2KJ50RRtxZ4JHLvCfuB6kBg==}
- engines: {node: '>=12'}
- cpu: [arm64]
- os: [android]
- requiresBuild: true
- dev: true
- optional: true
-
/@esbuild/[email protected]:
resolution: {integrity: sha512-Nz4rJcchGDtENV0eMKUNa6L12zz2zBDXuhj/Vjh18zGqB44Bi7MBMSXjgunJgjRhCmKOjnPuZp4Mb6OKqtMHLQ==}
engines: {node: '>=12'}
@@ -330,10 +321,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-N9x1CMXVhtWEAMS7pNNONyA14f71VPQN9Cnavj1XQh6T7bskqiLLrSca4O0Vr8Wdcga943eThxnVp3JLnBMYtw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-5d1OkoJxnYQfmC+Zd8NBFjkhyCNYwM4n9ODrycTFY6Jk1IGiZ+tjVJDDSwDt77nK+tfpGP4T50iMtVi4dEGzhQ==}
engines: {node: '>=12'}
- cpu: [arm]
+ cpu: [arm64]
os: [android]
requiresBuild: true
dev: true
@@ -348,10 +339,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-a3kTv3m0Ghh4z1DaFEuEDfz3OLONKuFvI4Xqczqx4BqLyuFaFkuaG4j2MtA6fuWEFeC5x9IvqnX7drmRq/fyAQ==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-bhvbzWFF3CwMs5tbjf3ObfGqbl/17ict2/uwOSfr3wmxDE6VdS2GqY/FuzIPe0q0bdhj65zQsvqfArI9MY6+AA==}
engines: {node: '>=12'}
- cpu: [x64]
+ cpu: [arm]
os: [android]
requiresBuild: true
dev: true
@@ -366,11 +357,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-/2agbUEfmxWHi9ARTX6OQ/KgXnOWfsNlTeLcoV7HSuSTv63E4DqtAc+2XqGw1KHxKMHGZgbVCZge7HXWX9Vn+w==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-9t+28jHGL7uBdkBjL90QFxe7DVA+KGqWlHCF8ChTKyaKO//VLuoBricQCgwhOjA1/qOczsw843Fy4cbs4H3DVA==}
engines: {node: '>=12'}
- cpu: [arm64]
- os: [darwin]
+ cpu: [x64]
+ os: [android]
requiresBuild: true
dev: true
optional: true
@@ -384,10 +375,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-2By45OBHulkd9Svy5IOCZt376Aa2oOkiE9QWUK9fe6Tb+WDr8hXL3dpqi+DeLiMed8tVXspzsTAvd0jUl96wmg==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-mvXGcKqqIqyKoxq26qEDPHJuBYUA5KizJncKOAf9eJQez+L9O+KfvNFu6nl7SCZ/gFb2QPaRqqmG0doSWlgkqw==}
engines: {node: '>=12'}
- cpu: [x64]
+ cpu: [arm64]
os: [darwin]
requiresBuild: true
dev: true
@@ -402,11 +393,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-mt+cxZe1tVx489VTb4mBAOo2aKSnJ33L9fr25JXpqQqzbUIw/yzIzi+NHwAXK2qYV1lEFp4OoVeThGjUbmWmdw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-Ly8cn6fGLNet19s0X4unjcniX24I0RqjPv+kurpXabZYSXGM4Pwpmf85WHJN3lAgB8GSth7s5A0r856S+4DyiA==}
engines: {node: '>=12'}
- cpu: [arm64]
- os: [freebsd]
+ cpu: [x64]
+ os: [darwin]
requiresBuild: true
dev: true
optional: true
@@ -420,10 +411,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-8ScTdNJl5idAKjH8zGAsN7RuWcyHG3BAvMNpKOBaqqR7EbUhhVHOqXRdL7oZvz8WNHL2pr5+eIT5c65kA6NHug==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-GGDNnPWTmWE+DMchq1W8Sd0mUkL+APvJg3b11klSGUDvRXh70JqLAO56tubmq1s2cgpVCSKYywEiKBfju8JztQ==}
engines: {node: '>=12'}
- cpu: [x64]
+ cpu: [arm64]
os: [freebsd]
requiresBuild: true
dev: true
@@ -438,11 +429,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-7S8gJnSlqKGVJunnMCrXHU9Q8Q/tQIxk/xL8BqAP64wchPCTzuM6W3Ra8cIa1HIflAvDnNOt2jaL17vaW+1V0g==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-1CCwDHnSSoA0HNwdfoNY0jLfJpd7ygaLAp5EHFos3VWJCRX9DMwWODf96s9TSse39Br7oOTLryRVmBoFwXbuuQ==}
engines: {node: '>=12'}
- cpu: [arm64]
- os: [linux]
+ cpu: [x64]
+ os: [freebsd]
requiresBuild: true
dev: true
optional: true
@@ -456,10 +447,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-iihzrWbD4gIT7j3caMzKb/RsFFHCwqqbrbH9SqUSRrdXkXaygSZCZg1FybsZz57Ju7N/SHEgPyaR0LZ8Zbe9gQ==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-o3vYippBmSrjjQUCEEiTZ2l+4yC0pVJD/Dl57WfPwwlvFkrxoSO7rmBZFii6kQB3Wrn/6GwJUPLU5t52eq2meA==}
engines: {node: '>=12'}
- cpu: [arm]
+ cpu: [arm64]
os: [linux]
requiresBuild: true
dev: true
@@ -474,10 +465,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-kiX69+wcPAdgl3Lonh1VI7MBr16nktEvOfViszBSxygRQqSpzv7BffMKRPMFwzeJGPxcio0pdD3kYQGpqQ2SSg==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-lrWXLY/vJBzCPC51QN0HM71uWgIEpGSjSZZADQhq7DKhPcI6NH1IdzjfHkDQws2oNpJKpR13kv7/pFHBbDQDwQ==}
engines: {node: '>=12'}
- cpu: [ia32]
+ cpu: [arm]
os: [linux]
requiresBuild: true
dev: true
@@ -492,10 +483,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-dTzNnQwembNDhd654cA4QhbS9uDdXC3TKqMJjgOWsC0yNCbpzfWoXdZvp0mY7HU6nzk5E0zpRGGx3qoQg8T2DQ==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-MkjHXS03AXAkNp1KKkhSKPOCYztRtK+KXDNkBa6P78F8Bw0ynknCSClO/ztGszILZtyO/lVKpa7MolbBZ6oJtQ==}
engines: {node: '>=12'}
- cpu: [loong64]
+ cpu: [ia32]
os: [linux]
requiresBuild: true
dev: true
@@ -510,10 +501,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-ezbDkp2nDl0PfIUn0CsQ30kxfcLTlcx4Foz2kYv8qdC6ia2oX5Q3E/8m6lq84Dj/6b0FrkgD582fJMIfHhJfSw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-42GwZMm5oYOD/JHqHska3Jg0r+XFb/fdZRX+WjADm3nLWLcIsN27YKtqxzQmGNJgu0AyXg4HtcSK9HuOk3v1Dw==}
engines: {node: '>=12'}
- cpu: [mips64el]
+ cpu: [loong64]
os: [linux]
requiresBuild: true
dev: true
@@ -528,10 +519,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-dzS678gYD1lJsW73zrFhDApLVdM3cUF2MvAa1D8K8KtcSKdLBPP4zZSLy6LFZ0jYqQdQ29bjAHJDgz0rVbLB3g==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-kcjndCSMitUuPJobWCnwQ9lLjiLZUR3QLQmlgaBfMX23UEa7ZOrtufnRds+6WZtIS9HdTXqND4yH8NLoVVIkcg==}
engines: {node: '>=12'}
- cpu: [ppc64]
+ cpu: [mips64el]
os: [linux]
requiresBuild: true
dev: true
@@ -546,10 +537,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-ylNlVsxuFjZK8DQtNUwiMskh6nT0vI7kYl/4fZgV1llP5d6+HIeL/vmmm3jpuoo8+NuXjQVZxmKuhDApK0/cKw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-yJAxJfHVm0ZbsiljbtFFP1BQKLc8kUF6+17tjQ78QjqjAQDnhULWiTA6u0FCDmYT1oOKS9PzZ2z0aBI+Mcyj7Q==}
engines: {node: '>=12'}
- cpu: [riscv64]
+ cpu: [ppc64]
os: [linux]
requiresBuild: true
dev: true
@@ -564,10 +555,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-gzy7nUTO4UA4oZ2wAMXPNBGTzZFP7mss3aKR2hH+/4UUkCOyqmjXiKpzGrY2TlEUhbbejzXVKKGazYcQTZWA/w==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-5u8cIR/t3gaD6ad3wNt1MNRstAZO+aNyBxu2We8X31bA8XUNyamTVQwLDA1SLoPCUehNCymhBhK3Qim1433Zag==}
engines: {node: '>=12'}
- cpu: [s390x]
+ cpu: [riscv64]
os: [linux]
requiresBuild: true
dev: true
@@ -582,10 +573,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-mdPjPxfnmoqhgpiEArqi4egmBAMYvaObgn4poorpUaqmvzzbvqbowRllQ+ZgzGVMGKaPkqUmPDOOFQRUFDmeUw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-Z6JrMyEw/EmZBD/OFEFpb+gao9xJ59ATsoTNlj39jVBbXqoZm4Xntu6wVmGPB/OATi1uk/DB+yeDPv2E8PqZGw==}
engines: {node: '>=12'}
- cpu: [x64]
+ cpu: [s390x]
os: [linux]
requiresBuild: true
dev: true
@@ -600,11 +591,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-/PzmzD/zyAeTUsduZa32bn0ORug+Jd1EGGAUJvqfeixoEISYpGnAezN6lnJoskauoai0Jrs+XSyvDhppCPoKOA==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-psagl+2RlK1z8zWZOmVdImisMtrUxvwereIdyJTmtmHahJTKb64pAcqoPlx6CewPdvGvUKe2Jw+0Z/0qhSbG1A==}
engines: {node: '>=12'}
cpu: [x64]
- os: [netbsd]
+ os: [linux]
requiresBuild: true
dev: true
optional: true
@@ -618,11 +609,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-2yaWJhvxGEz2RiftSk0UObqJa/b+rIAjnODJgv2GbGGpRwAfpgzyrg1WLK8rqA24mfZa9GvpjLcBBg8JHkoodg==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-kL2l+xScnAy/E/3119OggX8SrWyBEcqAh8aOY1gr4gPvw76la2GlD4Ymf832UCVbmuWeTf2adkZDK+h0Z/fB4g==}
engines: {node: '>=12'}
cpu: [x64]
- os: [openbsd]
+ os: [netbsd]
requiresBuild: true
dev: true
optional: true
@@ -636,11 +627,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-xtVUiev38tN0R3g8VhRfN7Zl42YCJvyBhRKw1RJjwE1d2emWTVToPLNEQj/5Qxc6lVFATDiy6LjVHYhIPrLxzw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-sPOfhtzFufQfTBgRnE1DIJjzsXukKSvZxloZbkJDG383q0awVAq600pc1nfqBcl0ice/WN9p4qLc39WhBShRTA==}
engines: {node: '>=12'}
cpu: [x64]
- os: [sunos]
+ os: [openbsd]
requiresBuild: true
dev: true
optional: true
@@ -654,11 +645,11 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-ga8+JqBDHY4b6fQAmOgtJJue36scANy4l/rL97W+0wYmijhxKetzZdKOJI7olaBaMhWt8Pac2McJdZLxXWUEQw==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-dGZkBXaafuKLpDSjKcB0ax0FL36YXCvJNnztjKV+6CO82tTYVDSH2lifitJ29jxRMoUhgkg9a+VA/B03WK5lcg==}
engines: {node: '>=12'}
- cpu: [arm64]
- os: [win32]
+ cpu: [x64]
+ os: [sunos]
requiresBuild: true
dev: true
optional: true
@@ -672,10 +663,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-WnsKaf46uSSF/sZhwnqE4L/F89AYNMiD4YtEcYekBt9Q7nj0DiId2XH2Ng2PHM54qi5oPrQ8luuzGszqi/veig==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-dWVjD9y03ilhdRQ6Xig1NWNgfLtf2o/STKTS+eZuF90fI2BhbwD6WlaiCGKptlqXlURVB5AUOxUj09LuwKGDTg==}
engines: {node: '>=12'}
- cpu: [ia32]
+ cpu: [arm64]
os: [win32]
requiresBuild: true
dev: true
@@ -690,10 +681,10 @@ packages:
dev: true
optional: true
- /@esbuild/[email protected]:
- resolution: {integrity: sha512-y+EHuSchhL7FjHgvQL/0fnnFmO4T1bhvWANX6gcnqTjtnKWbTvUMCpGnv2+t+31d7RzyEAYAd4u2fnIhHL6N/Q==}
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-4liggWIA4oDgUxqpZwrDhmEfAH4d0iljanDOK7AnVU89T6CzHon/ony8C5LeOdfgx60x5cnQJFZwEydVlYx4iw==}
engines: {node: '>=12'}
- cpu: [x64]
+ cpu: [ia32]
os: [win32]
requiresBuild: true
dev: true
@@ -708,6 +699,15 @@ packages:
dev: true
optional: true
+ /@esbuild/[email protected]:
+ resolution: {integrity: sha512-czTrygUsB/jlM8qEW5MD8bgYU2Xg14lo6kBDXW6HdxKjh8M5PzETGiSHaz9MtbXBYDloHNUAUW2tMiKW4KM9Mw==}
+ engines: {node: '>=12'}
+ cpu: [x64]
+ os: [win32]
+ requiresBuild: true
+ dev: true
+ optional: true
+
/@hutson/[email protected]:
resolution: {integrity: sha512-H9XAx3hc0BQHY6l+IFSWHDySypcXsvsuLhgYLUGywmJ5pswRVQJUHpOsobnLYp2ZUaUlKiKDrgWWhosOwAEM8Q==}
engines: {node: '>=6.9.0'}
@@ -2675,36 +2675,6 @@ packages:
is-arrayish: 0.2.1
dev: true
- /[email protected]:
- resolution: {integrity: sha512-G8LEkV0XzDMNwXKgM0Jwu3nY3lSTwSGY6XbxM9cr9+s0T/qSV1q1JVPBGzm3dcjhCic9+emZDmMffkwgPeOeLg==}
- engines: {node: '>=12'}
- hasBin: true
- requiresBuild: true
- optionalDependencies:
- '@esbuild/android-arm': 0.16.17
- '@esbuild/android-arm64': 0.16.17
- '@esbuild/android-x64': 0.16.17
- '@esbuild/darwin-arm64': 0.16.17
- '@esbuild/darwin-x64': 0.16.17
- '@esbuild/freebsd-arm64': 0.16.17
- '@esbuild/freebsd-x64': 0.16.17
- '@esbuild/linux-arm': 0.16.17
- '@esbuild/linux-arm64': 0.16.17
- '@esbuild/linux-ia32': 0.16.17
- '@esbuild/linux-loong64': 0.16.17
- '@esbuild/linux-mips64el': 0.16.17
- '@esbuild/linux-ppc64': 0.16.17
- '@esbuild/linux-riscv64': 0.16.17
- '@esbuild/linux-s390x': 0.16.17
- '@esbuild/linux-x64': 0.16.17
- '@esbuild/netbsd-x64': 0.16.17
- '@esbuild/openbsd-x64': 0.16.17
- '@esbuild/sunos-x64': 0.16.17
- '@esbuild/win32-arm64': 0.16.17
- '@esbuild/win32-ia32': 0.16.17
- '@esbuild/win32-x64': 0.16.17
- dev: true
-
/[email protected]:
resolution: {integrity: sha512-ceqxoedUrcayh7Y7ZX6NdbbDzGROiyVBgC4PriJThBKSVPWnnFHZAkfI1lJT8QFkOwH4qOS2SJkS4wvpGl8BpA==}
engines: {node: '>=12'}
@@ -2735,6 +2705,36 @@ packages:
'@esbuild/win32-x64': 0.18.20
dev: true
+ /[email protected]:
+ resolution: {integrity: sha512-bUxalY7b1g8vNhQKdB24QDmHeY4V4tw/s6Ak5z+jJX9laP5MoQseTOMemAr0gxssjNcH0MCViG8ONI2kksvfFQ==}
+ engines: {node: '>=12'}
+ hasBin: true
+ requiresBuild: true
+ optionalDependencies:
+ '@esbuild/android-arm': 0.19.5
+ '@esbuild/android-arm64': 0.19.5
+ '@esbuild/android-x64': 0.19.5
+ '@esbuild/darwin-arm64': 0.19.5
+ '@esbuild/darwin-x64': 0.19.5
+ '@esbuild/freebsd-arm64': 0.19.5
+ '@esbuild/freebsd-x64': 0.19.5
+ '@esbuild/linux-arm': 0.19.5
+ '@esbuild/linux-arm64': 0.19.5
+ '@esbuild/linux-ia32': 0.19.5
+ '@esbuild/linux-loong64': 0.19.5
+ '@esbuild/linux-mips64el': 0.19.5
+ '@esbuild/linux-ppc64': 0.19.5
+ '@esbuild/linux-riscv64': 0.19.5
+ '@esbuild/linux-s390x': 0.19.5
+ '@esbuild/linux-x64': 0.19.5
+ '@esbuild/netbsd-x64': 0.19.5
+ '@esbuild/openbsd-x64': 0.19.5
+ '@esbuild/sunos-x64': 0.19.5
+ '@esbuild/win32-arm64': 0.19.5
+ '@esbuild/win32-ia32': 0.19.5
+ '@esbuild/win32-x64': 0.19.5
+ dev: true
+
/[email protected]:
resolution: {integrity: sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw==}
engines: {node: '>=6'}
@@ -3929,13 +3929,13 @@ packages:
resolve: 1.22.1
dev: true
- /[email protected]([email protected]):
+ /[email protected]([email protected]):
resolution: {integrity: sha512-+NiRmZhUm/MqOsL1cAu8+RmiOMvIxWDaeYDLBB5upxHF9Hh3Og8YH43EAmDan40pxt2FKDcOjupgqIe4Tx2szQ==}
peerDependencies:
esbuild: '>=0.8.45'
dependencies:
chokidar: 3.5.3
- esbuild: 0.16.17
+ esbuild: 0.19.5
source-map: 0.6.1
dev: true
--- test/transition/karma.conf.js
@@ -2,12 +2,12 @@ const featureFlags = require('../../scripts/feature-flags')
process.env.CHROME_BIN = require('puppeteer').executablePath()
const define = {
- __DEV__: `true`,
- 'process.env.CI': String(!!process.env.CI)
+ __DEV__: true,
+ 'process.env.CI': !!process.env.CI
}
for (const key in featureFlags) {
- define[`process.env.${key}`] = String(featureFlags[key])
+ define[`process.env.${key}`] = featureFlags[key]
}
module.exports = function (config) {
|
vue
|
vuejs
|
TypeScript
|
TypeScript
| 208,427
| 33,725
|
This is the repo for Vue 2. For Vue 3, go to https://github.com/vuejs/core
|
vuejs_vue
|
CODE_IMPROVEMENT
|
version changes
|
fc614b796cc481a5bd10bdb887e36ef44adecad4
|
2024-09-10 23:10:57
|
Constantin Graf
|
Increased timeout for ARM build
| false
| 1
| 1
| 2
|
--- .github/workflows/build-public.yml
@@ -18,7 +18,7 @@ jobs:
contents: read
attestations: write
id-token: write
- timeout-minutes: 90
+ timeout-minutes: 10
steps:
- name: "Check out code"
|
solidtime
|
solidtime-io
|
PHP
|
PHP
| 5,267
| 278
|
Modern open-source time-tracking app
|
solidtime-io_solidtime
|
CONFIG_CHANGE
|
timeout increased in yml file
|
87eccc3a81d133e636150268c24d2220cad70fe5
|
2025-03-03 22:08:05
|
Phillip Wood
|
meson: fix building technical and howto docs

When our asciidoc files were renamed from "*.txt" to "*.adoc" in 1f010d6bdf7 (doc: use .adoc extension for AsciiDoc files, 2025-01-20) the "meson.build" file in "Documentation" was updated but the "meson.build" files in the "technical" and "howto" subdirectories were not. This causes the meson build to fail when configured with -Ddocs=html. Fix this by updating the relevant "meson.build" files. Signed-off-by: Phillip Wood <[email protected]> Acked-by: Patrick Steinhardt <[email protected]> Signed-off-by: Junio C Hamano <[email protected]>
| false
| 48
| 48
| 96
|
--- Documentation/howto/meson.build
@@ -1,20 +1,20 @@
howto_sources = [
- 'coordinate-embargoed-releases.adoc',
- 'keep-canonical-history-correct.adoc',
- 'maintain-git.adoc',
- 'new-command.adoc',
- 'rebase-from-internal-branch.adoc',
- 'rebuild-from-update-hook.adoc',
- 'recover-corrupted-blob-object.adoc',
- 'recover-corrupted-object-harder.adoc',
- 'revert-a-faulty-merge.adoc',
- 'revert-branch-rebase.adoc',
- 'separating-topic-branches.adoc',
- 'setup-git-server-over-http.adoc',
- 'update-hook-example.adoc',
- 'use-git-daemon.adoc',
- 'using-merge-subtree.adoc',
- 'using-signed-tag-in-pull-request.adoc',
+ 'coordinate-embargoed-releases.txt',
+ 'keep-canonical-history-correct.txt',
+ 'maintain-git.txt',
+ 'new-command.txt',
+ 'rebase-from-internal-branch.txt',
+ 'rebuild-from-update-hook.txt',
+ 'recover-corrupted-blob-object.txt',
+ 'recover-corrupted-object-harder.txt',
+ 'revert-a-faulty-merge.txt',
+ 'revert-branch-rebase.txt',
+ 'separating-topic-branches.txt',
+ 'setup-git-server-over-http.txt',
+ 'update-hook-example.txt',
+ 'use-git-daemon.txt',
+ 'using-merge-subtree.txt',
+ 'using-signed-tag-in-pull-request.txt',
]
howto_index = custom_target(
@@ -26,7 +26,7 @@ howto_index = custom_target(
env: script_environment,
capture: true,
input: howto_sources,
- output: 'howto-index.adoc',
+ output: 'howto-index.txt',
)
custom_target(
--- Documentation/technical/meson.build
@@ -1,37 +1,37 @@
api_docs = [
- 'api-error-handling.adoc',
- 'api-merge.adoc',
- 'api-parse-options.adoc',
- 'api-simple-ipc.adoc',
- 'api-trace2.adoc',
+ 'api-error-handling.txt',
+ 'api-merge.txt',
+ 'api-parse-options.txt',
+ 'api-simple-ipc.txt',
+ 'api-trace2.txt',
]
articles = [
- 'bitmap-format.adoc',
- 'build-systems.adoc',
- 'bundle-uri.adoc',
- 'commit-graph.adoc',
- 'directory-rename-detection.adoc',
- 'hash-function-transition.adoc',
- 'long-running-process-protocol.adoc',
- 'multi-pack-index.adoc',
- 'packfile-uri.adoc',
- 'pack-heuristics.adoc',
- 'parallel-checkout.adoc',
- 'partial-clone.adoc',
- 'platform-support.adoc',
- 'racy-git.adoc',
- 'reftable.adoc',
- 'remembering-renames.adoc',
- 'repository-version.adoc',
- 'rerere.adoc',
- 'scalar.adoc',
- 'send-pack-pipeline.adoc',
- 'shallow.adoc',
- 'sparse-checkout.adoc',
- 'sparse-index.adoc',
- 'trivial-merge.adoc',
- 'unit-tests.adoc',
+ 'bitmap-format.txt',
+ 'build-systems.txt',
+ 'bundle-uri.txt',
+ 'commit-graph.txt',
+ 'directory-rename-detection.txt',
+ 'hash-function-transition.txt',
+ 'long-running-process-protocol.txt',
+ 'multi-pack-index.txt',
+ 'packfile-uri.txt',
+ 'pack-heuristics.txt',
+ 'parallel-checkout.txt',
+ 'partial-clone.txt',
+ 'platform-support.txt',
+ 'racy-git.txt',
+ 'reftable.txt',
+ 'remembering-renames.txt',
+ 'repository-version.txt',
+ 'rerere.txt',
+ 'scalar.txt',
+ 'send-pack-pipeline.txt',
+ 'shallow.txt',
+ 'sparse-checkout.txt',
+ 'sparse-index.txt',
+ 'trivial-merge.txt',
+ 'unit-tests.txt',
]
api_index = custom_target(
@@ -43,7 +43,7 @@ api_index = custom_target(
],
env: script_environment,
input: api_docs,
- output: 'api-index.adoc',
+ output: 'api-index.txt',
)
custom_target(
|
git
| null |
C
|
C
| null | null |
Version control
|
_git
|
CODE_IMPROVEMENT
|
Code change: identifier renaming
|
48dac52685b6e7a7017aa7ed8407746453737b12
|
2024-06-26 11:54:46
|
Łukasz Jan Niemier
|
ft: use RPC helper function everywhere (#367)
| false
| 21
| 106
| 127
|
--- VERSION
@@ -1 +1 @@
-1.1.66
+1.1.65
--- lib/supavisor.ex
@@ -92,7 +92,16 @@ defmodule Supavisor do
if node() == dest_node do
subscribe_local(pid, id)
else
- H.rpc(dest_node, __MODULE__, :subscribe_local, [pid, id], 15_000)
+ try do
+ # TODO: tests for different cases
+ :erpc.call(dest_node, __MODULE__, :subscribe_local, [pid, id], 15_000)
+ |> case do
+ {:EXIT, _} = badrpc -> {:error, {:badrpc, badrpc}}
+ result -> result
+ end
+ catch
+ kind, reason -> {:error, {:badrpc, {kind, reason}}}
+ end
end
end
--- lib/supavisor/helpers.ex
@@ -329,11 +329,12 @@ defmodule Supavisor.Helpers do
def rpc(node, module, function, args, timeout \\ 15_000) do
try do
:erpc.call(node, module, function, args, timeout)
+ |> case do
+ {:EXIT, _} = badrpc -> {:error, {:badrpc, badrpc}}
+ result -> result
+ end
catch
kind, reason -> {:error, {:badrpc, {kind, reason}}}
- else
- {:EXIT, _} = badrpc -> {:error, {:badrpc, badrpc}}
- result -> result
end
end
--- lib/supavisor_web/ws_proxy.ex
@@ -61,6 +61,6 @@ defmodule SupavisorWeb.WsProxy do
@spec connect_local() :: {:ok, port()} | {:error, term()}
defp connect_local() do
proxy_port = Application.fetch_env!(:supavisor, :proxy_port_transaction)
- :gen_tcp.connect(~c"localhost", proxy_port, [:binary, packet: :raw, active: true])
+ :gen_tcp.connect('localhost', proxy_port, [:binary, packet: :raw, active: true])
end
end
--- priv/repo/migrations/20230619091028_add_tenant_ip_version.exs
@@ -10,8 +10,7 @@ defmodule Supavisor.Repo.Migrations.AddTenantIpVersion do
constraint(
"tenants",
:ip_version_values,
- check: "ip_version IN ('auto', 'v4', 'v6')",
- prefix: "_supavisor"
+ check: "ip_version IN ('auto', 'v4', 'v6')"
)
)
end
--- priv/repo/migrations/20230919100141_create_cluster_tenants.exs
@@ -9,17 +9,12 @@ defmodule Supavisor.Repo.Migrations.CreateClusterTenants do
add(
:cluster_alias,
- references(:clusters,
- on_delete: :delete_all,
- type: :string,
- column: :alias,
- prefix: "_supavisor"
- )
+ references(:clusters, on_delete: :delete_all, type: :string, column: :alias)
)
add(
:tenant_external_id,
- references(:tenants, type: :string, column: :external_id, prefix: "_supavisor")
+ references(:tenants, type: :string, column: :external_id)
)
timestamps()
@@ -29,8 +24,7 @@ defmodule Supavisor.Repo.Migrations.CreateClusterTenants do
constraint(
:cluster_tenants,
:type,
- check: "type IN ('read', 'write')",
- prefix: "_supavisor"
+ check: "type IN ('read', 'write')"
)
)
--- priv/repo/migrations/20231004133121_add_default_pool_strategy.exs
@@ -10,8 +10,7 @@ defmodule Supavisor.Repo.Migrations.AddDefaultPoolStrategy do
constraint(
"tenants",
:default_pool_strategy_values,
- check: "default_pool_strategy IN ('fifo', 'lifo')",
- prefix: "_supavisor"
+ check: "default_pool_strategy IN ('fifo', 'lifo')"
)
)
end
--- priv/repo/migrations/mix.exs
@@ -0,0 +1,83 @@
+defmodule Supavisor.MixProject do
+ use Mix.Project
+
+ def project do
+ [
+ app: :supavisor,
+ version: "0.0.1",
+ elixir: "~> 1.14",
+ elixirc_paths: elixirc_paths(Mix.env()),
+ start_permanent: Mix.env() == :prod,
+ aliases: aliases(),
+ deps: deps()
+ ]
+ end
+
+ # Configuration for the OTP application.
+ #
+ # Type `mix help compile.app` for more information.
+ def application do
+ [
+ mod: {Supavisor.Application, []},
+ extra_applications: [:logger, :runtime_tools]
+ ]
+ end
+
+ # Specifies which paths to compile per environment.
+ defp elixirc_paths(:test), do: ["lib", "test/support"]
+ defp elixirc_paths(_), do: ["lib"]
+
+ # Specifies your project dependencies.
+ #
+ # Type `mix help deps` for examples and options.
+ defp deps do
+ [
+ {:phoenix, "~> 1.6.13"},
+ {:phoenix_ecto, "~> 4.4"},
+ {:ecto_sql, "~> 3.6"},
+ {:postgrex, ">= 0.0.0"},
+ {:phoenix_html, "~> 3.0"},
+ {:phoenix_live_reload, "~> 1.2", only: :dev},
+ {:phoenix_live_view, "~> 0.17.5"},
+ {:phoenix_live_dashboard, "~> 0.6"},
+ {:telemetry_metrics, "~> 0.6"},
+ {:telemetry_poller, "~> 1.0"},
+ {:jason, "~> 1.2"},
+ {:plug_cowboy, "~> 2.5"},
+ {:joken, "~> 2.5.0"},
+ {:cloak_ecto, "~> 1.2.0"},
+ {:meck, "~> 0.9.2", only: :test},
+ {:credo, "~> 1.6.4", only: [:dev, :test], runtime: false},
+ {:dialyxir, "~> 1.1.0", only: [:dev], runtime: false},
+ {:benchee, "~> 1.1.0", only: :dev},
+
+ # pooller
+ {:poolboy, "~> 1.5.2"},
+ {:syn, "~> 3.3"},
+ {:pgo, "~> 0.12"}
+ # TODO: add ranch deps
+ ]
+ end
+
+ # Aliases are shortcuts or tasks specific to the current project.
+ # For example, to install project dependencies and perform other setup tasks, run:
+ #
+ # $ mix setup
+ #
+ # See the documentation for `Mix` for more info on aliases.
+ defp aliases do
+ [
+ setup: ["deps.get", "ecto.setup"],
+ "ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"],
+ "ecto.reset": ["ecto.drop", "ecto.setup"],
+ # test: ["ecto.create --quiet", "ecto.migrate --quiet", "test"]
+ test: [
+ "ecto.create",
+ "run priv/repo/seeds_before_migration.exs",
+ "ecto.migrate --prefix _supavisor --log-migrator-sql",
+ "run priv/repo/seeds_after_migration.exs",
+ "test"
+ ]
+ ]
+ end
+end
--- test/integration/proxy_test.exs
@@ -173,8 +173,8 @@ defmodule Supavisor.Integration.ProxyTest do
"http://localhost:#{Application.get_env(:supavisor, :proxy_port_transaction)}"
) ==
{:ok,
- {{~c"HTTP/1.1", 204, ~c"OK"},
- [{~c"x-app-version", Application.spec(:supavisor, :vsn)}], []}}
+ {{'HTTP/1.1', 204, 'OK'}, [{'x-app-version', Application.spec(:supavisor, :vsn)}],
+ []}}
end
test "checks that client_handler is idle and db_pid is nil for transaction mode" do
|
supavisor
|
supabase
|
Elixir
|
Elixir
| 1,869
| 64
|
A cloud-native, multi-tenant Postgres connection pooler.
|
supabase_supavisor
|
CODE_IMPROVEMENT
|
code refactoring to use helper function
|
da05b9d5b33555dbfc72c650e9ca87580ef72f31
|
2024-05-08 18:30:50
|
Cameron Carstens
|
Reference Sway-Libs, Sway-Standards, and Sway-Applications In Sway Book (#5944) ## Description
Previously the Sway book did not have any sections on Sway-Libs,
Sway-Standards, or Sway-Applications and was sparsely mentioned.
Sections for each repository have been added with a brief overview of
what they contain. Links to any app, standard, or library have also been
provided.
Any use of the sway-libs repository has been replaced with the
[Sway-Libs book](https://fuellabs.github.io/sway-libs/book/index.html).
This should eventually be replaced with a doc hub link when it has been
integrated and the same for Sway-Apps and Sway-Libs.
Closes #5780
## Checklist
- [x] I have linked to any relevant issues.
- [ ] I have commented my code, particularly in hard-to-understand
areas.
- [x] I have updated the documentation where relevant (API docs, the
reference, and the Sway book).
- [x] If my change requires substantial documentation changes, I have
[requested support from the DevRel
team](https://github.com/FuelLabs/devrel-requests/issues/new/choose)
- [x] I have added tests that prove my fix is effective or that my
feature works.
- [x] I have added (or requested a maintainer to add) the necessary
`Breaking*` or `New Feature` labels where relevant.
- [x] I have done my best to ensure that my PR adheres to [the Fuel Labs
Code Review
Standards](https://github.com/FuelLabs/rfcs/blob/master/text/code-standards/external-contributors.md).
- [x] I have requested a review from the relevant team or maintainers.
---------
Co-authored-by: K1-R1 <[email protected]>
| false
| 168
| 6
| 174
|
--- docs/book/spell-check-custom-words.txt
@@ -189,20 +189,8 @@ unary
SRC
DEX
SubId
-Pausable
-Libs
-Reentrancy
-reentrancy
-mathematic
-Soulbound
-NFTs
-NFT
-dApps
-fungible
-TicTacToe
-DAO
-Timelock
transpiler
+NFT
namespacing
unsafety
prioritizations
\ No newline at end of file
--- docs/book/src/SUMMARY.md
@@ -7,13 +7,11 @@
- [The Fuel Toolchain](./introduction/fuel_toolchain.md)
- [A Forc Project](./introduction/forc_project.md)
- [Standard Library](./introduction/standard_library.md)
- - [Sway Language Standards](./introduction/sway_standards.md)
- [Examples](./examples/index.md)
- [Counter](./examples/counter.md)
- [`FizzBuzz`](./examples/fizzbuzz.md)
- [Wallet Smart Contract](./examples/wallet_smart_contract.md)
- [Liquidity Pool](./examples/liquidity_pool.md)
- - [Sway Applications](./examples/sway_applications.md)
- [Program Types](./sway-program-types/index.md)
- [Contracts](./sway-program-types/smart_contracts.md)
- [Libraries](./sway-program-types/libraries.md)
@@ -65,7 +63,6 @@
- [Features](./lsp/features.md)
- [Troubleshooting](./lsp/troubleshooting.md)
- [Sway Reference](./reference/index.md)
- - [Sway Libraries](./reference/sway_libs.md)
- [Compiler Intrinsics](./reference/compiler_intrinsics.md)
- [Attributes](./reference/attributes.md)
- [Style Guide](./reference/style_guide.md)
--- docs/book/src/blockchain-development/access_control.md
@@ -29,7 +29,7 @@ The `msg_sender` function works as follows:
Many contracts require some form of ownership for access control. The [SRC-5 Ownership Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) has been defined to provide an interoperable interface for ownership within contracts.
-To accomplish this, use the [Ownership Library](https://fuellabs.github.io/sway-libs/book/ownership/index.html) to keep track of the owner. This allows setting and revoking ownership using the variants `Some(..)` and `None` respectively. This is better, safer, and more readable than using the `Identity` type directly where revoking ownership has to be done using some magic value such as `std::constants::ZERO_B256` or otherwise.
+To accomplish this, use the [Ownership Library](https://github.com/FuelLabs/sway-libs/tree/master/libs/src/ownership) to keep track of the owner. This allows setting and revoking ownership using the variants `Some(..)` and `None` respectively. This is better, safer, and more readable than using the `Identity` type directly where revoking ownership has to be done using some magic value such as `std::constants::ZERO_B256` or otherwise.
- The following is an example of how to properly lock a function such that only the owner may call a function:
@@ -62,12 +62,3 @@ Setting ownership can be done in one of two ways; During compile time or run tim
```sway
{{#include ../../../../examples/ownership/src/main.sw:get_owner_example}}
```
-
-## Access Control Libraries
-
-[Sway-Libs](../reference/sway_libs.md) provides the following libraries to enable further access control.
-
-- [Ownership Library](https://fuellabs.github.io/sway-libs/book/ownership/index.html); used to apply restrictions on functions such that only a **single** user may call them. This library provides helper functions for the [SRC-5; Ownership Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md).
-- [Admin Library](https://fuellabs.github.io/sway-libs/book/admin/index.html); used to apply restrictions on functions such that only a select few users may call them like a whitelist.
-- [Pausable Library](https://fuellabs.github.io/sway-libs/book/pausable/index.html); allows contracts to implement an emergency stop mechanism.
-- [Reentrancy Guard Library](https://fuellabs.github.io/sway-libs/book/reentrancy/index.html); used to detect and prevent reentrancy attacks.
--- docs/book/src/blockchain-development/calling_contracts.md
@@ -78,7 +78,7 @@ fn main() {
A common attack vector for smart contracts is [re-entrancy](https://docs.soliditylang.org/en/v0.8.4/security-considerations.html#re-entrancy). Similar to the EVM, the FuelVM allows for re-entrancy.
-A _stateless_ re-entrancy guard is included in the [`sway-libs`](https://fuellabs.github.io/sway-libs/book/reentrancy/index.html) library. The guard will panic (revert) at run time if re-entrancy is detected.
+A _stateless_ re-entrancy guard is included in the [`sway-libs`](https://github.com/FuelLabs/sway-libs) library. The guard will panic (revert) at run time if re-entrancy is detected.
```sway
contract;
--- docs/book/src/blockchain-development/native_assets.md
@@ -14,18 +14,8 @@ On the FuelVM, _all_ assets are native and the process for sending _any_ native
While you would still need a smart contract to handle the minting and burning of assets, the sending and receiving of these assets can be done independently of the asset contract.
-Just like the EVM however, Fuel has a standard that describes a standard API for Native Assets using the Sway Language. The ERC-20 equivalent for the Sway Language is the [SRC-20; Native Asset Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md).
-
> **NOTE** It is important to note that Fuel does not have tokens.
-### ERC-721 vs Native Asset
-
-On the EVM, an ERC-721 token or NFT is a contract that contains multiple tokens which are non-fungible with one another.
-
-On the FuelVM, the ERC-721 equivalent is a Native Asset where each asset has a supply of one. This is defined in the [SRC-20; Native Asset Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md#non-fungible-asset-restrictions) under the Non-Fungible Asset Restrictions.
-
-In practice, this means all NFTs are treated the same as any other Native Asset on Fuel. When writing Sway code, no additional cases for handling non-fungible and fungible assets are required.
-
### No Token Approvals
An advantage Native Assets bring is that there is no need for token approvals; as with Ether on the EVM. With millions of dollars hacked every year due to misused token approvals, the FuelVM eliminates this attack vector.
@@ -106,8 +96,6 @@ You may also mint an asset to a specific entity with the `std::asset::mint_to()`
{{#include ../../../../examples/native_asset/src/main.sw:mint_to_asset}}
```
-If you intend to allow external users to mint assets using your contract, the [SRC-3; Mint and Burn Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md#fn-mintrecipient-identity-vault_sub_id-subid-amount-u64) defines a standard API for minting assets. The [Sway-Libs Asset Library](https://fuellabs.github.io/sway-libs/book/asset/supply.html) also provides an additional library to support implementations of the SRC-3 Standard into your contract.
-
### Burning a Native Asset
To burn an asset, the `std::asset::burn()` function must be called internally from the contract which minted them. The `SubId` used to mint the coins and amount must be provided. The burned coins must be owned by the contract. When an asset is burned it doesn't exist anymore.
@@ -116,8 +104,6 @@ To burn an asset, the `std::asset::burn()` function must be called internally fr
{{#include ../../../../examples/native_asset/src/main.sw:burn_asset}}
```
-If you intend to allow external users to burn assets using your contract, the [SRC-3; Mint and Burn Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md#fn-mintrecipient-identity-vault_sub_id-subid-amount-u64) defines a standard API for burning assets. The [Sway-Libs Asset Library](https://fuellabs.github.io/sway-libs/book/asset/supply.html) also provides an additional library to support implementations of the SRC-3 Standard into your contract.
-
### Transfer a Native Asset
To internally transfer a Native Asset, the `std::asset::transfer()` function must be called. A target `Identity` or user must be provided as well as the `AssetId` of the asset and an amount.
@@ -180,20 +166,13 @@ We currently have the following standards for Native Assets:
- [SRC-3; Mint and Burn Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md) is used to enable mint and burn functionality for Native Assets.
- [SRC-7; Arbitrary Asset Metadata Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-7.md) is used to store metadata for Native Assets.
- [SRC-6; Vault Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-6.md) defines the implementation of a standard API for asset vaults developed in Sway.
-
-## Native Asset Libraries
-
-Additional Libraries have been developed to allow you to quickly create and deploy dApps that follow the [Sway Standards](https://github.com/FuelLabs/sway-standards).
-
-- [Asset Library](https://fuellabs.github.io/sway-libs/book/asset/index.html) provides functionality to implement the [SRC-20; Native Asset Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3; Mint and Burn Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-7; Arbitrary Asset Metadata Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-7.md) standards.
-
<!-- native_assets:example:end -->
## Single Native Asset Example
In this fully fleshed out example, we show a native asset contract which mints a single asset. This is the equivalent of the ERC-20 Standard used in Ethereum. Note there are no token approval functions.
-It implements the [SRC-20; Native Asset](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3; Mint and Burn](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-5; Ownership](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) standards. It does not use any external libraries.
+It implements the [SRC-20; Native Asset](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3; Mint and Burn](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-5; Ownership](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) standards.
```sway
// ERC20 equivalent in Sway.
@@ -335,7 +314,7 @@ fn require_access_owner() {
In this fully fleshed out example, we show a native asset contract which mints multiple assets. This is the equivalent of the ERC-1155 Standard used in Ethereum. Note there are no token approval functions.
-It implements the [SRC-20; Native Asset](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3; Mint and Burn](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-5; Ownership](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) standards. It does not use any external libraries.
+It implements the [SRC-20; Native Asset](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3; Mint and Burn](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-5; Ownership](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) standards.
```sway
// ERC1155 equivalent in Sway.
--- docs/book/src/examples/index.md
@@ -5,6 +5,5 @@ Some basic example contracts to see how Sway and Forc work.
- [Counter](./counter.md)
- [`FizzBuzz`](./fizzbuzz.md)
- [Wallet Smart Contract](./wallet_smart_contract.md)
-- [Liquidity Pool](./wallet_smart_contract.md)
Additional examples can be found in the [Sway Applications](https://github.com/FuelLabs/sway-applications/tree/master) repository.
--- docs/book/src/examples/sway_applications.md
@@ -1,33 +0,0 @@
-# Sway Applications
-
-The [Sway-Applications](https://github.com/FuelLabs/sway-applications) Repository contains end-to-end example applications that are written in Sway in order to demonstrate what can be built.
-
-## Asset Management
-
-- [Airdrop](https://github.com/FuelLabs/sway-applications/tree/master/airdrop) is an asset distribution program where users are able to claim assets given a valid merkle proof.
-- [Escrow](https://github.com/FuelLabs/sway-applications/tree/master/escrow) is a third party that keeps an asset on behalf of multiple parties.
-- [Non-Fungible Native Asset (NFT)](https://github.com/FuelLabs/sway-applications/tree/master/NFT) is an asset contract which provides unique collectibles, identified and differentiated by IDs, where assets contain metadata giving them distinctive characteristics.
-- [Fractional Non-Fungible Token (F-NFT)](https://github.com/FuelLabs/sway-applications/tree/master/fractional-NFT) is a token contract which issues shares or partial ownership upon locking an NFT into a vault.
-- [Timelock](https://github.com/FuelLabs/sway-applications/tree/master/timelock) is a contract which restricts the execution of a transaction to a specified time range.
-- [Native Asset](https://github.com/FuelLabs/sway-applications/tree/master/native-asset) is a basic asset contract that enables the use of Native Assets on Fuel using existing standards and libraries.
-
-## Decentralized Finance
-
-- [English Auction](https://github.com/FuelLabs/sway-applications/tree/master/english-auction) is an auction where users bid up the price of an asset until the bidding period has ended or a reserve has been met.
-- [Fundraiser](https://github.com/FuelLabs/sway-applications/tree/master/fundraiser) is a program allowing users to pledge towards a goal.
-- [OTC Swap Predicate](https://github.com/FuelLabs/sway-applications/tree/master/OTC-swap-predicate) is a predicate that can be used to propose and execute an atomic swap between two parties without requiring any on-chain state.
-
-## Governance
-
-- [Decentralized Autonomous Organization (DAO)](https://github.com/FuelLabs/sway-applications/tree/master/DAO) is an organization where users get to vote on governance proposals using governance assets.
-- [Multi-Signature Wallet](https://github.com/FuelLabs/sway-applications/tree/master/multisig-wallet) is a wallet that requires multiple signatures to execute a transaction.
-
-## Games
-
-- [TicTacToe](https://github.com/FuelLabs/sway-applications/tree/master/TicTacToe) is a game where two players compete to align three markers in a row.
-
-## Other
-
-- [Counter-Script](https://github.com/FuelLabs/sway-applications/tree/master/counter-script) is a script that calls a contract to increment a counter.
-- [Name-Registry](https://github.com/FuelLabs/sway-applications/tree/master/name-registry) allows users to perform transactions with human readable names instead of addresses.
-- [Oracle](https://github.com/FuelLabs/sway-applications/tree/master/oracle) is a smart contract that provides off-chain data to on-chain applications.
--- docs/book/src/introduction/index.md
@@ -6,4 +6,3 @@ To get started with Forc and Sway smart contract development, install the Fuel t
- [The Fuel Toolchain](./fuel_toolchain.md)
- [A Forc Project](./forc_project.md)
- [Standard Library](./standard_library.md)
-- [Sway Language Standards](./sway_standards.md)
--- docs/book/src/introduction/sway_standards.md
@@ -1,40 +0,0 @@
-# Sway Standards
-
-Just like many other smart contract languages, usage standards have been developed to enable cross compatibility between smart contracts.
-
-For more information on using a Sway Standard, please refer to the [Sway-Standards Repository](https://github.com/FuelLabs/sway-standards).
-
-## Standards
-
-### Native Asset Standards
-
-- [SRC-20; Native Asset Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md) defines the implementation of a standard API for [Native Assets](../blockchain-development/native_assets.md) using the Sway Language.
-- [SRC-3; Mint and Burn](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md) is used to enable mint and burn functionality for Native Assets.
-- [SRC-7; Arbitrary Asset Metadata Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-7.md) is used to store metadata for [Native Assets](../blockchain-development/native_assets.md), usually as NFTs.
-- [SRC-9; Metadata Keys Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-9.md) is used to store standardized metadata keys for [Native Assets](../blockchain-development/native_assets.md) in combination with the SRC-7 standard.
-- [SRC-6; Vault Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-6.md) defines the implementation of a standard API for asset vaults developed in Sway.
-
-### Predicate Standards
-
-- [SRC-13; Soulbound Address Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-13.md) defines a specific `Address` as a Soulbound Address for Soulbound Assets to become non-transferable.
-
-### Access Control Standards
-
-- [SRC-5; Ownership Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md) is used to restrict function calls to admin users in contracts.
-
-### Contract Standards
-
-- [SRC-12; Contract Factory](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-12.md) defines the implementation of a standard API for contract factories.
-
-### Bridge Standards
-
-- [SRC-8; Bridged Asset](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-8.md) defines the metadata required for an asset bridged to the Fuel Network.
-- [SRC-10; Native Bridge Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-10.md) defines the standard API for the Native Bridge between the Fuel Chain and the canonical base chain.
-
-### Documentation Standards
-
-- [SRC-2; Inline Documentation](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-2.md) defines how to document your Sway files.
-
-## Standards Support
-
-Libraries have also been developed to support Sway Standards. These can be in [Sway-Libs](../reference/sway_libs.md).
--- docs/book/src/reference/sway_libs.md
@@ -1,42 +0,0 @@
-# Sway Libraries
-
-The purpose of Sway Libraries is to contain libraries which users can import and use that are not part of the standard library.
-
-These libraries contain helper functions and other tools valuable to blockchain development.
-
-For more information on how to use a Sway-Libs library, please refer to the [Sway-Libs Book](https://fuellabs.github.io/sway-libs/book/getting_started/index.html).
-
-## Assets Libraries
-
-Asset Libraries are any libraries that use [Native Assets](../blockchain-development/native_assets.md) on the Fuel Network.
-
-- [Asset Library](https://fuellabs.github.io/sway-libs/book/asset/index.html); provides helper functions for the [SRC-20](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-20.md), [SRC-3](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-3.md), and [SRC-7](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-7.md) standards.
-
-## Access Control and Security Libraries
-
-Access Control and Security Libraries are any libraries that are built and intended to provide additional safety when developing smart contracts.
-
-- [Ownership Library](https://fuellabs.github.io/sway-libs/book/ownership/index.html); used to apply restrictions on functions such that only a **single** user may call them. This library provides helper functions for the [SRC-5; Ownership Standard](https://github.com/FuelLabs/sway-standards/blob/master/SRCs/src-5.md).
-- [Admin Library](https://fuellabs.github.io/sway-libs/book/admin/index.html); used to apply restrictions on functions such that only a select few users may call them like a whitelist.
-- [Pausable Library](https://fuellabs.github.io/sway-libs/book/pausable/index.html); allows contracts to implement an emergency stop mechanism.
-- [Reentrancy Guard Library](https://fuellabs.github.io/sway-libs/book/reentrancy/index.html); used to detect and prevent reentrancy attacks.
-
-## Cryptography Libraries
-
-Cryptography Libraries are any libraries that provided cryptographic functionality beyond what the std-lib provides.
-
-- [Bytecode Library](https://fuellabs.github.io/sway-libs/book/bytecode/index.html); used for on-chain verification and computation of bytecode roots for contracts and predicates.
-- [Merkle Proof Library](https://fuellabs.github.io/sway-libs/book/merkle/index.html); used to verify Binary Merkle Trees computed off-chain.
-
-## Math Libraries
-
-Math Libraries are libraries which provide mathematic functions or number types that are outside of the std-lib's scope.
-
-- [Fixed Point Number Library](https://fuellabs.github.io/sway-libs/book/fixed_point/index.html); an interface to implement fixed-point numbers.
-- [Signed Integers Library](https://fuellabs.github.io/sway-libs/book/signed_integers/index.html); an interface to implement signed integers.
-
-## Data Structures Libraries
-
-Data Structure Libraries are libraries which provide complex data structures which unlock additional functionality for Smart Contracts.
-
-- [Queue Library](https://fuellabs.github.io/sway-libs/book/queue/index.html); a linear data structure that provides First-In-First-Out (FIFO) operations.
--- docs/reference/src/documentation/operations/reentrancy.md
@@ -9,7 +9,7 @@ To mitigate security concerns there are two approaches that are commonly used:
## Re-entrancy Guard
-Sway provides a stateless [re-entrancy](https://fuellabs.github.io/sway-libs/book/reentrancy/index.html) guard, which reverts at run-time when re-entrancy is detected.
+Sway provides a stateless [re-entrancy](https://github.com/FuelLabs/sway-libs/tree/master/libs/src/reentrancy) guard, which reverts at run-time when re-entrancy is detected.
To use the guard we must import it.
|
sway
|
fuellabs
|
Rust
|
Rust
| 62,435
| 5,382
|
🌴 Empowering everyone to build reliable and efficient smart contracts.
|
fuellabs_sway
|
DOC_CHANGE
|
Obvious
|
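The Sway docs quoted in the record above describe a stateless re-entrancy guard that reverts at run time when a function is re-entered. The mechanism can be sketched in any language; the following Python decorator is a hypothetical illustration of the idea only (names such as `reentrancy_guard` and `withdraw` are invented here), not the Sway library's implementation.

```python
class ReentrancyError(Exception):
    """Raised when a guarded function is re-entered."""


def reentrancy_guard(fn):
    """Revert (raise) if fn is called again before the outer call returns."""
    entered = False

    def wrapper(*args, **kwargs):
        nonlocal entered
        if entered:
            raise ReentrancyError(f"re-entrancy detected in {fn.__name__}")
        entered = True
        try:
            return fn(*args, **kwargs)
        finally:
            # Reset the flag even if fn raised, so later calls still work.
            entered = False

    return wrapper


@reentrancy_guard
def withdraw(callback):
    # Simulate an external call that may call back into withdraw.
    callback()
    return "ok"
```

A re-entrant call path (`withdraw` calling back into `withdraw` via the callback) raises instead of silently recursing, which mirrors the revert-on-detection behaviour the docs describe.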
9ef6650207338c41eccb0e303b865ac8c3ff3e2b
|
2022-10-10 18:35:58
|
Oleksandr (Sasha) Khivrych
|
feat: added ukrainian translations for hash table (#948) Co-authored-by: Oleksii Trekhleb <[email protected]>
| false
| 35
| 5
| 40
|
--- src/data-structures/hash-table/README.md
@@ -6,13 +6,12 @@ _Read this in other languages:_
[_日本語_](README.ja-JP.md),
[_Français_](README.fr-FR.md),
[_Português_](README.pt-BR.md),
-[_한국어_](README.ko-KR.md),
-[_Українська_](README.uk-UA.md)
+[_한국어_](README.ko-KR.md)
In computing, a **hash table** (hash map) is a data
-structure which implements an _associative array_
-abstract data type, a structure that can _map keys
-to values_. A hash table uses a _hash function_ to
+structure which implements an *associative array*
+abstract data type, a structure that can *map keys
+to values*. A hash table uses a *hash function* to
compute an index into an array of buckets or slots,
from which the desired value can be found
@@ -29,7 +28,7 @@ Hash collision resolved by separate chaining.

-_Made with [okso.app](https://okso.app)_
+*Made with [okso.app](https://okso.app)*
## References
--- src/data-structures/hash-table/README.uk-UA.md
@@ -1,29 +0,0 @@
-# Геш таблиця
-
-**Геш таблиця** - структура даних, що реалізує абстрактний тип даних асоціативний масив, тобто. структура, яка
-_зв'язує ключі зі значеннями_. Геш-таблиця використовує _геш-функцію_ для обчислення індексу в масиві, в якому може
-бути знайдено бажане значення. Нижче представлена геш-таблиця, у якій ключем виступає ім'я людини, а значеннями
-телефонні номери. Геш-функція перетворює ключ-ім'я на індекс масиву з телефонними номерами.
-
-
-
-В ідеалі геш-функція присвоюватиме елементу масиву унікальний ключ. Проте більшість реальних геш-таблиць
-використовують недосконалі геш-функції. Це може призвести до ситуацій, коли геш-функція генерує однаковий індекс для
-кількох ключів. Ці ситуації називаються колізіями і мають бути якось вирішені.
-
-Існує два варіанти вирішення колізій - геш-таблиця з ланцюжками та з відкритою адресацією.
-
-Метод ланцюжків передбачає зберігання значень, відповідних одному й тому індексу як зв'язкового списку(ланцюжка).
-
-
-
-_Made with [okso.app](https://okso.app)_
-
-Метод відкритої адресації поміщає значення, для якого отримано дублюючий індекс, в першу вільну комірку.
-
-
-
-## Посилання
-
-- [Wikipedia](https://uk.wikipedia.org/wiki/%D0%93%D0%B5%D1%88-%D1%82%D0%B0%D0%B1%D0%BB%D0%B8%D1%86%D1%8F)
-- [YouTube](https://www.youtube.com/watch?v=WTYaboK-NMk)
|
javascript-algorithms
|
trekhleb
|
JavaScript
|
JavaScript
| 190,336
| 30,518
|
📝 Algorithms and data structures implemented in JavaScript with explanations and links to further readings
|
trekhleb_javascript-algorithms
|
DOC_CHANGE
|
Matched \.md\b in diff
|
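The hash-table README edited in the record above describes mapping keys to bucket indices with a hash function and resolving collisions by separate chaining. A minimal sketch of that scheme (class and method names are chosen here for illustration, not taken from the repository's implementation):

```python
class HashTable:
    """Minimal hash map with separate chaining: each bucket is a list."""

    def __init__(self, buckets=8):
        self._buckets = [[] for _ in range(buckets)]

    def _index(self, key):
        # Hash function maps the key to a bucket index.
        return hash(key) % len(self._buckets)

    def set(self, key, value):
        bucket = self._buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # key already present: overwrite
                return
        bucket.append((key, value))       # new key (or collision): chain it

    def get(self, key):
        for k, v in self._buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)
```

With only a handful of buckets, distinct keys regularly land in the same bucket; the per-bucket list is what keeps both values retrievable, which is exactly the collision case the README's diagram shows.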
33e701ab275779825d827f60ec1715acd239b48c
|
2025-01-16 07:10:41
|
Easy
|
Update README.md
| false
| 1
| 1
| 2
|
--- README.md
@@ -30,7 +30,7 @@
## 电子书
-- 可使用 mdbook-epub 工具自行编译:`mdbook-epub --standalone true` 然后 epub 在 book 目录下
+- 可使用 mdbook-epub 工具自行编译:`mdbook-epub --standalone` 然后 epub 在 book 目录下
- 扫码订阅《方法论》更新频道后下载: [进入](https://subdeer.cn/channel/landing/11)
## 在线阅读
|
one-person-businesses-methodology-v2.0
|
easychen
|
PHP
|
PHP
| 5,272
| 464
|
《一人企业方法论》第二版,也适合做其他副业(比如自媒体、电商、数字商品)的非技术人群。
|
easychen_one-person-businesses-methodology-v2.0
|
DOC_CHANGE
|
Obvious
|
77c6fd5ab2a76a6c442debe20735cff610d460a5
|
2024-09-13 22:56:51
|
Dmitry Lyzo
|
Improve direct profile ranking
| false
| 5
| 3
| 8
|
--- MediaBrowser.Model/Dlna/StreamBuilder.cs
@@ -19,10 +19,8 @@ namespace MediaBrowser.Model.Dlna
{
// Aliases
internal const TranscodeReason ContainerReasons = TranscodeReason.ContainerNotSupported | TranscodeReason.ContainerBitrateExceedsLimit;
- internal const TranscodeReason AudioCodecReasons = TranscodeReason.AudioBitrateNotSupported | TranscodeReason.AudioChannelsNotSupported | TranscodeReason.AudioProfileNotSupported | TranscodeReason.AudioSampleRateNotSupported | TranscodeReason.SecondaryAudioNotSupported | TranscodeReason.AudioBitDepthNotSupported | TranscodeReason.AudioIsExternal;
- internal const TranscodeReason AudioReasons = TranscodeReason.AudioCodecNotSupported | AudioCodecReasons;
- internal const TranscodeReason VideoCodecReasons = TranscodeReason.VideoResolutionNotSupported | TranscodeReason.AnamorphicVideoNotSupported | TranscodeReason.InterlacedVideoNotSupported | TranscodeReason.VideoBitDepthNotSupported | TranscodeReason.VideoBitrateNotSupported | TranscodeReason.VideoFramerateNotSupported | TranscodeReason.VideoLevelNotSupported | TranscodeReason.RefFramesNotSupported | TranscodeReason.VideoRangeTypeNotSupported | TranscodeReason.VideoProfileNotSupported;
- internal const TranscodeReason VideoReasons = TranscodeReason.VideoCodecNotSupported | VideoCodecReasons;
+ internal const TranscodeReason AudioReasons = TranscodeReason.AudioCodecNotSupported | TranscodeReason.AudioBitrateNotSupported | TranscodeReason.AudioChannelsNotSupported | TranscodeReason.AudioProfileNotSupported | TranscodeReason.AudioSampleRateNotSupported | TranscodeReason.SecondaryAudioNotSupported | TranscodeReason.AudioBitDepthNotSupported | TranscodeReason.AudioIsExternal;
+ internal const TranscodeReason VideoReasons = TranscodeReason.VideoCodecNotSupported | TranscodeReason.VideoResolutionNotSupported | TranscodeReason.AnamorphicVideoNotSupported | TranscodeReason.InterlacedVideoNotSupported | TranscodeReason.VideoBitDepthNotSupported | TranscodeReason.VideoBitrateNotSupported | TranscodeReason.VideoFramerateNotSupported | TranscodeReason.VideoLevelNotSupported | TranscodeReason.RefFramesNotSupported | TranscodeReason.VideoRangeTypeNotSupported | TranscodeReason.VideoProfileNotSupported;
internal const TranscodeReason DirectStreamReasons = AudioReasons | TranscodeReason.ContainerNotSupported | TranscodeReason.VideoCodecTagNotSupported;
private readonly ILogger _logger;
@@ -1316,7 +1314,7 @@ namespace MediaBrowser.Model.Dlna
}
}
- var rankings = new[] { TranscodeReason.VideoCodecNotSupported, VideoCodecReasons, TranscodeReason.AudioCodecNotSupported, AudioCodecReasons, ContainerReasons };
+ var rankings = new[] { VideoReasons, AudioReasons, ContainerReasons };
var rank = (ref TranscodeReason a) =>
{
var index = 1;
|
jellyfin
|
jellyfin
|
C#
|
C#
| 37,617
| 3,375
|
The Free Software Media System - Server Backend & API
|
jellyfin_jellyfin
|
PERF_IMPROVEMENT
|
simplify decoder draining logic
|
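The Jellyfin change above collapses fine-grained transcode-reason categories into three ranking masks (video, audio, container) and ranks a reason bitmask by the first category it overlaps. A toy Python sketch of that ranking idea, using a small invented subset of flag names (the real `TranscodeReason` enum and ranking lambda in Jellyfin differ):

```python
from enum import IntFlag, auto


class Reason(IntFlag):
    """Hypothetical subset of transcode-reason flags, for illustration only."""
    VideoCodecNotSupported = auto()
    VideoResolutionNotSupported = auto()
    AudioCodecNotSupported = auto()
    AudioBitrateNotSupported = auto()
    ContainerNotSupported = auto()


VIDEO_REASONS = Reason.VideoCodecNotSupported | Reason.VideoResolutionNotSupported
AUDIO_REASONS = Reason.AudioCodecNotSupported | Reason.AudioBitrateNotSupported
CONTAINER_REASONS = Reason.ContainerNotSupported

# Ordered from most to least severe, mirroring the simplified rankings array.
RANKINGS = [VIDEO_REASONS, AUDIO_REASONS, CONTAINER_REASONS]


def rank(reason: Reason) -> int:
    """Index of the first category the reason overlaps (lower = more severe)."""
    for i, mask in enumerate(RANKINGS):
        if reason & mask:
            return i
    return len(RANKINGS)
```

A combined reason is ranked by its worst (earliest) category, so a profile failing on both video and container sorts with the video failures.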
af0ddcd9f2b26e92b2a5313032266558a881e34d
|
2024-06-01 12:29:32
|
Edward Hsing
|
Update README.md
| false
| 1
| 0
| 1
|
--- README.md
@@ -1,6 +1,5 @@
# US.KG – A FREE NAME FOR EVERYONE
[Registry Website (https://nic.us.kg/)](https://nic.us.kg/)
-#### Due to high demand, the KYC process on GitHub requires a longer review time. If you need faster review, please consider donating $1 and sending an email; it is usually processed within a few hours.
## Domain names no longer cost
Now, regardless of your project, whether you’re an individual or an organization, you can easily register and own your own *.US.KG domain name, 100% completely free. You can host your website with any third-party DNS service you like, such as Cloudflare, FreeDNS by afraid, hostry…
|
freedomain
|
digitalplatdev
|
HTML
|
HTML
| 41,142
| 933
|
DigitalPlat FreeDomain: Free Domain For Everyone
|
digitalplatdev_freedomain
|
NEW_FEAT
|
Adding new domain management features
|
1d416f6626ad02a47b245697b32fb2c374e928f4
|
2024-08-24 09:40:36
|
fufesou
|
refact: flutter keyboard, map mode (#9160) Signed-off-by: fufesou <[email protected]>
| false
| 231
| 210
| 441
|
--- Cargo.lock
@@ -5187,7 +5187,7 @@ dependencies = [
[[package]]
name = "rdev"
version = "0.5.0-2"
-source = "git+https://github.com/rustdesk-org/rdev#d4c1759926d693ba269e2cb8cf9f87b13e424e4e"
+source = "git+https://github.com/rustdesk-org/rdev#b3434caee84c92412b45a2f655a15ac5dad33488"
dependencies = [
"cocoa 0.24.1",
"core-foundation 0.9.4",
--- flutter/lib/common/widgets/remote_input.dart
@@ -34,7 +34,8 @@ class RawKeyFocusScope extends StatelessWidget {
canRequestFocus: true,
focusNode: focusNode,
onFocusChange: onFocusChange,
- onKeyEvent: (node, event) => inputModel.handleKeyEvent(event),
+ onKey: (FocusNode data, RawKeyEvent e) =>
+ inputModel.handleRawKeyEvent(e),
child: child));
}
}
--- flutter/lib/models/desktop_render_texture.dart
@@ -181,7 +181,6 @@ class TextureModel {
}
updateCurrentDisplay(int curDisplay) {
- if (isWeb) return;
final ffi = parent.target;
if (ffi == null) return;
tryCreateTexture(int idx) {
--- flutter/lib/models/input_model.dart
@@ -178,15 +178,15 @@ class PointerEventToRust {
}
class ToReleaseKeys {
- KeyEvent? lastLShiftKeyEvent;
- KeyEvent? lastRShiftKeyEvent;
- KeyEvent? lastLCtrlKeyEvent;
- KeyEvent? lastRCtrlKeyEvent;
- KeyEvent? lastLAltKeyEvent;
- KeyEvent? lastRAltKeyEvent;
- KeyEvent? lastLCommandKeyEvent;
- KeyEvent? lastRCommandKeyEvent;
- KeyEvent? lastSuperKeyEvent;
+ RawKeyEvent? lastLShiftKeyEvent;
+ RawKeyEvent? lastRShiftKeyEvent;
+ RawKeyEvent? lastLCtrlKeyEvent;
+ RawKeyEvent? lastRCtrlKeyEvent;
+ RawKeyEvent? lastLAltKeyEvent;
+ RawKeyEvent? lastRAltKeyEvent;
+ RawKeyEvent? lastLCommandKeyEvent;
+ RawKeyEvent? lastRCommandKeyEvent;
+ RawKeyEvent? lastSuperKeyEvent;
reset() {
lastLShiftKeyEvent = null;
@@ -200,7 +200,67 @@ class ToReleaseKeys {
lastSuperKeyEvent = null;
}
- release(KeyEventResult Function(KeyEvent e) handleKeyEvent) {
+ updateKeyDown(LogicalKeyboardKey logicKey, RawKeyDownEvent e) {
+ if (e.isAltPressed) {
+ if (logicKey == LogicalKeyboardKey.altLeft) {
+ lastLAltKeyEvent = e;
+ } else if (logicKey == LogicalKeyboardKey.altRight) {
+ lastRAltKeyEvent = e;
+ }
+ } else if (e.isControlPressed) {
+ if (logicKey == LogicalKeyboardKey.controlLeft) {
+ lastLCtrlKeyEvent = e;
+ } else if (logicKey == LogicalKeyboardKey.controlRight) {
+ lastRCtrlKeyEvent = e;
+ }
+ } else if (e.isShiftPressed) {
+ if (logicKey == LogicalKeyboardKey.shiftLeft) {
+ lastLShiftKeyEvent = e;
+ } else if (logicKey == LogicalKeyboardKey.shiftRight) {
+ lastRShiftKeyEvent = e;
+ }
+ } else if (e.isMetaPressed) {
+ if (logicKey == LogicalKeyboardKey.metaLeft) {
+ lastLCommandKeyEvent = e;
+ } else if (logicKey == LogicalKeyboardKey.metaRight) {
+ lastRCommandKeyEvent = e;
+ } else if (logicKey == LogicalKeyboardKey.superKey) {
+ lastSuperKeyEvent = e;
+ }
+ }
+ }
+
+ updateKeyUp(LogicalKeyboardKey logicKey, RawKeyUpEvent e) {
+ if (e.isAltPressed) {
+ if (logicKey == LogicalKeyboardKey.altLeft) {
+ lastLAltKeyEvent = null;
+ } else if (logicKey == LogicalKeyboardKey.altRight) {
+ lastRAltKeyEvent = null;
+ }
+ } else if (e.isControlPressed) {
+ if (logicKey == LogicalKeyboardKey.controlLeft) {
+ lastLCtrlKeyEvent = null;
+ } else if (logicKey == LogicalKeyboardKey.controlRight) {
+ lastRCtrlKeyEvent = null;
+ }
+ } else if (e.isShiftPressed) {
+ if (logicKey == LogicalKeyboardKey.shiftLeft) {
+ lastLShiftKeyEvent = null;
+ } else if (logicKey == LogicalKeyboardKey.shiftRight) {
+ lastRShiftKeyEvent = null;
+ }
+ } else if (e.isMetaPressed) {
+ if (logicKey == LogicalKeyboardKey.metaLeft) {
+ lastLCommandKeyEvent = null;
+ } else if (logicKey == LogicalKeyboardKey.metaRight) {
+ lastRCommandKeyEvent = null;
+ } else if (logicKey == LogicalKeyboardKey.superKey) {
+ lastSuperKeyEvent = null;
+ }
+ }
+ }
+
+ release(KeyEventResult Function(RawKeyEvent e) handleRawKeyEvent) {
for (final key in [
lastLShiftKeyEvent,
lastRShiftKeyEvent,
@@ -213,7 +273,10 @@ class ToReleaseKeys {
lastSuperKeyEvent,
]) {
if (key != null) {
- handleKeyEvent(key);
+ handleRawKeyEvent(RawKeyUpEvent(
+ data: key.data,
+ character: key.character,
+ ));
}
}
}
@@ -276,116 +339,49 @@ class InputModel {
}
}
- void handleKeyDownEventModifiers(KeyEvent e) {
- KeyUpEvent upEvent(e) => KeyUpEvent(
- physicalKey: e.physicalKey,
- logicalKey: e.logicalKey,
- timeStamp: e.timeStamp,
- );
- if (e.logicalKey == LogicalKeyboardKey.altLeft) {
- if (!alt) {
- alt = true;
- }
- toReleaseKeys.lastLAltKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.altRight) {
- if (!alt) {
- alt = true;
- }
- toReleaseKeys.lastLAltKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.controlLeft) {
- if (!ctrl) {
- ctrl = true;
- }
- toReleaseKeys.lastLCtrlKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.controlRight) {
- if (!ctrl) {
- ctrl = true;
- }
- toReleaseKeys.lastRCtrlKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.shiftLeft) {
- if (!shift) {
- shift = true;
- }
- toReleaseKeys.lastLShiftKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.shiftRight) {
- if (!shift) {
- shift = true;
- }
- toReleaseKeys.lastRShiftKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.metaLeft) {
- if (!command) {
- command = true;
- }
- toReleaseKeys.lastLCommandKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.metaRight) {
- if (!command) {
- command = true;
- }
- toReleaseKeys.lastRCommandKeyEvent = upEvent(e);
- } else if (e.logicalKey == LogicalKeyboardKey.superKey) {
- if (!command) {
- command = true;
- }
- toReleaseKeys.lastSuperKeyEvent = upEvent(e);
- }
- }
-
- void handleKeyUpEventModifiers(KeyEvent e) {
- if (e.logicalKey == LogicalKeyboardKey.altLeft) {
- alt = false;
- toReleaseKeys.lastLAltKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.altRight) {
- alt = false;
- toReleaseKeys.lastRAltKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.controlLeft) {
- ctrl = false;
- toReleaseKeys.lastLCtrlKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.controlRight) {
- ctrl = false;
- toReleaseKeys.lastRCtrlKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.shiftLeft) {
- shift = false;
- toReleaseKeys.lastLShiftKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.shiftRight) {
- shift = false;
- toReleaseKeys.lastRShiftKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.metaLeft) {
- command = false;
- toReleaseKeys.lastLCommandKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.metaRight) {
- command = false;
- toReleaseKeys.lastRCommandKeyEvent = null;
- } else if (e.logicalKey == LogicalKeyboardKey.superKey) {
- command = false;
- toReleaseKeys.lastSuperKeyEvent = null;
- }
- }
-
- KeyEventResult handleKeyEvent(KeyEvent e) {
+ KeyEventResult handleRawKeyEvent(RawKeyEvent e) {
if (isViewOnly) return KeyEventResult.handled;
if ((isDesktop || isWebDesktop) && !isInputSourceFlutter) {
return KeyEventResult.handled;
}
- if (isWindows || isLinux) {
- // Ignore meta keys. Because flutter window will loose focus if meta key is pressed.
- if (e.physicalKey == PhysicalKeyboardKey.metaLeft ||
- e.physicalKey == PhysicalKeyboardKey.metaRight) {
- return KeyEventResult.handled;
+
+ final key = e.logicalKey;
+ if (e is RawKeyDownEvent) {
+ if (!e.repeat) {
+ if (e.isAltPressed && !alt) {
+ alt = true;
+ } else if (e.isControlPressed && !ctrl) {
+ ctrl = true;
+ } else if (e.isShiftPressed && !shift) {
+ shift = true;
+ } else if (e.isMetaPressed && !command) {
+ command = true;
+ }
}
+ toReleaseKeys.updateKeyDown(key, e);
}
+ if (e is RawKeyUpEvent) {
+ if (key == LogicalKeyboardKey.altLeft ||
+ key == LogicalKeyboardKey.altRight) {
+ alt = false;
+ } else if (key == LogicalKeyboardKey.controlLeft ||
+ key == LogicalKeyboardKey.controlRight) {
+ ctrl = false;
+ } else if (key == LogicalKeyboardKey.shiftRight ||
+ key == LogicalKeyboardKey.shiftLeft) {
+ shift = false;
+ } else if (key == LogicalKeyboardKey.metaLeft ||
+ key == LogicalKeyboardKey.metaRight ||
+ key == LogicalKeyboardKey.superKey) {
+ command = false;
+ }
- if (e is KeyUpEvent) {
- handleKeyUpEventModifiers(e);
- } else if (e is KeyDownEvent) {
- handleKeyDownEventModifiers(e);
+ toReleaseKeys.updateKeyUp(key, e);
}
// * Currently mobile does not enable map mode
- if ((isDesktop || isWebDesktop)) {
- // FIXME: e.character is wrong for dead keys, eg: ^ in de
- newKeyboardMode(e.character ?? '', e.physicalKey.usbHidUsage & 0xFFFF,
- // Show repeat event be converted to "release+press" events?
- e is KeyDownEvent || e is KeyRepeatEvent);
+ if ((isDesktop || isWebDesktop) && keyboardMode == 'map') {
+ mapKeyboardMode(e);
} else {
legacyKeyboardMode(e);
}
@@ -393,8 +389,42 @@ class InputModel {
return KeyEventResult.handled;
}
- /// Send Key Event
- void newKeyboardMode(String character, int usbHid, bool down) {
+ void mapKeyboardMode(RawKeyEvent e) {
+ int positionCode = -1;
+ int platformCode = -1;
+ bool down;
+
+ if (e.data is RawKeyEventDataMacOs) {
+ RawKeyEventDataMacOs newData = e.data as RawKeyEventDataMacOs;
+ positionCode = newData.keyCode;
+ platformCode = newData.keyCode;
+ } else if (e.data is RawKeyEventDataWindows) {
+ RawKeyEventDataWindows newData = e.data as RawKeyEventDataWindows;
+ positionCode = newData.scanCode;
+ platformCode = newData.keyCode;
+ } else if (e.data is RawKeyEventDataLinux) {
+ RawKeyEventDataLinux newData = e.data as RawKeyEventDataLinux;
+ // scanCode and keyCode of RawKeyEventDataLinux are incorrect.
+ // 1. scanCode means keycode
+ // 2. keyCode means keysym
+ positionCode = newData.scanCode;
+ platformCode = newData.keyCode;
+ } else if (e.data is RawKeyEventDataAndroid) {
+ RawKeyEventDataAndroid newData = e.data as RawKeyEventDataAndroid;
+ positionCode = newData.scanCode + 8;
+ platformCode = newData.keyCode;
+ } else {}
+
+ if (e is RawKeyDownEvent) {
+ down = true;
+ } else {
+ down = false;
+ }
+ inputRawKey(e.character ?? '', platformCode, positionCode, down);
+ }
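+  // Editor's note on the `scanCode + 8` above: Android reports kernel evdev
+  // scan codes, while the remote side expects X11-style keycodes, which are
+  // offset by 8 from evdev. A minimal sketch of that mapping (illustrative,
+  // not RustDesk's code):

```python
# Android's RawKeyEventDataAndroid.scanCode is a kernel evdev code;
# X11 keycodes are evdev codes offset by 8, hence `newData.scanCode + 8`.
EVDEV_TO_X11_OFFSET = 8

def x11_keycode(evdev_code: int) -> int:
    """Convert a Linux evdev key code to an X11 keycode."""
    return evdev_code + EVDEV_TO_X11_OFFSET

# KEY_A is evdev code 30 in input-event-codes.h; its X11 keycode is 38.
assert x11_keycode(30) == 38
```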
+
+ /// Send raw Key Event
+ void inputRawKey(String name, int platformCode, int positionCode, bool down) {
const capslock = 1;
const numlock = 2;
const scrolllock = 3;
@@ -413,23 +443,27 @@ class InputModel {
}
bind.sessionHandleFlutterKeyEvent(
sessionId: sessionId,
- character: character,
- usbHid: usbHid,
+ name: name,
+ platformCode: platformCode,
+ positionCode: positionCode,
lockModes: lockModes,
downOrUp: down);
}
- void legacyKeyboardMode(KeyEvent e) {
- if (e is KeyDownEvent) {
- sendKey(e, down: true);
- } else if (e is KeyRepeatEvent) {
- sendKey(e, press: true);
- } else if (e is KeyUpEvent) {
- sendKey(e);
+ void legacyKeyboardMode(RawKeyEvent e) {
+ if (e is RawKeyDownEvent) {
+ if (e.repeat) {
+ sendRawKey(e, press: true);
+ } else {
+ sendRawKey(e, down: true);
+ }
+ }
+ if (e is RawKeyUpEvent) {
+ sendRawKey(e);
}
}
- void sendKey(KeyEvent e, {bool? down, bool? press}) {
+ void sendRawKey(RawKeyEvent e, {bool? down, bool? press}) {
// for maximum compatibility
final label = physicalKeyMap[e.physicalKey.usbHidUsage] ??
logicalKeyMap[e.logicalKey.keyId] ??
@@ -532,7 +566,7 @@ class InputModel {
}
void enterOrLeave(bool enter) {
- toReleaseKeys.release(handleKeyEvent);
+ toReleaseKeys.release(handleRawKeyEvent);
_pointerMovedAfterEnter = false;
// Fix status
@@ -1130,15 +1164,15 @@ class InputModel {
// Simulate a key press event.
// `usbHidUsage` is the USB HID usage code of the key.
Future<void> tapHidKey(int usbHidUsage) async {
- newKeyboardMode(kKeyFlutterKey, usbHidUsage, true);
+ inputRawKey(kKeyFlutterKey, usbHidUsage, 0, true);
await Future.delayed(Duration(milliseconds: 100));
- newKeyboardMode(kKeyFlutterKey, usbHidUsage, false);
+ inputRawKey(kKeyFlutterKey, usbHidUsage, 0, false);
}
Future<void> onMobileVolumeUp() async =>
- await tapHidKey(PhysicalKeyboardKey.audioVolumeUp.usbHidUsage & 0xFFFF);
+ await tapHidKey(PhysicalKeyboardKey.audioVolumeUp.usbHidUsage);
Future<void> onMobileVolumeDown() async =>
- await tapHidKey(PhysicalKeyboardKey.audioVolumeDown.usbHidUsage & 0xFFFF);
+ await tapHidKey(PhysicalKeyboardKey.audioVolumeDown.usbHidUsage);
Future<void> onMobilePower() async =>
- await tapHidKey(PhysicalKeyboardKey.power.usbHidUsage & 0xFFFF);
+ await tapHidKey(PhysicalKeyboardKey.power.usbHidUsage);
}
--- flutter/lib/web/bridge.dart
@@ -23,7 +23,6 @@ sealed class EventToUI {
) = EventToUI_Rgba;
const factory EventToUI.texture(
int field0,
- bool field1,
) = EventToUI_Texture;
}
@@ -34,19 +33,15 @@ class EventToUI_Event implements EventToUI {
}
class EventToUI_Rgba implements EventToUI {
- const EventToUI_Rgba(final int field0) : field = field0;
+ const EventToUI_Rgba(final int field0) : this.field = field0;
final int field;
int get field0 => field;
}
class EventToUI_Texture implements EventToUI {
- const EventToUI_Texture(final int field0, final bool field1)
- : f0 = field0,
- f1 = field1;
- final int f0;
- final bool f1;
- int get field0 => f0;
- bool get field1 => f1;
+ const EventToUI_Texture(final int field0) : this.field = field0;
+ final int field;
+ int get field0 => field;
}
class RustdeskImpl {
@@ -399,20 +394,14 @@ class RustdeskImpl {
Future<void> sessionHandleFlutterKeyEvent(
{required UuidValue sessionId,
- required String character,
- required int usbHid,
+ required String name,
+ required int platformCode,
+ required int positionCode,
required int lockModes,
required bool downOrUp,
dynamic hint}) {
- return Future(() => js.context.callMethod('setByName', [
- 'flutter_key_event',
- jsonEncode({
- 'name': character,
- 'usb_hid': usbHid,
- 'lock_modes': lockModes,
- if (downOrUp) 'down': 'true',
- })
- ]));
+ // TODO: map mode
+ throw UnimplementedError();
}
void sessionEnterOrLeave(
@@ -713,11 +702,11 @@ class RustdeskImpl {
}
Future<String> mainGetAppName({dynamic hint}) {
- return Future.value(mainGetAppNameSync(hint: hint));
+ throw UnimplementedError();
}
String mainGetAppNameSync({dynamic hint}) {
- return 'RustDesk';
+ throw UnimplementedError();
}
String mainUriPrefixSync({dynamic hint}) {
@@ -769,9 +758,8 @@ class RustdeskImpl {
}
Future<bool> mainIsUsingPublicServer({dynamic hint}) {
- return Future(() =>
- js.context.callMethod('getByName', ["is_using_public_server"]) ==
- 'true');
+ return Future(
+ () => js.context.callMethod('setByName', ["is_using_public_server"]));
}
Future<void> mainDiscover({dynamic hint}) {
@@ -1622,7 +1610,7 @@ class RustdeskImpl {
}
bool mainIsOptionFixed({required String key, dynamic hint}) {
- return false;
+ throw UnimplementedError();
}
bool mainGetUseTextureRender({dynamic hint}) {
@@ -1662,36 +1650,5 @@ class RustdeskImpl {
throw UnimplementedError();
}
- Future<String> getVoiceCallInputDevice({required bool isCm, dynamic hint}) {
- throw UnimplementedError();
- }
-
- Future<void> setVoiceCallInputDevice(
- {required bool isCm, required String device, dynamic hint}) {
- throw UnimplementedError();
- }
-
- bool isPresetPasswordMobileOnly({dynamic hint}) {
- throw UnimplementedError();
- }
-
- String mainGetBuildinOption({required String key, dynamic hint}) {
- return '';
- }
-
- String installInstallOptions({dynamic hint}) {
- throw UnimplementedError();
- }
-
- sessionRenameFile(
- {required UuidValue sessionId,
- required int actId,
- required String path,
- required String newName,
- required bool isRemote,
- dynamic hint}) {
- throw UnimplementedError();
- }
-
void dispose() {}
}
--- flutter/lib/web/texture_rgba_renderer.dart
@@ -6,7 +6,7 @@ class TextureRgbaRenderer {
}
Future<bool> closeTexture(int key) {
- return Future(() => true);
+ throw UnimplementedError();
}
Future<bool> onRgba(
--- src/flutter_ffi.rs
@@ -491,8 +491,9 @@ pub fn session_switch_display(is_desktop: bool, session_id: SessionID, value: Ve
pub fn session_handle_flutter_key_event(
session_id: SessionID,
- character: String,
- usb_hid: i32,
+ name: String,
+ platform_code: i32,
+ position_code: i32,
lock_modes: i32,
down_or_up: bool,
) {
@@ -500,8 +501,9 @@ pub fn session_handle_flutter_key_event(
let keyboard_mode = session.get_keyboard_mode();
session.handle_flutter_key_event(
&keyboard_mode,
- &character,
- usb_hid,
+ &name,
+ platform_code,
+ position_code,
lock_modes,
down_or_up,
);
--- src/ui_session_interface.rs
@@ -803,18 +803,19 @@ impl<T: InvokeUiSession> Session<T> {
pub fn handle_flutter_key_event(
&self,
keyboard_mode: &str,
- character: &str,
- usb_hid: i32,
+ name: &str,
+ platform_code: i32,
+ position_code: i32,
lock_modes: i32,
down_or_up: bool,
) {
- if character == "flutter_key" {
- self._handle_key_flutter_simulation(keyboard_mode, usb_hid, down_or_up);
+ if name == "flutter_key" {
+ self._handle_key_flutter_simulation(keyboard_mode, platform_code, down_or_up);
} else {
self._handle_key_non_flutter_simulation(
keyboard_mode,
- character,
- usb_hid,
+ platform_code,
+ position_code,
lock_modes,
down_or_up,
);
@@ -830,10 +831,10 @@ impl<T: InvokeUiSession> Session<T> {
) {
// https://github.com/flutter/flutter/blob/master/packages/flutter/lib/src/services/keyboard_key.g.dart#L4356
let ctrl_key = match platform_code {
- 0x007f => Some(ControlKey::VolumeMute),
- 0x0080 => Some(ControlKey::VolumeUp),
- 0x0081 => Some(ControlKey::VolumeDown),
- 0x0066 => Some(ControlKey::Power),
+ 0x0007007f => Some(ControlKey::VolumeMute),
+ 0x00070080 => Some(ControlKey::VolumeUp),
+ 0x00070081 => Some(ControlKey::VolumeDown),
+ 0x00070066 => Some(ControlKey::Power),
_ => None,
};
let Some(ctrl_key) = ctrl_key else { return };
@@ -850,28 +851,22 @@ impl<T: InvokeUiSession> Session<T> {
fn _handle_key_non_flutter_simulation(
&self,
keyboard_mode: &str,
- character: &str,
- usb_hid: i32,
+ platform_code: i32,
+ position_code: i32,
lock_modes: i32,
down_or_up: bool,
) {
- let key = rdev::usb_hid_key_from_code(usb_hid as _);
-
- #[cfg(target_os = "windows")]
- let platform_code: u32 = rdev::win_code_from_key(key).unwrap_or(0);
- #[cfg(target_os = "windows")]
- let position_code: KeyCode = rdev::win_scancode_from_key(key).unwrap_or(0) as _;
+ if position_code < 0 || platform_code < 0 {
+ return;
+ }
+ let platform_code: u32 = platform_code as _;
+ let position_code: KeyCode = position_code as _;
#[cfg(not(target_os = "windows"))]
- let position_code: KeyCode = rdev::code_from_key(key).unwrap_or(0) as _;
- #[cfg(not(any(target_os = "windows", target_os = "linux")))]
- let platform_code: u32 = position_code as _;
- // For translate mode.
- // We need to set the platform code (keysym) if is AltGr.
- // https://github.com/rustdesk/rustdesk/blob/07cf1b4db5ef2f925efd3b16b87c33ce03c94809/src/keyboard.rs#L1029
- // https://github.com/flutter/flutter/issues/153811
- #[cfg(target_os = "linux")]
- let platform_code: u32 = position_code as _;
+ let key = rdev::key_from_code(position_code) as rdev::Key;
+ // Windows requires special handling
+ #[cfg(target_os = "windows")]
+ let key = rdev::get_win_key(platform_code, position_code);
let event_type = if down_or_up {
KeyPress(key)
@@ -880,16 +875,7 @@ impl<T: InvokeUiSession> Session<T> {
};
let event = Event {
time: SystemTime::now(),
- unicode: if character.is_empty() {
- None
- } else {
- Some(rdev::UnicodeInfo {
- name: Some(character.to_string()),
- unicode: character.encode_utf16().collect(),
- // is_dead: is not correct here, because flutter cannot detect deadcode for now.
- is_dead: false,
- })
- },
+ unicode: None,
platform_code,
position_code: position_code as _,
event_type,
| rustdesk | rustdesk | Rust | Rust | 83,345 | 11,693 | An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer. | rustdesk_rustdesk | CODE_IMPROVEMENT | Obvious |

| 7b64e0b295febce77030766eaf75fbf05e245481 | 2022-12-31 13:37:18 | bannedbook | update | false | 0 | 0 | 0 |
--- docs/vsp-cn.py
Binary files a/docs/vsp-cn.py and b/docs/vsp-cn.py differ
| fanqiang | bannedbook | Kotlin | Kotlin | 39,286 | 7,317 | Circumventing censorship for open internet access (翻墙-科学上网) | bannedbook_fanqiang | CONFIG_CHANGE | Very small changes |

| fc0049fb20fe5998232a365b5ee08d6730693f1b | 2025-02-13 04:22:40 | Filippo Valsorda | crypto/tls: document FIPS 140-3 mode behavior Change-Id: I6a6a465612cf76d148b9758ee3fcdc8606497830 Reviewed-on: https://go-review.googlesource.com/c/go/+/648835 Reviewed-by: Daniel McCarney <[email protected]> LUCI-TryBot-Result: Go LUCI <[email protected]> Auto-Submit: Filippo Valsorda <[email protected]> Reviewed-by: Roland Shoemaker <[email protected]> Reviewed-by: Ian Lance Taylor <[email protected]> | false | 9 | 0 | 9 |
--- src/crypto/tls/tls.go
@@ -4,15 +4,6 @@
// Package tls partially implements TLS 1.2, as specified in RFC 5246,
// and TLS 1.3, as specified in RFC 8446.
-//
-// # FIPS 140-3 mode
-//
-// When the program is in [FIPS 140-3 mode], this package behaves as if
-// only protocol versions, cipher suites, signature algorithms, and
-// key exchange algorithms approved by NIST SP 800-52r2 are implemented.
-// Others are silently ignored and not negotiated.
-//
-// [FIPS 140-3 mode]: https://go.dev/doc/security/fips140
package tls
// BUG(agl): The crypto/tls package only implements some countermeasures
| go | golang | Go | Go | 126,191 | 17,926 | The Go programming language | golang_go | DOC_CHANGE | Comments are added in the code file |

| a2d8609d09df52ebf2978bf52acacf85b00c052a | 2023-06-20 08:18:17 | paigeman | Update java16.md | false | 1 | 1 | 2 |
--- docs/java/new-features/java16.md
@@ -84,7 +84,7 @@ Java 14([ JEP 370](https://openjdk.org/jeps/370)) 的时候,第一次孵化外
| ---------- | ----------------- | --------------------------------------- | ---------------------------------------- |
| Java SE 14 | preview | [JEP 305](https://openjdk.org/jeps/305) | 首次引入 instanceof 模式匹配。 |
| Java SE 15 | Second Preview | [JEP 375](https://openjdk.org/jeps/375) | 相比较上个版本无变化,继续收集更多反馈。 |
-| Java SE 16 | Permanent Release | [JEP 394](https://openjdk.org/jeps/394) | 模式变量不再隐式为 final。 |
+| Java SE 16 | Permanent Release | [JEP 394](https://openjdk.org/jeps/394) | 模式变量不在隐式为 final。 |
从 Java 16 开始,你可以对 `instanceof` 中的变量值进行修改。
| javaguide | snailclimb | Java | Java | 148,495 | 45,728 | "Java Learning + Interview Guide": covers the core knowledge most Java programmers need to master — JavaGuide is the first choice when preparing for a Java interview (「Java学习+面试指南」) | snailclimb_javaguide | DOC_CHANGE | obvious |

| f2861448ab16ae06ffdc8560e9f50d6c192c0623 | null | Graydon Hoare | Fix bug in win32 command-line arg processing. | false | 1 | 1 | 0 |
--- rust.cpp
@@ -167,7 +167,7 @@ command_line_args
#if defined(__WIN32__)
LPCWSTR cmdline = GetCommandLineW();
LPWSTR *wargv = CommandLineToArgvW(cmdline, &argc);
- dom.win32_require("CommandLineToArgvW", argv != NULL);
+ dom.win32_require("CommandLineToArgvW", wargv != NULL);
argv = (char **) dom.malloc(sizeof(char*) * argc);
for (int i = 0; i < argc; ++i) {
int n_chars = WideCharToMultiByte(CP_UTF8, 0, wargv[i], -1,
| rust-lang_rust.json | null | null | null | null | null | null | rust-lang_rust.json | BUG_FIX | 5, fix written in commits msg |

| e5f20e7f045032ce9571d3d2219bf3ceb9f4cecd | 2025-04-01 18:21:15 | nakamura | tooltip text of close button is internationalized (#245190) | false | 2 | 2 | 4 |
--- src/vs/workbench/browser/parts/banner/bannerPart.ts
@@ -4,7 +4,7 @@
*--------------------------------------------------------------------------------------------*/
import './media/bannerpart.css';
-import { localize, localize2 } from '../../../../nls.js';
+import { localize2 } from '../../../../nls.js';
import { $, addDisposableListener, append, clearNode, EventType, isHTMLElement } from '../../../../base/browser/dom.js';
import { asCSSUrl } from '../../../../base/browser/cssValue.js';
import { ActionBar } from '../../../../base/browser/ui/actionbar/actionbar.js';
@@ -225,7 +225,7 @@ export class BannerPart extends Part implements IBannerService {
// Action
const actionBarContainer = append(this.element, $('div.action-container'));
this.actionBar = this._register(new ActionBar(actionBarContainer));
- const label = item.closeLabel ?? localize('closeBanner', "Close Banner");
+ const label = item.closeLabel ?? 'Close Banner';
const closeAction = this._register(new Action('banner.close', label, ThemeIcon.asClassName(widgetClose), true, () => this.close(item)));
this.actionBar.push(closeAction, { icon: true, label: false });
this.actionBar.setFocusable(false);
| vscode | microsoft | TypeScript | TypeScript | 168,072 | 30,802 | Visual Studio Code | microsoft_vscode | CONFIG_CHANGE | Very small changes |

| 89bc5030a2c7c18f2d838a3aba7dfa3dbdc778a6 | null | Kanitkorn Sujautra | Add new item to extension list Add mthuret/storybook-addon-specifications to extension list in docs/extensions.md | false | 1 | 0 | 1 |
--- extensions.md
@@ -93,6 +93,7 @@ Rather than creating extensions yourself, you can use extensions available below
* [Create Groups of stories, display all of them together](https://github.com/jurgob/react-storybook-addon-add-stories-group)
* [Display internationalized components in different locales](https://github.com/Tarabyte/react-storybook-addon-intl)
* [Handling stubbed relay data](https://github.com/orta/react-storybooks-relay-container)
+* [Write tests directly inside stories](https://github.com/mthuret/storybook-addon-specifications)
> Feel free to include your extension to the above list and share it with other. <br/>
> Just make it available on NPM (and GitHub) and send a PR to this page.
| storybookjs_storybook.json | null | null | null | null | null | null | storybookjs_storybook.json | NEW_FEAT | 5, added a new feature |

| 8a3c5663df80b6c0c7ccd8f1027af00c9207a53a | 2023-06-10 08:31:35 | Mike Bostock | disjoint force example | false | 190 | 91 | 281 |
--- docs/components/ExampleDisjointForce.vue
@@ -1,93 +0,0 @@
-<script setup>
-
-import * as d3 from "d3";
-import {ref, onMounted, onUnmounted} from "vue";
-
-const container = ref();
-
-let simulation;
-
-onMounted(async () => {
- const {links, nodes} = await d3.json("https://static.observableusercontent.com/files/e3680d5f766e85edde560c9c31a6dba2ddfcf2f66e1dced4afa18d8040f1f205e0bde1b8b234d866373f2bfc5806fafc47e244c5c9f48b60aaa1917c1b80fcb7");
-
- const width = 688;
- const height = 640;
- const scale = d3.scaleOrdinal(["var(--vp-c-brand)", "currentColor"]);
- const color = (d) => scale(d.group);
-
- simulation = d3.forceSimulation(nodes)
- .force("link", d3.forceLink(links).id((d) => d.id))
- .force("charge", d3.forceManyBody())
- .force("x", d3.forceX())
- .force("y", d3.forceY())
- .on("tick", ticked);
-
- const svg = d3.select(container.value).append("svg")
- .attr("width", width)
- .attr("height", height)
- .attr("viewBox", [-width / 2, -height / 2, width, height])
- .attr("style", "max-width: 100%; height: auto;");
-
- const link = svg.append("g")
- .attr("stroke", "currentColor")
- .attr("stroke-opacity", 0.5)
- .selectAll("line")
- .data(links)
- .join("line")
- .attr("stroke-width", (d) => Math.sqrt(d.value));
-
- const node = svg.append("g")
- .attr("stroke", "var(--vp-c-bg-alt)")
- .attr("stroke-width", 1.5)
- .selectAll("circle")
- .data(nodes)
- .join("circle")
- .attr("r", 5)
- .attr("fill", color)
- .call(d3.drag()
- .on("start", dragstarted)
- .on("drag", dragged)
- .on("end", dragended));
-
- node.append("title")
- .text((d) => d.id);
-
- function ticked() {
- link
- .attr("x1", d => d.source.x)
- .attr("y1", d => d.source.y)
- .attr("x2", d => d.target.x)
- .attr("y2", d => d.target.y);
-
- node
- .attr("cx", d => d.x)
- .attr("cy", d => d.y);
- }
-
- function dragstarted(event, d) {
- if (!event.active) simulation.alphaTarget(0.3).restart();
- d.fx = d.x;
- d.fy = d.y;
- }
-
- function dragged(event,d) {
- d.fx = event.x;
- d.fy = event.y;
- }
-
- function dragended(event,d) {
- if (!event.active) simulation.alphaTarget(0);
- d.fx = null;
- d.fy = null;
- }
-});
-
-onUnmounted(() => {
- simulation?.stop();
-});
-
-</script>
-<template>
- <div style="margin: 1em 0;" ref="container"></div>
- <a href="https://observablehq.com/@d3/disjoint-force-directed-graph?intent=fork" style="font-size: smaller;">Fork ↗︎</a>
-</template>
--- docs/d3-force.md
@@ -1,26 +1,20 @@
-<script setup>
-
-import ExampleDisjointForce from "./components/ExampleDisjointForce.vue";
-
-</script>
-
# d3-force
-<ExampleDisjointForce />
+This module implements a [velocity Verlet](https://en.wikipedia.org/wiki/Verlet_integration) numerical integrator for simulating physical forces on particles. Force simulations can be used to visualize [networks](https://observablehq.com/@d3/force-directed-graph) and [hierarchies](https://observablehq.com/@d3/force-directed-tree):
-This module implements a [velocity Verlet](https://en.wikipedia.org/wiki/Verlet_integration) numerical integrator for simulating physical forces on particles. Force simulations can be used to visualize [networks](https://observablehq.com/@d3/force-directed-graph) and [hierarchies](https://observablehq.com/@d3/force-directed-tree), and to resolve [collisions](./d3-force/collide.md) as in [bubble charts](http://www.nytimes.com/interactive/2012/09/06/us/politics/convention-word-counts.html).
+[<img alt="Force-Directed Graph" src="https://raw.githubusercontent.com/d3/d3-force/master/img/graph.png" width="420" height="219">](https://observablehq.com/@d3/force-directed-graph)
-<!-- [<img alt="Force-Directed Graph" src="https://raw.githubusercontent.com/d3/d3-force/master/img/graph.png" width="420" height="219">](https://observablehq.com/@d3/force-directed-graph) -->
+[<img alt="Force-Directed Tree" src="https://raw.githubusercontent.com/d3/d3-force/master/img/tree.png" width="420" height="219">](https://observablehq.com/@d3/force-directed-tree)
-<!-- [<img alt="Force-Directed Tree" src="https://raw.githubusercontent.com/d3/d3-force/master/img/tree.png" width="420" height="219">](https://observablehq.com/@d3/force-directed-tree) -->
+You can also simulate circles (disks) with collision, such as for [bubble charts](http://www.nytimes.com/interactive/2012/09/06/us/politics/convention-word-counts.html):
-<!-- [<img alt="Collision Detection" src="https://raw.githubusercontent.com/d3/d3-force/master/img/collide.png" width="420" height="219">](https://observablehq.com/@d3/collision-detection) -->
+[<img alt="Collision Detection" src="https://raw.githubusercontent.com/d3/d3-force/master/img/collide.png" width="420" height="219">](https://observablehq.com/@d3/collision-detection)
-<!-- You can even use it as a rudimentary physics engine, say to simulate cloth: -->
+You can even use it as a rudimentary physics engine, say to simulate cloth:
-<!-- [<img alt="Force-Directed Lattice" src="https://raw.githubusercontent.com/d3/d3-force/master/img/lattice.png" width="480" height="250">](https://observablehq.com/@d3/force-directed-lattice) -->
+[<img alt="Force-Directed Lattice" src="https://raw.githubusercontent.com/d3/d3-force/master/img/lattice.png" width="480" height="250">](https://observablehq.com/@d3/force-directed-lattice)
-To use this module, create a [simulation](./d3-force/simulation.md) for an array of [nodes](./d3-force/simulation.md#simulation_nodes) and apply the desired [forces](./d3-force/simulation.md#simulation_force). Then [listen](./d3-force/simulation.md#simulation_on) for tick events to render the nodes as they update in your preferred graphics system, such as Canvas or SVG.
+To use this module, create a [simulation](./d3-force/simulation.md) for an array of [nodes](./d3-force/simulation.md#simulation_nodes), and compose the desired [forces](./d3-force/simulation.md#simulation_force). Then [listen](./d3-force/simulation.md#simulation_on) for tick events to render the nodes as they update in your preferred graphics system, such as Canvas or SVG.
See one of:
@@ -30,3 +24,27 @@ See one of:
* [Link force](./d3-force/link.md)
* [Many-body force](./d3-force/many-body.md)
* [Position forces](./d3-force/position.md)
+
+## Custom forces
+
+A *force* is a function that modifies nodes’ positions or velocities. It can simulate a physical force such as electrical charge or gravity, or it can resolve a geometric constraint such as keeping nodes within a bounding box or keeping linked nodes a fixed distance apart. For example, here is a force that moves nodes towards the origin:
+
+```js
+function force(alpha) {
+ for (let i = 0, n = nodes.length, node, k = alpha * 0.1; i < n; ++i) {
+ node = nodes[i];
+ node.vx -= node.x * k;
+ node.vy -= node.y * k;
+ }
+}
+```
+
+Forces typically read the node’s current position ⟨*x*,*y*⟩ and then mutate the node’s velocity ⟨*vx*,*vy*⟩. Forces may also “peek ahead” to the anticipated next position of the node, ⟨*x* + *vx*,*y* + *vy*⟩; this is necessary for resolving geometric constraints through [iterative relaxation](https://en.wikipedia.org/wiki/Relaxation_\(iterative_method\)). Forces may also modify the position directly, which is sometimes useful to avoid adding energy to the simulation, such as when recentering the simulation in the viewport.
+
+### *force*(*alpha*) {#_force}
+
+Applies this force, optionally observing the specified *alpha*. Typically, the force is applied to the array of nodes previously passed to [*force*.initialize](#force_initialize), however, some forces may apply to a subset of nodes, or behave differently. For example, [forceLink](./d3-force/link.md) applies to the source and target of each link.
+
+### *force*.initialize(*nodes*) {#force_initialize}
+
+Supplies the array of *nodes* and *random* source to this force. This method is called when a force is bound to a simulation via [*simulation*.force](./d3-force/simulation.md#simulation_force) and when the simulation’s nodes change via [*simulation*.nodes](./d3-force/simulation.md#simulation_nodes). A force may perform necessary work during initialization, such as evaluating per-node parameters, to avoid repeatedly performing work during each application of the force.
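+
+Editor's note: the custom-force prose moved here can be made concrete with a tiny tick loop. The sketch below (Python; names and decay constants are illustrative assumptions, not d3's exact defaults) applies the same centering force as the example above and then integrates velocity into position the way the simulation's tick step does:

```python
# Minimal tick loop in the spirit of d3-force: apply a centering force
# that nudges each node toward the origin, then integrate velocity.
nodes = [{"x": 10.0, "y": -4.0, "vx": 0.0, "vy": 0.0}]

def center_force(alpha, nodes, strength=0.1):
    # Mirrors the documented example force: subtract a fraction of the
    # position from the velocity so nodes drift toward (0, 0).
    k = alpha * strength
    for node in nodes:
        node["vx"] -= node["x"] * k
        node["vy"] -= node["y"] * k

def tick(nodes, alpha=1.0, velocity_decay=0.4):
    center_force(alpha, nodes)
    for node in nodes:
        node["vx"] *= 1.0 - velocity_decay  # damping, like velocityDecay
        node["vy"] *= 1.0 - velocity_decay
        node["x"] += node["vx"]
        node["y"] += node["vy"]

for _ in range(200):
    tick(nodes)

# After many ticks the node settles near the origin.
assert abs(nodes[0]["x"]) < 1e-3 and abs(nodes[0]["y"]) < 1e-3
```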
--- docs/d3-force/collide.md
@@ -1,12 +1,12 @@
<script setup>
-import ExampleCollideForce from "../components/ExampleCollideForce.vue";
+import CollideForce from "../components/CollideForce.vue";
</script>
# Collide force
-<ExampleCollideForce />
+<CollideForce />
The collide force treats nodes as circles with a given [radius](#collide_radius), rather than points, and prevents nodes from overlapping. More formally, two nodes *a* and *b* are separated so that the distance between *a* and *b* is at least *radius*(*a*) + *radius*(*b*). To reduce jitter, this is by default a “soft” constraint with a configurable [strength](#collide_strength) and [iteration count](#collide_iterations).
--- docs/d3-force/simulation.md
@@ -40,7 +40,7 @@ Each *node* must be an object. The following properties are assigned by the simu
* `vx` - the node’s current *x*-velocity
* `vy` - the node’s current *y*-velocity
-The position ⟨*x*,*y*⟩ and velocity ⟨*vx*,*vy*⟩ may be subsequently modified by [forces](#custom-forces) and by the simulation. If either *vx* or *vy* is NaN, the velocity is initialized to ⟨0,0⟩. If either *x* or *y* is NaN, the position is initialized in a [phyllotaxis arrangement](https://observablehq.com/@d3/force-layout-phyllotaxis), so chosen to ensure a deterministic, uniform distribution.
+The position ⟨*x*,*y*⟩ and velocity ⟨*vx*,*vy*⟩ may be subsequently modified by [forces](../d3-force.md#custom-forces) and by the simulation. If either *vx* or *vy* is NaN, the velocity is initialized to ⟨0,0⟩. If either *x* or *y* is NaN, the position is initialized in a [phyllotaxis arrangement](https://observablehq.com/@d3/force-layout-phyllotaxis), so chosen to ensure a deterministic, uniform distribution.
To fix a node in a given position, you may specify two additional properties:
@@ -77,7 +77,7 @@ The alpha decay rate determines how quickly the current alpha interpolates towar
## *simulation*.force(*name*, *force*) {#simulation_force}
-[Source](https://github.com/d3/d3-force/blob/main/src/simulation.js) · If *force* is specified, assigns the [force](#custom-forces) for the specified *name* and returns this simulation. If *force* is not specified, returns the force with the specified name, or undefined if there is no such force. (By default, new simulations have no forces.) For example, to create a new simulation to layout a graph, you might say:
+[Source](https://github.com/d3/d3-force/blob/main/src/simulation.js) · If *force* is specified, assigns the [force](../d3-force.md#custom-forces) for the specified *name* and returns this simulation. If *force* is not specified, returns the force with the specified name, or undefined if there is no such force. (By default, new simulations have no forces.) For example, to create a new simulation to layout a graph, you might say:
```js
const simulation = d3.forceSimulation(nodes)
@@ -114,27 +114,3 @@ The *typenames* is a string containing one or more *typename* separated by white
Note that *tick* events are not dispatched when [*simulation*.tick](#simulation_tick) is called manually; events are only dispatched by the internal timer and are intended for interactive rendering of the simulation. To affect the simulation, register [forces](#simulation_force) instead of modifying nodes’ positions or velocities inside a tick event listener.
See [*dispatch*.on](../d3-dispatch.md#dispatch_on) for details.
-
-## Custom forces
-
-A *force* is a function that modifies nodes’ positions or velocities. It can simulate a physical force such as electrical charge or gravity, or it can resolve a geometric constraint such as keeping nodes within a bounding box or keeping linked nodes a fixed distance apart. For example, here is a force that moves nodes towards the origin:
-
-```js
-function force(alpha) {
- for (let i = 0, n = nodes.length, node, k = alpha * 0.1; i < n; ++i) {
- node = nodes[i];
- node.vx -= node.x * k;
- node.vy -= node.y * k;
- }
-}
-```
-
-Forces typically read the node’s current position ⟨*x*,*y*⟩ and then mutate the node’s velocity ⟨*vx*,*vy*⟩. Forces may also “peek ahead” to the anticipated next position of the node, ⟨*x* + *vx*,*y* + *vy*⟩; this is necessary for resolving geometric constraints through [iterative relaxation](https://en.wikipedia.org/wiki/Relaxation_\(iterative_method\)). Forces may also modify the position directly, which is sometimes useful to avoid adding energy to the simulation, such as when recentering the simulation in the viewport.
-
-### *force*(*alpha*) {#_force}
-
-Applies this force, optionally observing the specified *alpha*. Typically, the force is applied to the array of nodes previously passed to [*force*.initialize](#force_initialize), however, some forces may apply to a subset of nodes, or behave differently. For example, [forceLink](./link.md) applies to the source and target of each link.
-
-### *force*.initialize(*nodes*) {#force_initialize}
-
-Supplies the array of *nodes* and *random* source to this force. This method is called when a force is bound to a simulation via [*simulation*.force](#simulation_force) and when the simulation’s nodes change via [*simulation*.nodes](#simulation_nodes). A force may perform necessary work during initialization, such as evaluating per-node parameters, to avoid repeatedly performing work during each application of the force.
|
d3
|
d3
|
Shell
|
Shell
| 109,977
| 22,868
|
Bring data to life with SVG, Canvas and HTML. :bar_chart::chart_with_upwards_trend::tada:
|
d3_d3
|
NEW_FEAT
|
Obvious
|
f7dbbf251980763609a65efe15ef9f8ed0cc5a95
|
2025-02-03 22:11:08
|
Jakub Ciolek
|
cmd/compile: distribute 8 and 16-bit multiplication

Expand the existing rule to cover 8 and 16 bit variants.

compilecmp linux/amd64:

time
time.parseStrictRFC3339.func1 80 -> 70 (-12.50%)
time.Time.appendStrictRFC3339.func1 80 -> 70 (-12.50%)
time.Time.appendStrictRFC3339 439 -> 428 (-2.51%)

time [cmd/compile]
time.parseStrictRFC3339.func1 80 -> 70 (-12.50%)
time.Time.appendStrictRFC3339.func1 80 -> 70 (-12.50%)
time.Time.appendStrictRFC3339 439 -> 428 (-2.51%)

linux/arm64:

time
time.parseStrictRFC3339.func1 changed
time.Time.appendStrictRFC3339.func1 changed
time.Time.appendStrictRFC3339 416 -> 400 (-3.85%)

time [cmd/compile]
time.Time.appendStrictRFC3339 416 -> 400 (-3.85%)
time.parseStrictRFC3339.func1 changed
time.Time.appendStrictRFC3339.func1 changed

Change-Id: I0ad3b2363a9fe8c322dd05fbc13bf151a146d8cb
Reviewed-on: https://go-review.googlesource.com/c/go/+/641756
Auto-Submit: Keith Randall <[email protected]>
LUCI-TryBot-Result: Go LUCI <[email protected]>
Reviewed-by: Cherry Mui <[email protected]>
Reviewed-by: Keith Randall <[email protected]>
Reviewed-by: Keith Randall <[email protected]>
| false
| 72
| 0
| 72
|
--- src/cmd/compile/internal/ssa/_gen/generic.rules
@@ -353,10 +353,6 @@
(Add64 (Const64 <t> [c*d]) (Mul64 <t> (Const64 <t> [c]) x))
(Mul32 (Const32 <t> [c]) (Add32 <t> (Const32 <t> [d]) x)) =>
(Add32 (Const32 <t> [c*d]) (Mul32 <t> (Const32 <t> [c]) x))
-(Mul16 (Const16 <t> [c]) (Add16 <t> (Const16 <t> [d]) x)) =>
- (Add16 (Const16 <t> [c*d]) (Mul16 <t> (Const16 <t> [c]) x))
-(Mul8 (Const8 <t> [c]) (Add8 <t> (Const8 <t> [d]) x)) =>
- (Add8 (Const8 <t> [c*d]) (Mul8 <t> (Const8 <t> [c]) x))
// Rewrite x*y ± x*z to x*(y±z)
(Add(64|32|16|8) <t> (Mul(64|32|16|8) x y) (Mul(64|32|16|8) x z))
--- src/cmd/compile/internal/ssa/rewritegeneric.go
@@ -18194,40 +18194,6 @@ func rewriteValuegeneric_OpMul16(v *Value) bool {
}
break
}
- // match: (Mul16 (Const16 <t> [c]) (Add16 <t> (Const16 <t> [d]) x))
- // result: (Add16 (Const16 <t> [c*d]) (Mul16 <t> (Const16 <t> [c]) x))
- for {
- for _i0 := 0; _i0 <= 1; _i0, v_0, v_1 = _i0+1, v_1, v_0 {
- if v_0.Op != OpConst16 {
- continue
- }
- t := v_0.Type
- c := auxIntToInt16(v_0.AuxInt)
- if v_1.Op != OpAdd16 || v_1.Type != t {
- continue
- }
- _ = v_1.Args[1]
- v_1_0 := v_1.Args[0]
- v_1_1 := v_1.Args[1]
- for _i1 := 0; _i1 <= 1; _i1, v_1_0, v_1_1 = _i1+1, v_1_1, v_1_0 {
- if v_1_0.Op != OpConst16 || v_1_0.Type != t {
- continue
- }
- d := auxIntToInt16(v_1_0.AuxInt)
- x := v_1_1
- v.reset(OpAdd16)
- v0 := b.NewValue0(v.Pos, OpConst16, t)
- v0.AuxInt = int16ToAuxInt(c * d)
- v1 := b.NewValue0(v.Pos, OpMul16, t)
- v2 := b.NewValue0(v.Pos, OpConst16, t)
- v2.AuxInt = int16ToAuxInt(c)
- v1.AddArg2(v2, x)
- v.AddArg2(v0, v1)
- return true
- }
- }
- break
- }
// match: (Mul16 (Const16 [0]) _)
// result: (Const16 [0])
for {
@@ -18951,40 +18917,6 @@ func rewriteValuegeneric_OpMul8(v *Value) bool {
}
break
}
- // match: (Mul8 (Const8 <t> [c]) (Add8 <t> (Const8 <t> [d]) x))
- // result: (Add8 (Const8 <t> [c*d]) (Mul8 <t> (Const8 <t> [c]) x))
- for {
- for _i0 := 0; _i0 <= 1; _i0, v_0, v_1 = _i0+1, v_1, v_0 {
- if v_0.Op != OpConst8 {
- continue
- }
- t := v_0.Type
- c := auxIntToInt8(v_0.AuxInt)
- if v_1.Op != OpAdd8 || v_1.Type != t {
- continue
- }
- _ = v_1.Args[1]
- v_1_0 := v_1.Args[0]
- v_1_1 := v_1.Args[1]
- for _i1 := 0; _i1 <= 1; _i1, v_1_0, v_1_1 = _i1+1, v_1_1, v_1_0 {
- if v_1_0.Op != OpConst8 || v_1_0.Type != t {
- continue
- }
- d := auxIntToInt8(v_1_0.AuxInt)
- x := v_1_1
- v.reset(OpAdd8)
- v0 := b.NewValue0(v.Pos, OpConst8, t)
- v0.AuxInt = int8ToAuxInt(c * d)
- v1 := b.NewValue0(v.Pos, OpMul8, t)
- v2 := b.NewValue0(v.Pos, OpConst8, t)
- v2.AuxInt = int8ToAuxInt(c)
- v1.AddArg2(v2, x)
- v.AddArg2(v0, v1)
- return true
- }
- }
- break
- }
// match: (Mul8 (Const8 [0]) _)
// result: (Const8 [0])
for {
|
go
|
golang
|
Go
|
Go
| 126,191
| 17,926
|
The Go programming language
|
golang_go
|
NEW_FEAT
|
Expanded the rule to cover 8 and 16 bit variants
|
690a2c8399f9724754b77ddd11026096c9038ad1
|
2024-08-26 14:37:02
|
rustdesk
|
still find delegate failure when my mac restarted automatically sometimes
| false
| 6
| 2
| 8
|
--- src/platform/macos.rs
@@ -507,10 +507,6 @@ pub fn start_os_service() {
.map(|p| p.start_time())
.unwrap_or_default() as i64;
log::info!("Startime: {my_start_time} vs {:?}", server);
- if my_start_time < server.unwrap().0 + 3 {
- log::error!("Please start --server first to make delegate work, earlier more 3 seconds",);
- std::process::exit(-1);
- }
std::thread::spawn(move || loop {
std::thread::sleep(std::time::Duration::from_secs(1));
@@ -523,9 +519,9 @@ pub fn start_os_service() {
);
std::process::exit(-1);
};
- if my_start_time < start_time + 3 {
+ if my_start_time <= start_time + 1 {
log::error!(
- "Agent start later, {my_start_time} vs {start_time}, please start --server first to make delegate work, earlier more 3 seconds",
+ "Agent start later, {my_start_time} vs {start_time}, please start --server first to make delegate work",
);
std::process::exit(-1);
}
|
rustdesk
|
rustdesk
|
Rust
|
Rust
| 83,345
| 11,693
|
An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer.
|
rustdesk_rustdesk
|
BUG_FIX
|
probably a bug fix
|
e223459db81c4c03e741450b65171647d2584798
|
2022-12-29 14:47:28
|
Chak-C
|
feature: Add context object pattern (#2304)

* Add context object pattern and corresponding tests
* Update context pattern
* Add README, class diagram, puml file.
Add toString method in ServiceContext.java.
Add tests for coverage.
* Improvements:
Remove plugin in pom file
Add comments in app.java
Change local variable keyword to var in app.java
Use lombok for getters, setters and tostring
Change method signature in context object
* Refreshing environment-1
* Reconfigure pom file.
Co-authored-by: Alvis Chan <[email protected]>
Co-authored-by: u7287079 <[email protected]>
| false
| 498
| 0
| 498
|
--- context-object/README.md
@@ -1,184 +0,0 @@
----
-title: Context object
-category: Creational
-language: en
-tags:
-- Data access
----
-
-## Name / classification
-
-Context Object
-
-## Also known as
-
-Context, Encapsulate Context
-
-## Intent
-
-Decouple data from protocol-specific classes and store the scoped data in an object independent
-of the underlying protocol technology.
-
-## Explanation
-
-Real-world example
-
-> This application has different layers labelled A, B and C with each extracting specific information
-> from a similar context for further use in the software. Passing down each pieces of information
-> individually would be inefficient, a method to efficiently store and pass information is needed.
-
-In plain words
-
-> Create an object and store the data there and pass this object to where it is needed.
-
-[Core J2EE Patterns](http://corej2eepatterns.com/ContextObject.htm) says
-
-> Use a Context Object to encapsulate state in a protocol-independent way to be shared throughout your application.
-
-**Programmatic Example**
-
-We define what data a service context object contains.
-
-```Java
-public class ServiceContext {
-
- String ACCOUNT_SERVICE, SESSION_SERVICE, SEARCH_SERVICE;
-
- public void setACCOUNT_SERVICE(String ACCOUNT_SERVICE) {
- this.ACCOUNT_SERVICE = ACCOUNT_SERVICE;
- }
-
- public void setSESSION_SERVICE(String SESSION_SERVICE) {
- this.SESSION_SERVICE = SESSION_SERVICE;
- }
-
- public void setSEARCH_SERVICE(String SEARCH_SERVICE) {
- this.SEARCH_SERVICE = SEARCH_SERVICE;
- }
-
- public String getACCOUNT_SERVICE() {
- return ACCOUNT_SERVICE;
- }
-
- public String getSESSION_SERVICE() {
- return SESSION_SERVICE;
- }
-
- public String getSEARCH_SERVICE() {
- return SEARCH_SERVICE;
- }
-
- public String toString() { return ACCOUNT_SERVICE + " " + SESSION_SERVICE + " " + SEARCH_SERVICE;}
-}
-```
-
-Create an interface used in parts of the application for context objects to be created.
-
-```Java
-public class ServiceContextFactory {
-
- public static ServiceContext createContext() {
- return new ServiceContext();
- }
-}
-```
-
-Instantiate the context object in the first layer and the adjoining layer upcalls the context in the current layer, which
-then further structures the object.
-
-```Java
-public class LayerA {
-
- private static ServiceContext context;
-
- public LayerA() {
- context = ServiceContextFactory.createContext();
- }
-
- public static ServiceContext getContext() {
- return context;
- }
-
- public void addAccountInfo(String accountService) {
- context.setACCOUNT_SERVICE(accountService);
- }
-}
-
-public class LayerB {
-
- private static ServiceContext context;
-
- public LayerB(LayerA layerA) {
- this.context = layerA.getContext();
- }
-
- public static ServiceContext getContext() {
- return context;
- }
-
- public void addSessionInfo(String sessionService) {
- context.setSESSION_SERVICE(sessionService);
- }
-}
-
-public class LayerC {
-
- public static ServiceContext context;
-
- public LayerC(LayerB layerB) {
- this.context = layerB.getContext();
- }
-
- public static ServiceContext getContext() {
- return context;
- }
-
- public void addSearchInfo(String searchService) {
- context.setSEARCH_SERVICE(searchService);
- }
-}
-```
-Here is the context object and layers in action.
-
-```Java
-var layerA = new LayerA();
-layerA.addAccountInfo(SERVICE);
-LOGGER.info("Context = {}",layerA.getContext());
-var layerB = new LayerB(layerA);
-layerB.addSessionInfo(SERVICE);
-LOGGER.info("Context = {}",layerB.getContext());
-var layerC = new LayerC(layerB);
-layerC.addSearchInfo(SERVICE);
-LOGGER.info("Context = {}",layerC.getContext());
-```
-
-Program output:
-
-```Java
-Context = SERVICE null null
-Context = SERVICE SERVICE null
-Context = SERVICE SERVICE SERVICE
-```
-
-## Class diagram
-
-
-
-## Application
-
-Use the Context Object pattern for:
-
-* Sharing information across different system layers.
-* Decoupling software data from protocol-specific contexts.
-* Exposing only the relevant API's within the context.
-
-## Known uses
-* [Spring: ApplicationContext](https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/context/ApplicationContext.html)
-* [Oracle: SecurityContext](https://docs.oracle.com/javaee/7/api/javax/ws/rs/core/SecurityContext.html)
-* [Oracle: ServletContext](https://docs.oracle.com/javaee/6/api/javax/servlet/ServletContext.html)
-
-## Credits
-
-* [J2EE Design Patterns](http://corej2eepatterns.com/ContextObject.htm)
-* [Allan Kelly - The Encapsulate Context Pattern](https://accu.org/journals/overload/12/63/kelly_246/)
-* [Arvid S. Krishna et al. - Context Object](https://www.dre.vanderbilt.edu/~schmidt/PDF/Context-Object-Pattern.pdf)
\ No newline at end of file
--- context-object/etc/context-object.png
Binary files a/context-object/etc/context-object.png and /dev/null differ
--- context-object/etc/context-object.urm.puml
@@ -1,50 +0,0 @@
-@startuml
-package com.iluwatar.context.object {
- class App {
- - LOGGER : Logger {static}
- + App()
- + main(args : String[]) {static}
- }
- class ServiceContext {
- - ACCOUNT_SERVICE : String
- - SESSION_SERVICE : String
- - SEARCH_SERVICE : String
- + ServiceContext()
- + getACCOUNT_SERVICE() : String
- + getSESSION_SERVICE() : String
- + getSEARCH_SERVICE() : String
- + setACCOUNT_SERVICE(service : String)
- + setSESSION_SERVICE(service : String)
- + setSEARCH_SERVICE(service : String)
- }
- class ServiceContextFactory {
- + ServiceContextFactory()
- + createContext() : ServiceContext
- }
- class LayerA {
- - context : ServiceContext
- + LayerA()
- + getContext() : ServiceContext
- + addAccountInfo()
- }
- class LayerB {
- - context : ServiceContext
- + LayerB(layerA : LayerA)
- + getContext() : ServiceContext
- + addAccountInfo()
- }
- class LayerC {
- - context : ServiceContext
- + LayerC(layerB : LayerB)
- + getContext() : ServiceContext
- + addAccountInfo()
- }
-}
-
-LayerC ..|> LayerB
-ServiceContext --> LayerC
-ServiceContext --> LayerB
-ServiceContext --> LayerA
-ServiceContextFactory ..|> "<<creates>>" ServiceContext
-LayerB ..|> LayerA
-@enduml
\ No newline at end of file
--- context-object/pom.xml
@@ -1,62 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-
- This project is licensed under the MIT license. Module model-view-viewmodel is using ZK framework licensed under LGPL (see lgpl-3.0.txt).
-
- The MIT License
- Copyright © 2014-2022 Ilkka Seppälä
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction, including without limitation the rights
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- copies of the Software, and to permit persons to whom the Software is
- furnished to do so, subject to the following conditions:
-
- The above copyright notice and this permission notice shall be included in
- all copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
- THE SOFTWARE.
-
--->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0</modelVersion>
- <parent>
- <groupId>com.iluwatar</groupId>
- <artifactId>java-design-patterns</artifactId>
- <version>1.26.0-SNAPSHOT</version>
- </parent>
- <artifactId>context-object</artifactId>
- <dependencies>
- <dependency>
- <groupId>org.junit.jupiter</groupId>
- <artifactId>junit-jupiter-engine</artifactId>
- <scope>test</scope>
- </dependency>
- </dependencies>
- <build>
- <plugins>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-assembly-plugin</artifactId>
- <executions>
- <execution>
- <configuration>
- <archive>
- <manifest>
- <mainClass>com.iluwatar.compositeview.App</mainClass>
- </manifest>
- </archive>
- </configuration>
- </execution>
- </executions>
- </plugin>
- </plugins>
- </build>
-</project>
\ No newline at end of file
--- context-object/src/main/java/com/iluwatar/context/object/App.java
@@ -1,41 +0,0 @@
-package com.iluwatar.context.object;
-
-import lombok.extern.slf4j.Slf4j;
-
-/**
- * In the context object pattern, information and data from underlying protocol-specific classes/systems is decoupled
- * and stored into a protocol-independent object in an organised format. The pattern ensures the data contained within
- * the context object can be shared and further structured between different layers of a software system.
- *
- * <p> In this example we show how a context object {@link ServiceContext} can be initiated, edited and passed/retrieved
- * in different layers of the program ({@link LayerA}, {@link LayerB}, {@link LayerC}) through use of static methods. </p>
- */
-@Slf4j
-public class App {
-
- private static final String SERVICE = "SERVICE";
-
- /**
- * Program entry point.
- * @param args command line args
- */
- public static void main(String[] args) {
- //Initiate first layer and add service information into context
- var layerA = new LayerA();
- layerA.addAccountInfo(SERVICE);
-
- LOGGER.info("Context = {}",layerA.getContext());
-
- //Initiate second layer and preserving information retrieved in first layer through passing context object
- var layerB = new LayerB(layerA);
- layerB.addSessionInfo(SERVICE);
-
- LOGGER.info("Context = {}",layerB.getContext());
-
- //Initiate third layer and preserving information retrieved in first and second layer through passing context object
- var layerC = new LayerC(layerB);
- layerC.addSearchInfo(SERVICE);
-
- LOGGER.info("Context = {}",layerC.getContext());
- }
-}
--- context-object/src/main/java/com/iluwatar/context/object/LayerA.java
@@ -1,17 +0,0 @@
-package com.iluwatar.context.object;
-
-import lombok.Getter;
-
-@Getter
-public class LayerA {
-
- private ServiceContext context;
-
- public LayerA() {
- context = ServiceContextFactory.createContext();
- }
-
- public void addAccountInfo(String accountService) {
- context.setAccountService(accountService);
- }
-}
--- context-object/src/main/java/com/iluwatar/context/object/LayerB.java
@@ -1,17 +0,0 @@
-package com.iluwatar.context.object;
-
-import lombok.Getter;
-
-@Getter
-public class LayerB {
-
- private ServiceContext context;
-
- public LayerB(LayerA layerA) {
- this.context = layerA.getContext();
- }
-
- public void addSessionInfo(String sessionService) {
- context.setSessionService(sessionService);
- }
-}
--- context-object/src/main/java/com/iluwatar/context/object/LayerC.java
@@ -1,17 +0,0 @@
-package com.iluwatar.context.object;
-
-import lombok.Getter;
-
-@Getter
-public class LayerC {
-
- public ServiceContext context;
-
- public LayerC(LayerB layerB) {
- this.context = layerB.getContext();
- }
-
- public void addSearchInfo(String searchService) {
- context.setSearchService(searchService);
- }
-}
--- context-object/src/main/java/com/iluwatar/context/object/ServiceContext.java
@@ -1,16 +0,0 @@
-package com.iluwatar.context.object;
-
-import lombok.Getter;
-import lombok.Setter;
-import lombok.ToString;
-
-/**
- * Where context objects are defined.
- */
-@ToString
-@Getter
-@Setter
-public class ServiceContext {
-
- String AccountService, SessionService, SearchService;
-}
--- context-object/src/main/java/com/iluwatar/context/object/ServiceContextFactory.java
@@ -1,11 +0,0 @@
-package com.iluwatar.context.object;
-
-/**
- * An interface to create context objects passed through layers.
- */
-public class ServiceContextFactory {
-
- public static ServiceContext createContext() {
- return new ServiceContext();
- }
-}
--- context-object/src/test/java/com/iluwatar/contect/object/AppTest.java
@@ -1,17 +0,0 @@
-package com.iluwatar.contect.object;
-
-import com.iluwatar.context.object.App;
-import org.junit.jupiter.api.Test;
-
-import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
-
-public class AppTest {
-
- /**
- * Test example app runs without error.
- */
- @Test
- void shouldExecuteWithoutException() {
- assertDoesNotThrow(() -> App.main(new String[]{}));
- }
-}
--- context-object/src/test/java/com/iluwatar/contect/object/ServiceContextTest.java
@@ -1,66 +0,0 @@
-package com.iluwatar.contect.object;
-
-import com.iluwatar.context.object.LayerA;
-import com.iluwatar.context.object.LayerB;
-import com.iluwatar.context.object.LayerC;
-import com.iluwatar.context.object.ServiceContext;
-import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
-
-import static org.junit.jupiter.api.Assertions.assertSame;
-import static org.junit.jupiter.api.Assertions.assertEquals;
-import static org.junit.jupiter.api.Assertions.assertNull;
-
-/**
- * Date: 10/24/2022 - 3:18
- *
- * @author Chak Chan
- */
-public class ServiceContextTest {
-
- private static final String SERVICE = "SERVICE";
-
- private LayerA layerA;
-
- @BeforeEach
- void initiateLayerA() {
- this.layerA = new LayerA();
- }
-
- @Test
- void testSameContextPassedBetweenLayers() {
- ServiceContext context1 = layerA.getContext();
- var layerB = new LayerB(layerA);
- ServiceContext context2 = layerB.getContext();
- var layerC = new LayerC(layerB);
- ServiceContext context3 = layerC.getContext();
-
- assertSame(context1, context2);
- assertSame(context2, context3);
- assertSame(context3, context1);
- }
-
- @Test
- void testScopedDataPassedBetweenLayers() {
- layerA.addAccountInfo(SERVICE);
- var layerB = new LayerB(layerA);
- var layerC = new LayerC(layerB);
- layerC.addSearchInfo(SERVICE);
- ServiceContext context = layerC.getContext();
-
- assertEquals(SERVICE,context.getAccountService());
- assertNull(context.getSessionService());
- assertEquals(SERVICE,context.getSearchService());
- }
-
- @Test
- void testToString() {
- assertEquals(layerA.getContext().toString(),"ServiceContext(AccountService=null, SessionService=null, SearchService=null)");
- layerA.addAccountInfo(SERVICE);
- assertEquals(layerA.getContext().toString(), "ServiceContext(AccountService=SERVICE, SessionService=null, SearchService=null)");
- var layerB = new LayerB(layerA);
- layerB.addSessionInfo(SERVICE);
- var layerC = new LayerC(layerB);
- assertEquals(layerC.getContext().toString(), "ServiceContext(AccountService=SERVICE, SessionService=SERVICE, SearchService=null)");
- }
-}
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
NEW_FEAT
|
Matched \bfeat(ure)?(#?\d+)?\b in message
|
cdd0eb4c55b46acd08123a7d04389f450a996f4e
|
2025-04-05T18:27:50Z
|
Sinan Sahin
|
[AuthTab] Remove experimental annotation and roll AndroidX

Fixes this roll: https://chromium-review.googlesource.com/c/chromium/src/+/6435310

Bug: 353530443
Change-Id: I22438e4fdd1ea47dbe41759193132d9318f868aa
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6435312
Reviewed-by: Andrew Grieve <[email protected]>
Commit-Queue: Andrew Grieve <[email protected]>
Cr-Commit-Position: refs/heads/main@{#1443120}
| false
| 3
| 30
| 33
|
--- DEPS
@@ -1660,7 +1660,7 @@ deps = {
'packages': [
{
'package': 'chromium/third_party/androidx',
- 'version': 'brlEP-hD-L2HXCYuK64xAWpsFLigTrCr63IEUJFguisC',
+ 'version': '0JMvR6y69sxoozYf2KYxRocmA_cdAm8ncbJmReEWO7MC',
},
],
'condition': 'checkout_android and non_git_source',
--- chrome/android/java/src/org/chromium/chrome/browser/LaunchIntentDispatcher.java
@@ -19,8 +19,6 @@
import android.text.TextUtils;
import androidx.annotation.IntDef;
-import androidx.annotation.OptIn;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsIntent;
import androidx.browser.customtabs.TrustedWebUtils;
import androidx.core.os.BuildCompat;
@@ -264,7 +262,6 @@ private void maybePrefetchDnsInBackground() {
/**
* @return Whether the intent is for launching a Custom Tab.
*/
- @OptIn(markerClass = ExperimentalAuthTab.class)
public static boolean isCustomTabIntent(Intent intent) {
if (intent == null) return false;
Log.w(
--- chrome/android/java/src/org/chromium/chrome/browser/base/SplitCompatCustomTabsService.java
@@ -10,9 +10,7 @@
import android.os.Bundle;
import android.os.IBinder;
-import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsService;
import androidx.browser.customtabs.CustomTabsSessionToken;
import androidx.browser.customtabs.EngagementSignalsCallback;
@@ -29,7 +27,6 @@
* CustomTabsService base class which will call through to the given {@link Impl}. This class must
* be present in the base module, while the Impl can be in the chrome module.
*/
-@OptIn(markerClass = ExperimentalAuthTab.class)
@NullMarked
public class SplitCompatCustomTabsService extends CustomTabsService {
private String mServiceClassName;
--- chrome/android/java/src/org/chromium/chrome/browser/browserservices/ui/controller/AuthTabVerifier.java
@@ -14,11 +14,9 @@
import android.os.SystemClock;
import android.text.TextUtils;
-import androidx.annotation.OptIn;
import androidx.annotation.RequiresApi;
import androidx.annotation.VisibleForTesting;
import androidx.browser.auth.AuthTabIntent;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsService;
import org.chromium.base.CallbackController;
@@ -48,7 +46,6 @@
* Runs Digital Asset Link verification for AuthTab, returns as Activity result for the matching
* redirect URL when navigated to it.
*/
-@OptIn(markerClass = ExperimentalAuthTab.class)
public class AuthTabVerifier implements NativeInitObserver, DestroyObserver {
private static boolean sDelayVerificationForTesting;
--- chrome/android/java/src/org/chromium/chrome/browser/customtabs/AuthTabColorProvider.java
@@ -14,10 +14,8 @@
import androidx.annotation.ColorInt;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
-import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabColorSchemeParams;
import androidx.browser.auth.AuthTabIntent;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsIntent;
import org.chromium.base.Log;
@@ -26,7 +24,6 @@
import org.chromium.ui.util.ColorUtils;
/** {@link ColorProvider} implementation used for Auth Tab. */
-@OptIn(markerClass = ExperimentalAuthTab.class)
public class AuthTabColorProvider implements ColorProvider {
private static final String TAG = "AuthTabColorProvider";
--- chrome/android/java/src/org/chromium/chrome/browser/customtabs/AuthTabIntentDataProvider.java
@@ -11,10 +11,8 @@
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
-import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabIntent;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsIntent;
import org.chromium.base.IntentUtils;
@@ -36,7 +34,6 @@
* re-created when color scheme changes, which happens automatically since color scheme change leads
* to activity re-creation.
*/
-@OptIn(markerClass = ExperimentalAuthTab.class)
public class AuthTabIntentDataProvider extends BrowserServicesIntentDataProvider {
private final @NonNull Intent mIntent;
private final @Nullable String mClientPackageName;
--- chrome/android/java/src/org/chromium/chrome/browser/customtabs/ClientManager.java
@@ -18,9 +18,7 @@
import androidx.annotation.IntDef;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
-import androidx.annotation.OptIn;
import androidx.annotation.VisibleForTesting;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsCallback;
import androidx.browser.customtabs.CustomTabsService;
import androidx.browser.customtabs.CustomTabsService.Relation;
@@ -60,7 +58,6 @@
import java.util.Set;
/** Manages the clients' state for Custom Tabs. This class is threadsafe. */
-@OptIn(markerClass = ExperimentalAuthTab.class)
class ClientManager {
// Values for the "CustomTabs.MayLaunchUrlType" UMA histogram. Append-only.
@IntDef({
--- chrome/android/java/src/org/chromium/chrome/browser/customtabs/CustomTabsConnection.java
@@ -26,7 +26,6 @@
import androidx.annotation.OptIn;
import androidx.annotation.VisibleForTesting;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsCallback;
import androidx.browser.customtabs.CustomTabsIntent;
import androidx.browser.customtabs.CustomTabsService;
@@ -123,7 +122,6 @@
*/
@JNINamespace("customtabs")
@MockedInTests
-@OptIn(markerClass = ExperimentalAuthTab.class)
public class CustomTabsConnection {
private static final String TAG = "ChromeConnection";
private static final String LOG_SERVICE_REQUESTS = "custom-tabs-log-service-requests";
--- chrome/android/java/src/org/chromium/chrome/browser/customtabs/CustomTabsConnectionServiceImpl.java
@@ -10,9 +10,7 @@
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
-import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsService;
import androidx.browser.customtabs.CustomTabsSessionToken;
import androidx.browser.customtabs.EngagementSignalsCallback;
@@ -25,7 +23,6 @@
import java.util.List;
/** Custom tabs connection service, used by the embedded Chrome activities. */
-@OptIn(markerClass = ExperimentalAuthTab.class)
public class CustomTabsConnectionServiceImpl extends CustomTabsConnectionService.Impl {
private CustomTabsConnection mConnection;
private Intent mBindIntent;
--- chrome/browser/android/browserservices/intents/java/src/org/chromium/chrome/browser/browserservices/intents/BrowserCallbackWrapper.java
@@ -10,7 +10,6 @@
import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabCallback;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsCallback;
import androidx.browser.customtabs.CustomTabsSessionToken;
import androidx.browser.customtabs.ExperimentalMinimizationCallback;
@@ -19,7 +18,7 @@
import org.chromium.build.annotations.Nullable;
/** Class that holds either a {@link CustomTabsSessionToken} or {@link AuthTabSessionToken}. */
-@OptIn(markerClass = {ExperimentalAuthTab.class, ExperimentalMinimizationCallback.class})
+@OptIn(markerClass = {ExperimentalMinimizationCallback.class})
@NullMarked
public class BrowserCallbackWrapper {
private final @Nullable CustomTabsCallback mCustomTabsCallback;
--- chrome/browser/android/browserservices/intents/java/src/org/chromium/chrome/browser/browserservices/intents/SessionHolder.java
@@ -6,10 +6,8 @@
import android.content.Intent;
-import androidx.annotation.OptIn;
import androidx.browser.auth.AuthTabIntent;
import androidx.browser.auth.AuthTabSessionToken;
-import androidx.browser.auth.ExperimentalAuthTab;
import androidx.browser.customtabs.CustomTabsSessionToken;
import org.chromium.base.IntentUtils;
@@ -22,7 +20,6 @@
* @param <T> The type of the session; either {@link CustomTabsSessionToken} or {@link
* AuthTabSessionToken}.
*/
-@OptIn(markerClass = ExperimentalAuthTab.class)
@NullMarked
public class SessionHolder<T> {
private final T mSession;
--- third_party/androidx/build.gradle
@@ -296,7 +296,7 @@ repositories {
google()
maven {
// This URL is generated by the fetch_all_androidx.py script.
- url 'https://androidx.dev/snapshots/builds/13314188/artifacts/repository'
+ url 'https://androidx.dev/snapshots/builds/13316133/artifacts/repository'
}
mavenCentral()
}
|
chromium
| null |
C
|
C
| null | null |
Browser
|
_chromium
|
BUG_FIX
|
obvious
|
35320d700dc477d7c23def7493d853f8da967d83
|
2024-05-06 20:42:38
|
Ben Pasquariello
|
Delete Student Societies and Clubs/Interactive Examples/txt
| false
| 0
| 1
| 1
|
--- Student Societies and Clubs/Interactive Examples/txt
@@ -0,0 +1 @@
+
|
awesome-matlab-students
|
mathworks
|
MATLAB
|
MATLAB
| 393
| 42
|
An awesome list of helpful resources for students learning MATLAB & Simulink. List includes tips & tricks, tutorials, videos, cheat sheets, and opportunities to learn MATLAB & Simulink.
|
mathworks_awesome-matlab-students
|
CONFIG_CHANGE
|
text file deleted
|
5e8f9380584b01bf8a029172c716c95411aadce5
| null |
Zaheer Mohiuddin
|
contents: add references to compensation data (#127) Disclosure: I'm one of the makers of Levels.fyi. This is obviously a big red flag that I'm adding a reference to a site I have an interest in. I think the community would find our data useful and more accurate than alternatives. A quick search on Google would bring up that folks regularly consult data on our site over say Glassdoor, etc.
| false
| 1
| 1
| 0
|
--- understanding-compensation.md
@@ -5,7 +5,7 @@ title: Understanding Compensation
Compensation is a huge factor when it comes to deciding between job offers. This section gives you a breakdown of the common components of compensation in the tech industry.
-In most companies, your compensation will consist of base salary, a performance bonus and equity/stocks.
+In most companies, your compensation will consist of base salary, a performance bonus and equity/stocks. For compensation data, check out [Levels.fyi](https://www.levels.fyi/comp.html).
### Base salary
|
yangshun_tech-interview-handbook.json
| null | null | null | null | null | null |
yangshun_tech-interview-handbook.json
|
CONFIG_CHANGE
|
4, Just added more documentation
|
b0b4ec040b9b8f4b7fefdb5d7c3695349dae1d9d
|
2025-03-20 14:17:19
|
A. Unique TensorFlower
|
Replace outdated select() on --cpu in compiler/xla/tsl/BUILD with platform API equivalent. PiperOrigin-RevId: 738711275
| false
| 98
| 132
| 230
|
--- third_party/xla/xla/tsl/BUILD
@@ -83,30 +83,24 @@ config_setting(
config_setting(
name = "macos_x86_64_with_framework_shared_object",
- constraint_values = [
- "@platforms//os:macos",
- "@platforms//cpu:x86_64",
- ],
define_values = {
"framework_shared_object": "true",
},
values = {
"apple_platform_type": "macos",
+ "cpu": "darwin",
},
visibility = ["//visibility:public"],
)
config_setting(
name = "macos_arm64_with_framework_shared_object",
- constraint_values = [
- "@platforms//os:macos",
- "@platforms//cpu:aarch64",
- ],
define_values = {
"framework_shared_object": "true",
},
values = {
"apple_platform_type": "macos",
+ "cpu": "darwin_arm64",
},
visibility = ["//visibility:public"],
)
@@ -131,7 +125,7 @@ config_setting(
config_setting(
name = "android",
constraint_values = if_google(
- ["@platforms//os:android"],
+ ["//third_party/bazel_platforms/os:android"],
[],
),
values = if_oss(
@@ -144,7 +138,7 @@ config_setting(
config_setting(
name = "emscripten",
constraint_values = if_google(
- ["@platforms//os:emscripten"],
+ ["//third_party/bazel_platforms/os:emscripten"],
[],
),
values = if_oss(
@@ -154,26 +148,52 @@ config_setting(
visibility = ["//visibility:public"],
)
+# Sometimes Bazel reports darwin_x86_64 as "darwin" and sometimes as
+# "darwin_x86_64". The former shows up when building on a Mac x86_64 host for a Mac x86_64 target.
+# The latter shows up when cross-compiling for Mac x86_64 from a Mac ARM machine and in internal
+# Google builds.
config_setting(
- name = "macos_x86_64",
- constraint_values = [
- "@platforms//cpu:x86_64",
- "@platforms//os:macos",
- ],
+ name = "macos_x86_64_default",
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:macos"],
+ [],
+ ),
values = {
"apple_platform_type": "macos",
+ "cpu": "darwin",
},
+)
+
+config_setting(
+ name = "macos_x86_64_crosscompile",
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:macos"],
+ [],
+ ),
+ values = {
+ "apple_platform_type": "macos",
+ "cpu": "darwin_x86_64",
+ },
+)
+
+selects.config_setting_group(
+ name = "macos_x86_64",
+ match_any = [
+ ":macos_x86_64_default",
+ ":macos_x86_64_crosscompile",
+ ],
visibility = ["//visibility:public"],
)
config_setting(
name = "macos_arm64",
- constraint_values = [
- "@platforms//cpu:aarch64",
- "@platforms//os:macos",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:macos"],
+ [],
+ ),
values = {
"apple_platform_type": "macos",
+ "cpu": "darwin_arm64",
},
visibility = ["//visibility:public"],
)
@@ -189,28 +209,28 @@ selects.config_setting_group(
config_setting(
name = "windows_x86_64",
- constraint_values = [
- "@platforms//cpu:x86_64",
- "@platforms//os:windows",
- ],
+ values = {"cpu": "x64_windows"},
visibility = ["//visibility:public"],
)
config_setting(
name = "windows_aarch64",
- constraint_values = [
- "@platforms//cpu:aarch64",
- "@platforms//os:windows",
- ],
+ values = {"cpu": "arm64_windows"},
visibility = ["//visibility:public"],
)
config_setting(
name = "windows",
- constraint_values = [
- "@platforms//cpu:x86_64",
- "@platforms//os:windows",
- ],
+ # Internal builds query the target OS.
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:windows"],
+ [],
+ ),
+ # OSS builds query the CPU type.
+ values = if_oss(
+ {"cpu": "x64_windows"},
+ {},
+ ),
visibility = ["//visibility:public"],
)
@@ -218,7 +238,7 @@ config_setting(
config_setting(
name = "ios",
constraint_values = if_google(
- ["@platforms//os:ios"],
+ ["//third_party/bazel_platforms/os:ios"],
[],
),
values = if_oss(
@@ -251,80 +271,80 @@ config_setting(
config_setting(
name = "android_arm",
- constraint_values =
- [
- "@platforms//cpu:armv7",
- "@platforms//os:android",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:android"],
+ [],
+ ),
values = dict(
if_oss(
{"crosstool_top": "//external:android/crosstool"},
),
+ cpu = "armeabi-v7a",
),
visibility = ["//visibility:public"],
)
config_setting(
name = "linux_aarch64",
- constraint_values =
- [
- "@platforms//cpu:aarch64",
- "@platforms//os:linux",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:linux"],
+ [],
+ ),
+ values = {"cpu": "aarch64"},
visibility = ["//visibility:public"],
)
config_setting(
name = "linux_armhf",
- constraint_values =
- [
- "@platforms//cpu:armv7e-mf",
- "@platforms//os:linux",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:linux"],
+ [],
+ ),
+ values = {"cpu": "armhf"},
visibility = ["//visibility:public"],
)
config_setting(
name = "linux_x86_64",
- constraint_values =
- [
- "@platforms//cpu:x86_64",
- "@platforms//os:linux",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:linux"],
+ [],
+ ),
+ values = {"cpu": "k8"},
visibility = ["//visibility:public"],
)
config_setting(
name = "linux_ppc64le",
- constraint_values =
- [
- "@platforms//cpu:ppc64le",
- "@platforms//os:linux",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:linux"],
+ [],
+ ),
+ values = {"cpu": "ppc"},
visibility = ["//visibility:public"],
)
config_setting(
name = "linux_s390x",
- constraint_values =
- [
- "@platforms//cpu:s390x",
- "@platforms//os:linux",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:linux"],
+ [],
+ ),
+ values = {"cpu": "s390x"},
visibility = ["//visibility:public"],
)
config_setting(
name = "ios_x86_64",
- constraint_values =
- [
- "@platforms//cpu:x86_64",
- "@platforms//os:ios",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:ios"],
+ [],
+ ),
values = dict(
if_oss(
{"crosstool_top": "//tools/osx/crosstool:crosstool"},
),
+ cpu = "ios_x86_64",
),
visibility = ["//visibility:public"],
)
@@ -401,7 +421,7 @@ config_setting(
config_setting(
name = "fuchsia",
constraint_values = if_google(
- ["@platforms//os:fuchsia"],
+ ["//third_party/bazel_platforms/os:fuchsia"],
[],
),
values = if_oss(
@@ -415,40 +435,55 @@ config_setting(
# TODO(jakeharmon): Remove equivalent from tensorflow/BUILD
config_setting(
name = "android_x86",
- constraint_values =
- [
- ":x86_any",
- "@platforms//os:android",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:android"],
+ [],
+ ),
values = dict(
if_oss(
{"crosstool_top": "//external:android/crosstool"},
),
+ cpu = "x86",
),
visibility = ["//visibility:public"],
)
-tsl_extra_config_settings()
+config_setting(
+ name = "arm",
+ values = {"cpu": "arm"},
+ visibility = ["//visibility:public"],
+)
-selects.config_setting_group(
- name = "arm_any",
- match_any = [
- "@platforms//cpu:aarch32",
- "@platforms//cpu:aarch64",
- "@platforms//cpu:armv7",
- "@platforms//cpu:armv7-m",
- "@platforms//cpu:armv7e-m",
- "@platforms//cpu:armv7e-mf",
- ] + tsl_extra_config_settings_targets(),
+config_setting(
+ name = "armeabi",
+ values = {"cpu": "armeabi"},
+ visibility = ["//visibility:public"],
+)
+
+config_setting(
+ name = "armeabi-v7a",
+ values = {"cpu": "armeabi-v7a"},
visibility = ["//visibility:public"],
)
+config_setting(
+ name = "arm64-v8a",
+ values = {"cpu": "arm64-v8a"},
+ visibility = ["//visibility:public"],
+)
+
+tsl_extra_config_settings()
+
selects.config_setting_group(
- name = "x86_any",
+ name = "arm_any",
match_any = [
- "@platforms//cpu:x86_32",
- "@platforms//cpu:x86_64",
- ],
+ ":arm",
+ ":armeabi",
+ ":armeabi-v7a",
+ ":arm64-v8a",
+ ":linux_aarch64",
+ ":linux_armhf",
+ ] + tsl_extra_config_settings_targets(),
visibility = ["//visibility:public"],
)
@@ -475,11 +510,13 @@ selects.config_setting_group(
# TODO(jakeharmon): Remove equivalent from tensorflow/BUILD
config_setting(
name = "fuchsia_x86_64",
- constraint_values =
- [
- "@platforms//cpu:x86_64",
- "@platforms//os:fuchsia",
- ],
+ constraint_values = if_google(
+ ["//third_party/bazel_platforms/os:fuchsia"],
+ [],
+ ),
+ values = {
+ "cpu": "x86_64",
+ },
visibility = ["//visibility:public"],
)
@@ -512,10 +549,7 @@ selects.config_setting_group(
config_setting(
name = "freebsd",
- constraint_values = [
- "@platforms//os:freebsd",
- "@platforms//cpu:x86_64",
- ],
+ values = {"cpu": "freebsd"},
visibility = ["//visibility:public"],
)
--- third_party/xla/xla/tsl/tsl.bzl
@@ -222,7 +222,7 @@ def if_nccl(if_true, if_false = []):
return select({
clean_dep("//xla/tsl:no_nccl_support"): if_false,
clean_dep("//xla/tsl:windows"): if_false,
- clean_dep("//xla/tsl:arm_any"): if_false,
+ clean_dep("//xla/tsl:arm"): if_false,
"//conditions:default": if_true,
})
|
tensorflow
|
tensorflow
|
C++
|
C++
| 188,388
| 74,565
|
An Open Source Machine Learning Framework for Everyone
|
tensorflow_tensorflow
|
CODE_IMPROVEMENT
|
outdated function replaced by API equivalent
|
8454b42f947e185a65b2950123493928558f2f5e
|
2025-01-17 23:26:38
|
Patrick Steinhardt
|
meson: wire up the git-subtree(1) command Wire up the git-subtree(1) command, which is part of "contrib/". Note that we have to move around the exact location where we include the "contrib/" subdirectory so that it comes after building the docs so that we have access to some of the common functionality. Signed-off-by: Patrick Steinhardt <[email protected]> Signed-off-by: Junio C Hamano <[email protected]>
| false
| 73
| 1
| 74
|
--- contrib/subtree/meson.build
@@ -1,71 +0,0 @@
-git_subtree = custom_target(
- input: 'git-subtree.sh',
- output: 'git-subtree',
- command: [
- shell,
- meson.project_source_root() / 'generate-script.sh',
- '@INPUT@',
- '@OUTPUT@',
- meson.project_build_root() / 'GIT-BUILD-OPTIONS',
- ],
- install: true,
- install_dir: get_option('libexecdir') / 'git-core',
-)
-
-subtree_test_environment = test_environment
-subtree_test_environment.prepend('PATH', meson.current_build_dir())
-
-test('t7900-subtree', shell,
- args: [ 't7900-subtree.sh' ],
- env: subtree_test_environment,
- workdir: meson.current_source_dir() / 't',
- depends: test_dependencies + bin_wrappers + [ git_subtree ],
- timeout: 0,
-)
-
-if get_option('docs').contains('man')
- subtree_xml = custom_target(
- command: asciidoc_common_options + [
- '--backend=' + asciidoc_docbook,
- '--doctype=manpage',
- '--out-file=@OUTPUT@',
- '@INPUT@',
- ],
- depends: documentation_deps,
- input: 'git-subtree.txt',
- output: 'git-subtree.xml',
- )
-
- custom_target(
- command: [
- xmlto,
- '-m', '@INPUT@',
- 'man',
- subtree_xml,
- '-o',
- meson.current_build_dir(),
- ] + xmlto_extra,
- input: [
- '../../Documentation/manpage-normal.xsl',
- ],
- output: 'git-subtree.1',
- install: true,
- install_dir: get_option('mandir') / 'man1',
- )
-endif
-
-if get_option('docs').contains('html')
- custom_target(
- command: asciidoc_common_options + [
- '--backend=' + asciidoc_html,
- '--doctype=manpage',
- '--out-file=@OUTPUT@',
- '@INPUT@',
- ],
- depends: documentation_deps,
- input: 'git-subtree.txt',
- output: 'git-subtree.html',
- install: true,
- install_dir: get_option('datadir') / 'doc/git-doc',
- )
-endif
--- meson.build
@@ -1857,6 +1857,7 @@ endforeach
if intl.found()
subdir('po')
endif
+subdir('contrib')
# Gitweb requires Perl, so we disable the auto-feature if Perl was not found.
# We make sure further up that Perl is required in case the gitweb option is
@@ -1883,8 +1884,6 @@ if get_option('docs') != []
subdir('Documentation')
endif
-subdir('contrib')
-
foreach key, value : {
'DIFF': diff.full_path(),
'GIT_TEST_CMP': diff.full_path() + ' -u',
|
git
| null |
C
|
C
| null | null |
Version control
|
_git
|
NEW_FEAT
|
Introduce a new functionality
|
6ddf98ce906faf14642586a859ee1de19ae10a0a
|
2024-01-31 14:58:43
|
stev leibelt
|
Add docker compose support to ease setup.
| false
| 85
| 0
| 85
|
--- .gitignore
@@ -1,3 +0,0 @@
-adb.php
-libraries/*
-!libraries/placeholder
--- docker-compose.yml
@@ -1,6 +0,0 @@
-services:
- php-cli:
- image: php:8.2-apache
- restart: unless-stopped
- volumes:
- - .:/var/www/html
--- run.sh
@@ -1,76 +0,0 @@
-#!/bin/bash
-####
-# Setup environment to run application in a docker container
-#
-# @todo:
-#
-#
-# @author: stev leibelt <[email protected]>
-# @since: 2024-01-31
-####
-
-function _build ()
-{
- # stop execution if one comand fails
- set -e
-
- local PATH_OF_THIS_SCRIPT
-
- PATH_OF_THIS_SCRIPT=$(realpath "$(dirname "${0}")")
-
- if [[ ! -f "${PATH_OF_THIS_SCRIPT}"/adb.php ]];
- then
- wget -O "${PATH_OF_THIS_SCRIPT}"/adb.php https://raw.githubusercontent.com/MlgmXyysd/php-adb/master/src/adb.php
- fi
-
- if [[ ! -f "${PATH_OF_THIS_SCRIPT}"/libraries/adb ]];
- then
- wget -O "${PATH_OF_THIS_SCRIPT}"/libraries/tools.zip https://dl.google.com/android/repository/platform-tools_r34.0.5-linux.zip
- unzip -d "${PATH_OF_THIS_SCRIPT}"/libraries "${PATH_OF_THIS_SCRIPT}"/libraries/tools.zip
- rm "${PATH_OF_THIS_SCRIPT}"/libraries/tools.zip
- mv "${PATH_OF_THIS_SCRIPT}"/libraries/platform-tools/* "${PATH_OF_THIS_SCRIPT}"/libraries/
- rmdir "${PATH_OF_THIS_SCRIPT}"/libraries/platform-tools
- fi
-}
-
-function _main ()
-{
- case "${1}" in
- bulid)
- _build
- ;;
- login)
- _login
- ;;
- start)
- _start
- ;;
- stop)
- _stop
- ;;
- *)
- echo "Usage: ${0} {build|login|start|stop}"
- return 1
- ;;
- esac
-}
-
-function _login ()
-{
- _start
- docker compose exec php-cli bash
-}
-
-function _start ()
-{
- _stop
- _build
- docker compose up -d
-}
-
-function _stop ()
-{
- docker compose down
-}
-
-_main "${@}"
|
xiaomi-hyperos-bootloader-bypass
|
mlgmxyysd
|
PHP
|
PHP
| 3,496
| 367
|
A PoC that exploits a vulnerability to bypass the Xiaomi HyperOS community restrictions of BootLoader unlocked account bindings.
|
mlgmxyysd_xiaomi-hyperos-bootloader-bypass
|
NEW_FEAT
|
Obvious
|
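The deleted run.sh above implements a small build/login/start/stop dispatcher via a case statement (note the `bulid)` typo in one of its arms, which makes the real `build` command fall through to the usage branch). A minimal Python sketch of the same dispatch pattern, with hypothetical handler names:

```python
def make_dispatcher(handlers):
    """Return a run(command) function mirroring run.sh's _main case statement.

    handlers maps command names ("build", "login", "start", "stop") to
    callables; any unknown command yields a usage message and exit code 1.
    """
    def run(command):
        handler = handlers.get(command)
        if handler is None:
            return 1, "Usage: run {build|login|start|stop}"
        return 0, handler()
    return run

# Stub handlers stand in for the real _build/_start shell functions.
run = make_dispatcher({
    "build": lambda: "built",
    "start": lambda: "started",
})
print(run("start"))  # (0, 'started')
print(run("oops"))   # exit code 1 plus the usage string
```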
2c9b2d97bece77b0ef8b1c9b2b0bbe45b2ed2d50
|
2025-03-24 11:12:49
|
Claudio Cambra
|
macosx: Fix year string on songs table view
| false
| 8
| 1
| 9
|
--- modules/gui/macosx/library/audio-library/VLCLibraryAudioTableViewDelegate.m
@@ -101,14 +101,7 @@ - (NSView *)tableView:(NSTableView *)tableView
cellText = [@(mediaItem.playCount) stringValue];
} else if ([columnIdentifier isEqualToString:VLCLibrarySongsTableViewYearColumnIdentifier]) {
cellIdentifier = @"VLCLibrarySongsTableViewYearTableCellViewIdentifier";
- if (mediaItem.year == 0) {
- cellText = @"";
- } else {
- NSDate * const yearDate = [NSDate dateWithTimeIntervalSince1970:mediaItem.year];
- NSDateComponents * const components =
- [NSCalendar.currentCalendar components:NSCalendarUnitYear fromDate:yearDate];
- cellText = @(components.year).stringValue;
- }
+ cellText = [@(mediaItem.year) stringValue];
} else {
NSAssert(true, @"Received unknown column identifier %@", columnIdentifier);
}
|
vlc
| null |
C
|
C
| null | null |
Video player
|
_vlc
|
CONFIG_CHANGE
|
Very small changes
|
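The VLC change above swaps between two interpretations of `mediaItem.year`: a literal year formatted directly, versus a Unix timestamp whose year component is extracted. A Python sketch (assuming nothing about which interpretation VLC's media library actually uses) shows why applying the timestamp conversion to a literal year yields "1970" for any plausible value:

```python
from datetime import datetime, timezone

def year_string_as_timestamp(year: int) -> str:
    # Treats the stored value as seconds since 1970-01-01, so a literal
    # year such as 1999 lands about 33 minutes into 1970.
    if year == 0:
        return ""
    return str(datetime.fromtimestamp(year, tz=timezone.utc).year)

def year_string_literal(year: int) -> str:
    # Treats the stored value as the year itself.
    return str(year)

print(year_string_as_timestamp(1999))  # "1970", wrong for a literal year
print(year_string_literal(1999))       # "1999"
```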
0f5fef4fbe04f0dd74ddd0c18af8178dda6270c8
| null |
Vladimir Iakovlev
|
Update README.md
| false
| 2
| 0
| 2
|
--- README.md
@@ -81,6 +81,8 @@ function fuck
end
```
+Changes will available only in a new shell session.
+
## How it works
The Fuck tries to match rule for the previous command, create new command
|
nvbn_thefuck.json
| null | null | null | null | null | null |
nvbn_thefuck.json
|
CONFIG_CHANGE
|
5, Just updated the readme
|
0a982e2bfdd6f72dbe9c0bcb09db9890a314a7af
|
2024-09-18 19:34:29
|
Shadowghost
|
Return empty response instead of not found
| false
| 0
| 5
| 5
|
--- Jellyfin.Api/Controllers/SessionController.cs
@@ -64,6 +64,11 @@ public class SessionController : BaseJellyfinApiController
activeWithinSeconds,
controllableUserToCheck);
+ if (result.Count == 0)
+ {
+ return NotFound();
+ }
+
return Ok(result);
}
|
jellyfin
|
jellyfin
|
C#
|
C#
| 37,617
| 3,375
|
The Free Software Media System - Server Backend & API
|
jellyfin_jellyfin
|
BUG_FIX
|
response changed
|
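The Jellyfin commit message says the sessions endpoint should return an empty response rather than Not Found when no sessions match. A framework-free Python sketch of the two response contracts (hypothetical function names, not Jellyfin's actual API):

```python
def get_sessions_not_found(sessions):
    # Old contract: an empty result short-circuits to 404 Not Found,
    # forcing every client to special-case the error path.
    if len(sessions) == 0:
        return 404, None
    return 200, sessions

def get_sessions_empty(sessions):
    # New contract: always 200, with an empty list when nothing matches,
    # so clients can iterate the body unconditionally.
    return 200, sessions

print(get_sessions_not_found([]))  # (404, None)
print(get_sessions_empty([]))      # (200, [])
```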
4f4ecab3ee15c6b24dcaf221874a6bf1b8566d1a
|
2025-02-08 14:13:32
|
Claudio Cambra
|
macosx: Add inits for VLCFileDragRecognisingView Signed-off-by: Claudio Cambra <[email protected]>
| false
| 27
| 0
| 27
|
--- modules/gui/macosx/views/VLCFileDragRecognisingView.m
@@ -24,33 +24,6 @@
@implementation VLCFileDragRecognisingView
-- (instancetype)init
-{
- self = [super init];
- if (self) {
- [self setupDragRecognition];
- }
- return self;
-}
-
-- (instancetype)initWithCoder:(NSCoder *)coder
-{
- self = [super initWithCoder:coder];
- if (self) {
- [self setupDragRecognition];
- }
- return self;
-}
-
-- (instancetype)initWithFrame:(NSRect)frameRect
-{
- self = [super initWithFrame:frameRect];
- if (self) {
- [self setupDragRecognition];
- }
- return self;
-}
-
- (void)setupDragRecognition
{
[self registerForDraggedTypes:@[NSFilenamesPboardType]];
|
vlc
| null |
C
|
C
| null | null |
Video player
|
_vlc
|
NEW_FEAT
|
probably a new feature since inits are added
|
e34619c24b613b6df7157ddb14a4451b80a551d8
|
2024-10-26 13:29:51
|
Hunter Greer
|
Update 102-big-omega.md (#7587) Fixed inaccuracy with description
| false
| 1
| 1
| 2
|
--- src/data/roadmaps/datastructures-and-algorithms/content/104-algorithmic-complexity/103-asymptotic-notation/102-big-omega.md
@@ -1,3 +1,3 @@
# Big-Ω Notation
-The Big Omega (Ω) notation is used in computer science to describe an algorithm's lower bound. Essentially, it provides a best-case analysis of an algorithm's efficiency, giving us a lower limit of the performance. If we say a function f(n) is Ω(g(n)), it means that from a certain point onwards (n0 for some constant n0), the value of g(n) is a lower bound on f(n). It implies that f(n) is at least as fast as g(n) past a certain threshold. This means that the algorithm won't perform more efficiently than the Ω time complexity suggests.
+The Big Omega (Ω) notation is used in computer science to describe an algorithm's lower bound. Essentially, it provides a worst-case analysis of an algorithm's efficiency, giving us a lower limit of the performance. If we say a function f(n) is Ω(g(n)), it means that from a certain point onwards (n0 for some constant n0), the value of g(n) is a lower bound on f(n). It implies that f(n) is at least as fast as g(n) past a certain threshold. This means that the algorithm won't perform more efficiently than the Ω time complexity suggests.
\ No newline at end of file
|
developer-roadmap
|
kamranahmedse
|
TypeScript
|
TypeScript
| 309,677
| 40,429
|
Interactive roadmaps, guides and other educational content to help developers grow in their careers.
|
kamranahmedse_developer-roadmap
|
DOC_CHANGE
|
changes in md file
|
96a46f9b05ae89e98dd33fe673a27332765df9b0
|
2023-01-31 13:57:36
|
Guide
|
[docs add]乐观锁和悲观锁详解
| false
| 353
| 239
| 592
|
--- docs/.vuepress/sidebar.ts
@@ -102,7 +102,6 @@ export const sidebarConfig = sidebar({
icon: "star",
collapsible: true,
children: [
- "optimistic-lock-and-pessimistic-lock",
"jmm",
"java-thread-pool-summary",
"java-thread-pool-best-practices",
@@ -244,10 +243,10 @@ export const sidebarConfig = sidebar({
icon: "star",
collapsible: true,
children: [
- "mysql-index",
+ "mysql-index",
{
text: "MySQL三大日志详解",
- link: "mysql-logs",
+ link: "mysql-logs",
},
"transaction-isolation-level",
"innodb-implementation-of-mvcc",
--- docs/database/mongodb/mongodb-questions-01.md
@@ -60,7 +60,7 @@ MongoDB 中的记录就是一个 BSON 文档,它是由键值对组成的数据
#### 集合
-MongoDB 集合存在于数据库中,**没有固定的结构**,也就是 **无模式** 的,这意味着可以往集合插入不同格式和类型的数据。不过,通常情况下,插入集合中的数据都会有一定的关联性。
+MongoDB 集合存在于数据库中,**没有固定的结构**,也就是 **无模式** 的,这意味着可以往集合插入不同格式和类型的数据。不过,通常情况相爱插入集合中的数据都会有一定的关联性。

--- docs/java/concurrent/atomic-classes.md
@@ -5,6 +5,7 @@ tag:
- Java并发
---
+
## Atomic 原子类介绍
Atomic 翻译成中文是原子的意思。在化学上,我们知道原子是构成一般物质的最小单位,在化学反应中是不可分割的。在我们这里 Atomic 是指一个操作是不可中断的。即使是在多个线程一起执行的时候,一个操作一旦开始,就不会被其他线程干扰。
@@ -21,41 +22,166 @@ Atomic 翻译成中文是原子的意思。在化学上,我们知道原子是
使用原子的方式更新基本类型
-- `AtomicInteger`:整型原子类
-- `AtomicLong`:长整型原子类
-- `AtomicBoolean` :布尔型原子类
+- AtomicInteger:整型原子类
+- AtomicLong:长整型原子类
+- AtomicBoolean :布尔型原子类
**数组类型**
使用原子的方式更新数组里的某个元素
-- `AtomicIntegerArray`:整型数组原子类
-- `AtomicLongArray`:长整型数组原子类
-- `AtomicReferenceArray` :引用类型数组原子类
+- AtomicIntegerArray:整型数组原子类
+- AtomicLongArray:长整型数组原子类
+- AtomicReferenceArray :引用类型数组原子类
**引用类型**
-- `AtomicReference`:引用类型原子类
-- `AtomicMarkableReference`:原子更新带有标记的引用类型。该类将 boolean 标记与引用关联起来,~~也可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题~~。
-- `AtomicStampedReference` :原子更新带有版本号的引用类型。该类将整数值与引用关联起来,可用于解决原子的更新数据和数据的版本号,可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。
-
-**🐛 修正(参见:[issue#626](https://github.com/Snailclimb/JavaGuide/issues/626))** : `AtomicMarkableReference` 不能解决 ABA 问题。
+- AtomicReference:引用类型原子类
+- AtomicMarkableReference:原子更新带有标记的引用类型。该类将 boolean 标记与引用关联起来,也可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。
+- AtomicStampedReference :原子更新带有版本号的引用类型。该类将整数值与引用关联起来,可用于解决原子的更新数据和数据的版本号,可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。
**对象的属性修改类型**
-- `AtomicIntegerFieldUpdater`:原子更新整型字段的更新器
-- `AtomicLongFieldUpdater`:原子更新长整型字段的更新器
-- `AtomicReferenceFieldUpdater`:原子更新引用类型里的字段
+- AtomicIntegerFieldUpdater:原子更新整型字段的更新器
+- AtomicLongFieldUpdater:原子更新长整型字段的更新器
+- AtomicReferenceFieldUpdater:原子更新引用类型里的字段
+
+> **🐛 修正(参见:[issue#626](https://github.com/Snailclimb/JavaGuide/issues/626))** : `AtomicMarkableReference` 不能解决 ABA 问题。
+
+```java
+ /**
+
+AtomicMarkableReference是将一个boolean值作是否有更改的标记,本质就是它的版本号只有两个,true和false,
+
+修改的时候在这两个版本号之间来回切换,这样做并不能解决ABA的问题,只是会降低ABA问题发生的几率而已
+
+@author : mazh
+
+@Date : 2020/1/17 14:41
+*/
+
+public class SolveABAByAtomicMarkableReference {
+
+ private static AtomicMarkableReference atomicMarkableReference = new AtomicMarkableReference(100, false);
+
+ public static void main(String[] args) {
+
+ Thread refT1 = new Thread(() -> {
+ try {
+ TimeUnit.SECONDS.sleep(1);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+ atomicMarkableReference.compareAndSet(100, 101, atomicMarkableReference.isMarked(), !atomicMarkableReference.isMarked());
+ atomicMarkableReference.compareAndSet(101, 100, atomicMarkableReference.isMarked(), !atomicMarkableReference.isMarked());
+ });
+
+ Thread refT2 = new Thread(() -> {
+ boolean marked = atomicMarkableReference.isMarked();
+ try {
+ TimeUnit.SECONDS.sleep(2);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+ boolean c3 = atomicMarkableReference.compareAndSet(100, 101, marked, !marked);
+ System.out.println(c3); // 返回true,实际应该返回false
+ });
+
+ refT1.start();
+ refT2.start();
+ }
+ }
+```
+
+**CAS ABA 问题**
+
+- 描述: 第一个线程取到了变量 x 的值 A,然后巴拉巴拉干别的事,总之就是只拿到了变量 x 的值 A。这段时间内第二个线程也取到了变量 x 的值 A,然后把变量 x 的值改为 B,然后巴拉巴拉干别的事,最后又把变量 x 的值变为 A (相当于还原了)。在这之后第一个线程终于进行了变量 x 的操作,但是此时变量 x 的值还是 A,所以 compareAndSet 操作是成功。
+- 例子描述(可能不太合适,但好理解): 年初,现金为零,然后通过正常劳动赚了三百万,之后正常消费了(比如买房子)三百万。年末,虽然现金零收入(可能变成其他形式了),但是赚了钱是事实,还是得交税的!
+- 代码例子(以`AtomicInteger`为例)
+
+```java
+import java.util.concurrent.atomic.AtomicInteger;
+
+public class AtomicIntegerDefectDemo {
+ public static void main(String[] args) {
+ defectOfABA();
+ }
+
+ static void defectOfABA() {
+ final AtomicInteger atomicInteger = new AtomicInteger(1);
+
+ Thread coreThread = new Thread(
+ () -> {
+ final int currentValue = atomicInteger.get();
+ System.out.println(Thread.currentThread().getName() + " ------ currentValue=" + currentValue);
+
+ // 这段目的:模拟处理其他业务花费的时间
+ try {
+ Thread.sleep(300);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+
+ boolean casResult = atomicInteger.compareAndSet(1, 2);
+ System.out.println(Thread.currentThread().getName()
+ + " ------ currentValue=" + currentValue
+ + ", finalValue=" + atomicInteger.get()
+ + ", compareAndSet Result=" + casResult);
+ }
+ );
+ coreThread.start();
+
+ // 这段目的:为了让 coreThread 线程先跑起来
+ try {
+ Thread.sleep(100);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+
+ Thread amateurThread = new Thread(
+ () -> {
+ int currentValue = atomicInteger.get();
+ boolean casResult = atomicInteger.compareAndSet(1, 2);
+ System.out.println(Thread.currentThread().getName()
+ + " ------ currentValue=" + currentValue
+ + ", finalValue=" + atomicInteger.get()
+ + ", compareAndSet Result=" + casResult);
+
+ currentValue = atomicInteger.get();
+ casResult = atomicInteger.compareAndSet(2, 1);
+ System.out.println(Thread.currentThread().getName()
+ + " ------ currentValue=" + currentValue
+ + ", finalValue=" + atomicInteger.get()
+ + ", compareAndSet Result=" + casResult);
+ }
+ );
+ amateurThread.start();
+ }
+}
+```
+
+输出内容如下:
+
+```
+Thread-0 ------ currentValue=1
+Thread-1 ------ currentValue=1, finalValue=2, compareAndSet Result=true
+Thread-1 ------ currentValue=2, finalValue=1, compareAndSet Result=true
+Thread-0 ------ currentValue=1, finalValue=2, compareAndSet Result=true
+```
+
+下面我们来详细介绍一下这些原子类。
## 基本类型原子类
+### 基本类型原子类介绍
+
使用原子的方式更新基本类型
-- `AtomicInteger`:整型原子类
-- `AtomicLong`:长整型原子类
-- `AtomicBoolean` :布尔型原子类
+- AtomicInteger:整型原子类
+- AtomicLong:长整型原子类
+- AtomicBoolean :布尔型原子类
-上面三个类提供的方法几乎相同,所以我们这里以 `AtomicInteger` 为例子来介绍。
+上面三个类提供的方法几乎相同,所以我们这里以 AtomicInteger 为例子来介绍。
**AtomicInteger 类常用方法**
@@ -69,23 +195,24 @@ boolean compareAndSet(int expect, int update) //如果输入的数值等于预
public final void lazySet(int newValue)//最终设置为newValue,使用 lazySet 设置之后可能导致其他线程在之后的一小段时间内还是可以读到旧的值。
```
-**`AtomicInteger` 类使用示例** :
+### AtomicInteger 常见方法使用
```java
import java.util.concurrent.atomic.AtomicInteger;
public class AtomicIntegerTest {
- public static void main(String[] args) {
- int temvalue = 0;
- AtomicInteger i = new AtomicInteger(0);
- temvalue = i.getAndSet(3);
- System.out.println("temvalue:" + temvalue + "; i:" + i); //temvalue:0; i:3
- temvalue = i.getAndIncrement();
- System.out.println("temvalue:" + temvalue + "; i:" + i); //temvalue:3; i:4
- temvalue = i.getAndAdd(5);
- System.out.println("temvalue:" + temvalue + "; i:" + i); //temvalue:4; i:9
- }
+ public static void main(String[] args) {
+ // TODO Auto-generated method stub
+ int temvalue = 0;
+ AtomicInteger i = new AtomicInteger(0);
+ temvalue = i.getAndSet(3);
+ System.out.println("temvalue:" + temvalue + "; i:" + i);//temvalue:0; i:3
+ temvalue = i.getAndIncrement();
+ System.out.println("temvalue:" + temvalue + "; i:" + i);//temvalue:3; i:4
+ temvalue = i.getAndAdd(5);
+ System.out.println("temvalue:" + temvalue + "; i:" + i);//temvalue:4; i:9
+ }
}
```
@@ -94,7 +221,7 @@ public class AtomicIntegerTest {
通过一个简单例子带大家看一下基本数据类型原子类的优势
-**1、多线程环境不使用原子类保证线程安全(基本数据类型)**
+**① 多线程环境不使用原子类保证线程安全(基本数据类型)**
```java
class Test {
@@ -110,7 +237,7 @@ class Test {
}
```
-**2、多线程环境使用原子类保证线程安全(基本数据类型)**
+**② 多线程环境使用原子类保证线程安全(基本数据类型)**
```java
class Test2 {
@@ -129,7 +256,7 @@ class Test2 {
### AtomicInteger 线程安全原理简单分析
-`AtomicInteger` 类的部分源码:
+AtomicInteger 类的部分源码:
```java
// setup to use Unsafe.compareAndSwapInt for updates(更新操作时提供“比较并替换”的作用)
@@ -146,21 +273,23 @@ class Test2 {
private volatile int value;
```
-`AtomicInteger` 类主要利用 CAS (compare and swap) + volatile 和 native 方法来保证原子操作,从而避免 synchronized 的高开销,执行效率大为提升。
+AtomicInteger 类主要利用 CAS (compare and swap) + volatile 和 native 方法来保证原子操作,从而避免 synchronized 的高开销,执行效率大为提升。
-CAS 的原理是拿期望的值和原本的一个值作比较,如果相同则更新成新的值。UnSafe 类的 `objectFieldOffset()` 方法是一个本地方法,这个方法是用来拿到“原来的值”的内存地址。另外 value 是一个 volatile 变量,在内存中可见,因此 JVM 可以保证任何时刻任何线程总能拿到该变量的最新值。
+CAS 的原理是拿期望的值和原本的一个值作比较,如果相同则更新成新的值。UnSafe 类的 objectFieldOffset() 方法是一个本地方法,这个方法是用来拿到“原来的值”的内存地址。另外 value 是一个 volatile 变量,在内存中可见,因此 JVM 可以保证任何时刻任何线程总能拿到该变量的最新值。
## 数组类型原子类
+### 数组类型原子类介绍
+
使用原子的方式更新数组里的某个元素
-- `AtomicIntegerArray`:整形数组原子类
-- `AtomicLongArray`:长整形数组原子类
-- `AtomicReferenceArray` :引用类型数组原子类
+- AtomicIntegerArray:整形数组原子类
+- AtomicLongArray:长整形数组原子类
+- AtomicReferenceArray :引用类型数组原子类
-上面三个类提供的方法几乎相同,所以我们这里以 `AtomicIntegerArray` 为例子来介绍。
+上面三个类提供的方法几乎相同,所以我们这里以 AtomicIntegerArray 为例子来介绍。
-**`AtomicIntegerArray` 类常用方法** :
+**AtomicIntegerArray 类常用方法**
```java
public final int get(int i) //获取 index=i 位置元素的值
@@ -172,97 +301,101 @@ boolean compareAndSet(int i, int expect, int update) //如果输入的数值等
public final void lazySet(int i, int newValue)//最终 将index=i 位置的元素设置为newValue,使用 lazySet 设置之后可能导致其他线程在之后的一小段时间内还是可以读到旧的值。
```
-**`AtomicIntegerArray` 类使用示例** :
+### AtomicIntegerArray 常见方法使用
```java
+
import java.util.concurrent.atomic.AtomicIntegerArray;
public class AtomicIntegerArrayTest {
- public static void main(String[] args) {
- int temvalue = 0;
- int[] nums = { 1, 2, 3, 4, 5, 6 };
- AtomicIntegerArray i = new AtomicIntegerArray(nums);
- for (int j = 0; j < nums.length; j++) {
- System.out.println(i.get(j));
- }
- temvalue = i.getAndSet(0, 2);
- System.out.println("temvalue:" + temvalue + "; i:" + i);
- temvalue = i.getAndIncrement(0);
- System.out.println("temvalue:" + temvalue + "; i:" + i);
- temvalue = i.getAndAdd(0, 5);
- System.out.println("temvalue:" + temvalue + "; i:" + i);
- }
+ public static void main(String[] args) {
+ // TODO Auto-generated method stub
+ int temvalue = 0;
+ int[] nums = { 1, 2, 3, 4, 5, 6 };
+ AtomicIntegerArray i = new AtomicIntegerArray(nums);
+ for (int j = 0; j < nums.length; j++) {
+ System.out.println(i.get(j));
+ }
+ temvalue = i.getAndSet(0, 2);
+ System.out.println("temvalue:" + temvalue + "; i:" + i);
+ temvalue = i.getAndIncrement(0);
+ System.out.println("temvalue:" + temvalue + "; i:" + i);
+ temvalue = i.getAndAdd(0, 5);
+ System.out.println("temvalue:" + temvalue + "; i:" + i);
+ }
}
```
## 引用类型原子类
+### 引用类型原子类介绍
+
基本类型原子类只能更新一个变量,如果需要原子更新多个变量,需要使用 引用类型原子类。
-- `AtomicReference`:引用类型原子类
-- `AtomicStampedReference`:原子更新带有版本号的引用类型。该类将整数值与引用关联起来,可用于解决原子的更新数据和数据的版本号,可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。
-- `AtomicMarkableReference` :原子更新带有标记的引用类型。该类将 boolean 标记与引用关联起来,~~也可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。~~
+- AtomicReference:引用类型原子类
+- AtomicStampedReference:原子更新带有版本号的引用类型。该类将整数值与引用关联起来,可用于解决原子的更新数据和数据的版本号,可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。
+- AtomicMarkableReference :原子更新带有标记的引用类型。该类将 boolean 标记与引用关联起来,~~也可以解决使用 CAS 进行原子更新时可能出现的 ABA 问题。~~
-上面三个类提供的方法几乎相同,所以我们这里以 `AtomicReference` 为例子来介绍。
+上面三个类提供的方法几乎相同,所以我们这里以 AtomicReference 为例子来介绍。
-**`AtomicReference` 类使用示例** :
+### AtomicReference 类使用示例
```java
import java.util.concurrent.atomic.AtomicReference;
public class AtomicReferenceTest {
- public static void main(String[] args) {
- AtomicReference < Person > ar = new AtomicReference < Person > ();
- Person person = new Person("SnailClimb", 22);
- ar.set(person);
- Person updatePerson = new Person("Daisy", 20);
- ar.compareAndSet(person, updatePerson);
-
- System.out.println(ar.get().getName());
- System.out.println(ar.get().getAge());
- }
+ public static void main(String[] args) {
+ AtomicReference<Person> ar = new AtomicReference<Person>();
+ Person person = new Person("SnailClimb", 22);
+ ar.set(person);
+ Person updatePerson = new Person("Daisy", 20);
+ ar.compareAndSet(person, updatePerson);
+
+ System.out.println(ar.get().getName());
+ System.out.println(ar.get().getAge());
+ }
}
class Person {
- private String name;
- private int age;
+ private String name;
+ private int age;
- public Person(String name, int age) {
- super();
- this.name = name;
- this.age = age;
- }
+ public Person(String name, int age) {
+ super();
+ this.name = name;
+ this.age = age;
+ }
- public String getName() {
- return name;
- }
+ public String getName() {
+ return name;
+ }
- public void setName(String name) {
- this.name = name;
- }
+ public void setName(String name) {
+ this.name = name;
+ }
- public int getAge() {
- return age;
- }
+ public int getAge() {
+ return age;
+ }
- public void setAge(int age) {
- this.age = age;
- }
+ public void setAge(int age) {
+ this.age = age;
+ }
}
```
-上述代码首先创建了一个 `Person` 对象,然后把 `Person` 对象设置进 `AtomicReference` 对象中,然后调用 `compareAndSet` 方法,该方法就是通过 CAS 操作设置 ar。如果 ar 的值为 `person` 的话,则将其设置为 `updatePerson`。实现原理与 `AtomicInteger` 类中的 `compareAndSet` 方法相同。运行上面的代码后的输出结果如下:
+上述代码首先创建了一个 Person 对象,然后把 Person 对象设置进 AtomicReference 对象中,然后调用 compareAndSet 方法,该方法就是通过 CAS 操作设置 ar。如果 ar 的值为 person 的话,则将其设置为 updatePerson。实现原理与 AtomicInteger 类中的 compareAndSet 方法相同。运行上面的代码后的输出结果如下:
```
Daisy
20
```
-**`AtomicStampedReference` 类使用示例** :
+### AtomicStampedReference 类使用示例
```java
import java.util.concurrent.atomic.AtomicStampedReference;
@@ -321,7 +454,7 @@ currentValue=0, currentStamp=0
currentValue=666, currentStamp=999, wCasResult=true
```
-**`AtomicMarkableReference` 类使用示例** :
+### AtomicMarkableReference 类使用示例
```java
import java.util.concurrent.atomic.AtomicMarkableReference;
@@ -382,17 +515,19 @@ currentValue=true, currentMark=true, wCasResult=true
## 对象的属性修改类型原子类
+### 对象的属性修改类型原子类介绍
+
如果需要原子更新某个类里的某个字段时,需要用到对象的属性修改类型原子类。
-- `AtomicIntegerFieldUpdater`:原子更新整形字段的更新器
-- `AtomicLongFieldUpdater`:原子更新长整形字段的更新器
-- `AtomicReferenceFieldUpdater` :原子更新引用类型里的字段的更新器
+- AtomicIntegerFieldUpdater:原子更新整形字段的更新器
+- AtomicLongFieldUpdater:原子更新长整形字段的更新器
+- AtomicReferenceFieldUpdater :原子更新引用类型里的字段的更新器
要想原子地更新对象的属性需要两步。第一步,因为对象的属性修改类型原子类都是抽象类,所以每次使用都必须使用静态方法 newUpdater()创建一个更新器,并且需要设置想要更新的类和属性。第二步,更新的对象属性必须使用 public volatile 修饰符。
上面三个类提供的方法几乎相同,所以我们这里以 `AtomicIntegerFieldUpdater`为例子来介绍。
-**`AtomicIntegerFieldUpdater` 类使用示例** :
+### AtomicIntegerFieldUpdater 类使用示例
```java
import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;
@@ -443,6 +578,6 @@ class User {
23
```
-## 参考
+## Reference
- 《Java 并发编程的艺术》
--- docs/java/concurrent/java-concurrent-questions-02.md
@@ -12,127 +12,7 @@ head:
content: Java并发常见知识点和面试题总结(含详细解答)。
---
-## 乐观锁和悲观锁
-
-### 什么是悲观锁?使用场景是什么?
-
-悲观锁总是假设最坏的情况,认为共享资源每次被访问的时候就会出现问题(比如共享数据被修改),所以每次在获取资源操作的时候都会上锁,这样其他线程想拿到这个资源就会阻塞直到锁被上一个持有者释放。
-
-也就是说,**共享资源每次只给一个线程使用,其它线程阻塞,用完后再把资源转让给其它线程**。
-
-像 Java 中`synchronized`和`ReentrantLock`等独占锁就是悲观锁思想的实现。
-
-**悲观锁通常多用于写多比较多的情况下(多写场景),避免频繁失败和重试影响性能。**
-
-### 什么是乐观锁?使用场景是什么?
-
-乐观锁总是假设最好的情况,认为共享资源每次被访问的时候不会出现问题,线程可以不停地执行,无需加锁也无需等待,只是在提交修改的时候去验证对应的资源(也就是数据)是否被其它线程修改了(具体方法可以使用版本号机制或 CAS 算法)。
-
-在 Java 中`java.util.concurrent.atomic`包下面的原子变量类就是使用了乐观锁的一种实现方式 **CAS** 实现的。
-
-**乐观锁通常多于写比较少的情况下(多读场景),避免频繁加锁影响性能,大大提升了系统的吞吐量。**
-
-### 如何实现乐观锁?
-
-乐观锁一般会使用版本号机制或 CAS 算法实现,CAS 算法相对来说更多一些,这里需要格外注意。
-
-#### 版本号机制
-
-一般是在数据表中加上一个数据版本号 `version` 字段,表示数据被修改的次数。当数据被修改时,`version` 值会加一。当线程 A 要更新数据值时,在读取数据的同时也会读取 `version` 值,在提交更新时,若刚才读取到的 version 值为当前数据库中的 `version` 值相等时才更新,否则重试更新操作,直到更新成功。
-
-**举一个简单的例子** :假设数据库中帐户信息表中有一个 version 字段,当前值为 1 ;而当前帐户余额字段( `balance` )为 \$100 。
-
-1. 操作员 A 此时将其读出( `version`=1 ),并从其帐户余额中扣除 $50( $100-\$50 )。
-2. 在操作员 A 操作的过程中,操作员 B 也读入此用户信息( `version`=1 ),并从其帐户余额中扣除 $20 ( $100-\$20 )。
-3. 操作员 A 完成了修改工作,将数据版本号( `version`=1 ),连同帐户扣除后余额( `balance`=\$50 ),提交至数据库更新,此时由于提交数据版本等于数据库记录当前版本,数据被更新,数据库记录 `version` 更新为 2 。
-4. 操作员 B 完成了操作,也将版本号( `version`=1 )试图向数据库提交数据( `balance`=\$80 ),但此时比对数据库记录版本时发现,操作员 B 提交的数据版本号为 1 ,数据库记录当前版本也为 2 ,不满足 “ 提交版本必须等于当前版本才能执行更新 “ 的乐观锁策略,因此,操作员 B 的提交被驳回。
-
-这样就避免了操作员 B 用基于 `version`=1 的旧数据修改的结果覆盖操作员 A 的操作结果的可能。
-
-#### CAS 算法
-
-CAS 的全称是 **Compare And Swap(比较与交换)** ,用于实现乐观锁,被广泛应用于各大框架中。CAS 的思想很简单,就是用一个预期值和要更新的变量值进行比较,两值相等才会进行更新。
-
-CAS 是一个原子操作,底层依赖于一条 CPU 的原子指令。
-
-> **原子操作** 即最小不可拆分的操作,也就是说操作一旦开始,就不能被打断,直到操作完成。
-
-CAS 涉及到三个操作数:
-
-- **V** :要更新的变量值(Var)
-- **E** :预期值(Expected)
-- **N** :拟写入的新值(New)
-
-当且仅当 V 的值等于 E 时,CAS 通过原子方式用新值 N 来更新 V 的值。如果不等,说明已经有其它线程更新了V,则当前线程放弃更新。
-
-**举一个简单的例子** :线程 A 要修改变量 i 的值为 6,i 原值为 1(V = 1,E=1,N=6,假设不存在 ABA 问题)。
-
-1. i 与1 进行比较,如果相等, 则说明没被其他线程修改,可以被设置为 6 。
-2. i 与1 进行比较,如果不相等,则说明被其他线程修改,当前线程放弃更新,CAS 操作失败。
-
-当多个线程同时使用 CAS 操作一个变量时,只有一个会胜出,并成功更新,其余均会失败,但失败的线程并不会被挂起,仅是被告知失败,并且允许再次尝试,当然也允许失败的线程放弃操作。
-
-Java 语言并没有直接实现 CAS,CAS 相关的实现是通过 C++ 内联汇编的形式实现的(JNI 调用)。因此, CAS 的具体实现和操作系统以及CPU都有关系。
-
-`sun.misc`包下的`Unsafe`类提供了`compareAndSwapObject`、`compareAndSwapInt`、`compareAndSwapLong`方法来实现的对`Object`、`int`、`long`类型的 CAS 操作
-
-```java
-/**
- * CAS
- * @param o 包含要修改field的对象
- * @param offset 对象中某field的偏移量
- * @param expected 期望值
- * @param update 更新值
- * @return true | false
- */
-public final native boolean compareAndSwapObject(Object o, long offset, Object expected, Object update);
-
-public final native boolean compareAndSwapInt(Object o, long offset, int expected,int update);
-
-public final native boolean compareAndSwapLong(Object o, long offset, long expected, long update);
-```
-
-关于 `Unsafe` 类的详细介绍可以看这篇文章:[Java 魔法类 Unsafe 详解 - JavaGuide - 2022](https://javaguide.cn/java/basis/unsafe.html) 。
-
-### 乐观锁存在哪些问题?
-
-ABA 问题是乐观锁最常见的问题。
-
-#### ABA 问题
-
-如果一个变量 V 初次读取的时候是 A 值,并且在准备赋值的时候检查到它仍然是 A 值,那我们就能说明它的值没有被其他线程修改过了吗?很明显是不能的,因为在这段时间它的值可能被改为其他值,然后又改回 A,那 CAS 操作就会误认为它从来没有被修改过。这个问题被称为 CAS 操作的 **"ABA"问题。**
-
-ABA 问题的解决思路是在变量前面追加上**版本号或者时间戳**。JDK 1.5 以后的 `AtomicStampedReference ` 类就是用来解决 ABA 问题的,其中的 `compareAndSet()` 方法就是首先检查当前引用是否等于预期引用,并且当前标志是否等于预期标志,如果全部相等,则以原子方式将该引用和该标志的值设置为给定的更新值。
-
-```java
-public boolean compareAndSet(V expectedReference,
- V newReference,
- int expectedStamp,
- int newStamp) {
- Pair<V> current = pair;
- return
- expectedReference == current.reference &&
- expectedStamp == current.stamp &&
- ((newReference == current.reference &&
- newStamp == current.stamp) ||
- casPair(current, Pair.of(newReference, newStamp)));
-}
-```
-
-#### 循环时间长开销大
-
-CAS 经常会用到自旋操作来进行重试,也就是不成功就一直循环执行直到成功。如果长时间不成功,会给 CPU 带来非常大的执行开销。
-
-如果 JVM 能支持处理器提供的 pause 指令那么效率会有一定的提升,pause 指令有两个作用:
-
-1. 可以延迟流水线执行指令,使 CPU 不会消耗过多的执行资源,延迟的时间取决于具体实现的版本,在一些处理器上延迟时间是零。
-2. 可以避免在退出循环的时候因内存顺序冲而引起 CPU 流水线被清空,从而提高 CPU 的执行效率。
-
-#### 只能保证一个共享变量的原子操作
-
-CAS 只对单个共享变量有效,当操作涉及跨多个共享变量时 CAS 无效。但是从 JDK 1.5 开始,提供了`AtomicReference`类来保证引用对象之间的原子性,你可以把多个变量放在一个对象里来进行 CAS 操作.所以我们可以使用锁或者利用`AtomicReference`类把多个共享变量合并成一个共享变量来操作。
-
-## JMM(Java 内存模型)
+## JMM(Java Memory Model)
JMM(Java 内存模型)相关的问题比较多,也比较重要,于是我单独抽了一篇文章来总结 JMM 相关的知识点和问题: [JMM(Java 内存模型)详解](./jmm.md) 。
@@ -647,10 +527,6 @@ static class Entry extends WeakReference<ThreadLocal<?>> {
>
> 弱引用可以和一个引用队列(ReferenceQueue)联合使用,如果弱引用所引用的对象被垃圾回收,Java 虚拟机就会把这个弱引用加入到与之关联的引用队列中。
-## Atomic 原子类
-
-Atomic 原子类部分的内容我单独写了一篇文章来总结: [Atomic 原子类总结](./atomic-classes.md) 。
-
## 参考
- 《深入理解 Java 虚拟机》
--- docs/java/concurrent/java-concurrent-questions-03.md
@@ -294,6 +294,10 @@ head:
CPU 密集型简单理解就是利用 CPU 计算能力的任务比如你在内存中对大量数据进行排序。但凡涉及到网络读取,文件读取这类都是 IO 密集型,这类任务的特点是 CPU 计算耗费时间相比于等待 IO 操作完成的时间来说很少,大部分时间都花在了等待 IO 操作完成上。
+## Atomic 原子类
+
+Atomic 原子类部分的内容我单独写了一篇文章来总结: [Atomic 原子类总结](./atomic-classes.md) 。
+
## AQS
### AQS 是什么?
--- docs/java/concurrent/optimistic-lock-and-pessimistic-lock.md
@@ -1,128 +0,0 @@
----
-title: 乐观锁和悲观锁详解
-category: Java
-tag:
- - Java并发
----
-
-如果将悲观锁和乐观锁对应到现实生活中来。悲观锁有点像是一位比较悲观(也可以说是未雨绸缪)的人,总是会假设最坏的情况,避免出现问题。乐观锁有点像是一位比较乐观的人,总是会假设最好的情况,在要出现问题之前快速解决问题。
-
-在程序世界中,乐观锁和悲观锁的最终目的都是为了保证线程安全,避免在并发场景下的资源竞争问题。但是,相比于乐观锁,悲观锁对性能的影响更大!
-
-## 什么是悲观锁?使用场景是什么?
-
-悲观锁总是假设最坏的情况,认为共享资源每次被访问的时候就会出现问题(比如共享数据被修改),所以每次在获取资源操作的时候都会上锁,这样其他线程想拿到这个资源就会阻塞直到锁被上一个持有者释放。
-
-也就是说,**共享资源每次只给一个线程使用,其它线程阻塞,用完后再把资源转让给其它线程**。
-
-像 Java 中`synchronized`和`ReentrantLock`等独占锁就是悲观锁思想的实现。
-
-**悲观锁通常多用于写多比较多的情况下(多写场景),避免频繁失败和重试影响性能。**
-
-## 什么是乐观锁?使用场景是什么?
-
-乐观锁总是假设最好的情况,认为共享资源每次被访问的时候不会出现问题,线程可以不停地执行,无需加锁也无需等待,只是在提交修改的时候去验证对应的资源(也就是数据)是否被其它线程修改了(具体方法可以使用版本号机制或 CAS 算法)。
-
-在 Java 中`java.util.concurrent.atomic`包下面的原子变量类就是使用了乐观锁的一种实现方式 **CAS** 实现的。
-
-**乐观锁通常多于写比较少的情况下(多读场景),避免频繁加锁影响性能,大大提升了系统的吞吐量。**
-
-## 如何实现乐观锁?
-
-乐观锁一般会使用版本号机制或 CAS 算法实现,CAS 算法相对来说更多一些,这里需要格外注意。
-
-### 版本号机制
-
-一般是在数据表中加上一个数据版本号 `version` 字段,表示数据被修改的次数。当数据被修改时,`version` 值会加一。当线程 A 要更新数据值时,在读取数据的同时也会读取 `version` 值,在提交更新时,若刚才读取到的 version 值为当前数据库中的 `version` 值相等时才更新,否则重试更新操作,直到更新成功。
-
-**举一个简单的例子** :假设数据库中帐户信息表中有一个 version 字段,当前值为 1 ;而当前帐户余额字段( `balance` )为 \$100 。
-
-1. 操作员 A 此时将其读出( `version`=1 ),并从其帐户余额中扣除 $50( $100-\$50 )。
-2. 在操作员 A 操作的过程中,操作员 B 也读入此用户信息( `version`=1 ),并从其帐户余额中扣除 $20 ( $100-\$20 )。
-3. 操作员 A 完成了修改工作,将数据版本号( `version`=1 ),连同帐户扣除后余额( `balance`=\$50 ),提交至数据库更新,此时由于提交数据版本等于数据库记录当前版本,数据被更新,数据库记录 `version` 更新为 2 。
-4. 操作员 B 完成了操作,也将版本号( `version`=1 )试图向数据库提交数据( `balance`=\$80 ),但此时比对数据库记录版本时发现,操作员 B 提交的数据版本号为 1 ,数据库记录当前版本也为 2 ,不满足 “ 提交版本必须等于当前版本才能执行更新 “ 的乐观锁策略,因此,操作员 B 的提交被驳回。
-
-这样就避免了操作员 B 用基于 `version`=1 的旧数据修改的结果覆盖操作员 A 的操作结果的可能。
-
-### CAS 算法
-
-CAS 的全称是 **Compare And Swap(比较与交换)** ,用于实现乐观锁,被广泛应用于各大框架中。CAS 的思想很简单,就是用一个预期值和要更新的变量值进行比较,两值相等才会进行更新。
-
-CAS 是一个原子操作,底层依赖于一条 CPU 的原子指令。
-
-> **原子操作** 即最小不可拆分的操作,也就是说操作一旦开始,就不能被打断,直到操作完成。
-
-CAS 涉及到三个操作数:
-
-- **V** :要更新的变量值(Var)
-- **E** :预期值(Expected)
-- **N** :拟写入的新值(New)
-
-当且仅当 V 的值等于 E 时,CAS 通过原子方式用新值 N 来更新 V 的值。如果不等,说明已经有其它线程更新了V,则当前线程放弃更新。
-
-**举一个简单的例子** :线程 A 要修改变量 i 的值为 6,i 原值为 1(V = 1,E=1,N=6,假设不存在 ABA 问题)。
-
-1. i 与1 进行比较,如果相等, 则说明没被其他线程修改,可以被设置为 6 。
-2. i 与1 进行比较,如果不相等,则说明被其他线程修改,当前线程放弃更新,CAS 操作失败。
-
-当多个线程同时使用 CAS 操作一个变量时,只有一个会胜出,并成功更新,其余均会失败,但失败的线程并不会被挂起,仅是被告知失败,并且允许再次尝试,当然也允许失败的线程放弃操作。
-
-Java 语言并没有直接实现 CAS,CAS 相关的实现是通过 C++ 内联汇编的形式实现的(JNI 调用)。因此, CAS 的具体实现和操作系统以及CPU都有关系。
-
-`sun.misc`包下的`Unsafe`类提供了`compareAndSwapObject`、`compareAndSwapInt`、`compareAndSwapLong`方法来实现的对`Object`、`int`、`long`类型的 CAS 操作
-
-```java
-/**
- * CAS
- * @param o 包含要修改field的对象
- * @param offset 对象中某field的偏移量
- * @param expected 期望值
- * @param update 更新值
- * @return true | false
- */
-public final native boolean compareAndSwapObject(Object o, long offset, Object expected, Object update);
-
-public final native boolean compareAndSwapInt(Object o, long offset, int expected,int update);
-
-public final native boolean compareAndSwapLong(Object o, long offset, long expected, long update);
-```
-
-关于 `Unsafe` 类的详细介绍可以看这篇文章:[Java 魔法类 Unsafe 详解 - JavaGuide - 2022](https://javaguide.cn/java/basis/unsafe.html) 。
-
-## 乐观锁存在哪些问题?
-
-ABA 问题是乐观锁最常见的问题。
-
-### ABA 问题
-
-如果一个变量 V 初次读取的时候是 A 值,并且在准备赋值的时候检查到它仍然是 A 值,那我们就能说明它的值没有被其他线程修改过了吗?很明显是不能的,因为在这段时间它的值可能被改为其他值,然后又改回 A,那 CAS 操作就会误认为它从来没有被修改过。这个问题被称为 CAS 操作的 **"ABA"问题。**
-
-ABA 问题的解决思路是在变量前面追加上**版本号或者时间戳**。JDK 1.5 以后的 `AtomicStampedReference ` 类就是用来解决 ABA 问题的,其中的 `compareAndSet()` 方法就是首先检查当前引用是否等于预期引用,并且当前标志是否等于预期标志,如果全部相等,则以原子方式将该引用和该标志的值设置为给定的更新值。
-
-```java
-public boolean compareAndSet(V expectedReference,
- V newReference,
- int expectedStamp,
- int newStamp) {
- Pair<V> current = pair;
- return
- expectedReference == current.reference &&
- expectedStamp == current.stamp &&
- ((newReference == current.reference &&
- newStamp == current.stamp) ||
- casPair(current, Pair.of(newReference, newStamp)));
-}
-```
-
-### 循环时间长开销大
-
-CAS 经常会用到自旋操作来进行重试,也就是不成功就一直循环执行直到成功。如果长时间不成功,会给 CPU 带来非常大的执行开销。
-
-如果 JVM 能支持处理器提供的 pause 指令那么效率会有一定的提升,pause 指令有两个作用:
-
-1. 可以延迟流水线执行指令,使 CPU 不会消耗过多的执行资源,延迟的时间取决于具体实现的版本,在一些处理器上延迟时间是零。
-2. 可以避免在退出循环的时候因内存顺序冲而引起 CPU 流水线被清空,从而提高 CPU 的执行效率。
-
-### 只能保证一个共享变量的原子操作
-
-CAS 只对单个共享变量有效,当操作涉及跨多个共享变量时 CAS 无效。但是从 JDK 1.5 开始,提供了`AtomicReference`类来保证引用对象之间的原子性,你可以把多个变量放在一个对象里来进行 CAS 操作.所以我们可以使用锁或者利用`AtomicReference`类把多个共享变量合并成一个共享变量来操作。
Repository: javaguide | Owner: snailclimb | Language: Java | Stars: 148,495 | Forks: 45,728
Description: 「Java学习+面试指南」一份涵盖大部分 Java 程序员所需要掌握的核心知识。准备 Java 面试,首选 JavaGuide!
Repository key: snailclimb_javaguide | Type: PERF_IMPROVEMENT | Comment: Code change: indexing added

Hash: af2391ad680b04e99c411b591fb13bc3926fe73a | Date: 2022-03-09 12:21:42 | Author: Yangshun
Commit message: website: change fonts approach
IsMerge: false | Additions: 16 | Deletions: 9 | Total Changes: 25
--- website/src/css/custom.css
@@ -1,19 +1,12 @@
@font-face {
- font-family: 'Inter var';
- font-weight: 100 900;
- font-display: swap;
+ font-family: 'Inter';
font-style: normal;
- font-named-instance: 'Regular';
- src: url('./fonts/Inter-roman-latin.var.woff2') format('woff2');
-}
-
-@font-face {
- font-family: 'Inter var';
font-weight: 100 900;
- font-display: swap;
- font-style: italic;
- font-named-instance: 'Italic';
- src: url('./fonts/Inter-italic-latin.var.woff2') format('woff2');
+ font-display: optional;
+ src: url('/fonts/inter-var-latin.woff2') format('woff2');
+ unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA,
+ U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215,
+ U+FEFF, U+FFFD;
}
:root {
@@ -25,9 +18,9 @@
--ifm-color-primary-lighter: #8a8adf;
--ifm-color-primary-lightest: #afafe9;
- --ifm-font-family-base: 'Inter var', -apple-system, BlinkMacSystemFont,
- 'Segoe UI', 'Roboto', 'Oxygen', 'Ubuntu', 'Cantarell', 'Fira Sans',
- 'Droid Sans', 'Helvetica Neue', sans-serif;
+ --ifm-font-family-base: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI',
+ 'Roboto', 'Oxygen', 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans',
+ 'Helvetica Neue', sans-serif;
--ifm-font-size-base: 16px;
--ifm-footer-padding-vertical: 3rem;
--- website/src/css/fonts/Inter-italic-latin.var.woff2
Binary files a/website/src/css/fonts/Inter-italic-latin.var.woff2 and /dev/null differ
--- website/src/css/fonts/Inter-roman-latin.var.woff2
Binary files a/website/src/css/fonts/Inter-roman-latin.var.woff2 and /dev/null differ
--- website/static/fonts/inter-var-latin.woff2
Binary files /dev/null and b/website/static/fonts/inter-var-latin.woff2 differ
|
tech-interview-handbook
|
yangshun
|
TypeScript
|
TypeScript
| 122,353
| 15,039
|
💯 Curated coding interview preparation materials for busy software engineers
|
yangshun_tech-interview-handbook
|
CODE_IMPROVEMENT
|
fonts approach changed
|
7cc67e852e358ffc111201ed2bf081f171b4388d
|
2023-10-11 09:19:40
|
fatedier
|
fix that transport.tls.disableCustomTLSFirstByte doesn't take effect (#3660)
| false
| 27
| 10
| 37
|
--- README.md
@@ -761,8 +761,6 @@ allowPorts = [
`vhostHTTPPort` and `vhostHTTPSPort` in frps can use same port with `bindPort`. frps will detect the connection's protocol and handle it correspondingly.
-What you need to pay attention to is that if you want to configure `vhostHTTPSPort` and `bindPort` to the same port, you need to first set `transport.tls.disableCustomTLSFirstByte` to false.
-
We would like to try to allow multiple proxies bind a same remote port with different protocols in the future.
### Bandwidth Limit
--- Release.md
@@ -1,3 +1,9 @@
-### Fixes
+### Features
-* `transport.tls.disableCustomTLSFirstByte` doesn't have any effect.
+* Configuration: We now support TOML, YAML, and JSON for configuration. Please note that INI is deprecated and will be removed in future releases. New features will only be available in TOML, YAML, or JSON. Users wanting these new features should switch their configuration format accordingly. #2521
+
+### Breaking Changes
+
+* Change the way to start the visitor through the command line from `frpc stcp --role=visitor xxx` to `frpc stcp visitor xxx`.
+* Modified the semantics of the `server_addr` in the command line, no longer including the port. Added the `server_port` parameter to configure the port.
+* No longer support range ports mapping in TOML/YAML/JSON.
--- client/service.go
@@ -476,9 +476,6 @@ func (cm *ConnectionManager) realConnect() (net.Conn, error) {
// Make sure that if it is wss, the websocket hook is executed after the tls hook.
dialOptions = append(dialOptions, libdial.WithAfterHook(libdial.AfterHook{Hook: utilnet.DialHookWebsocket(protocol, tlsConfig.ServerName), Priority: 110}))
default:
- dialOptions = append(dialOptions, libdial.WithAfterHook(libdial.AfterHook{
- Hook: utilnet.DialHookCustomTLSHeadByte(tlsConfig != nil, lo.FromPtr(cm.cfg.Transport.TLS.DisableCustomTLSFirstByte)),
- }))
dialOptions = append(dialOptions, libdial.WithTLSConfig(tlsConfig))
}
--- pkg/util/version/version.go
@@ -19,7 +19,7 @@ import (
"strings"
)
-var version = "0.52.1"
+var version = "0.52.0"
func Full() string {
return version
--- test/e2e/v1/basic/client_server.go
@@ -291,7 +291,7 @@ var _ = ginkgo.Describe("[Feature: Client-Server]", func() {
})
})
- ginkgo.Describe("TLS with disableCustomTLSFirstByte set to false", func() {
+ ginkgo.Describe("TLS with disable_custom_tls_first_byte set to false", func() {
supportProtocols := []string{"tcp", "kcp", "quic", "websocket"}
for _, protocol := range supportProtocols {
tmp := protocol
@@ -322,22 +322,4 @@ var _ = ginkgo.Describe("[Feature: Client-Server]", func() {
})
}
})
-
- ginkgo.Describe("Use same port for bindPort and vhostHTTPSPort", func() {
- supportProtocols := []string{"tcp", "kcp", "quic", "websocket"}
- for _, protocol := range supportProtocols {
- tmp := protocol
- defineClientServerTest("Use same port for bindPort and vhostHTTPSPort: "+strings.ToUpper(tmp), f, &generalTestConfigures{
- server: fmt.Sprintf(`
- vhostHTTPSPort = {{ .%s }}
- %s
- `, consts.PortServerName, renderBindPortConfig(protocol)),
- // transport.tls.disableCustomTLSFirstByte should set to false when vhostHTTPSPort is same as bindPort
- client: fmt.Sprintf(`
- transport.protocol = "%s"
- transport.tls.disableCustomTLSFirstByte = false
- `, protocol),
- })
- }
- })
})
Repository: frp | Owner: fatedier | Language: Go | Stars: 91,116 | Forks: 13,769
Description: A fast reverse proxy to help you expose a local server behind a NAT or firewall to the internet.
Repository key: fatedier_frp | Type: BUG_FIX | Comment: obvious

Hash: d3d175345b98e3b640aab209f365f2a7cbd5ede3 | Date: 2022-01-28 10:51:01 | Author: bannedbook
Commit message: update
IsMerge: false | Additions: 0 | Deletions: 0 | Total Changes: 0
--- docs/vsp.py
Binary files a/docs/vsp.py and b/docs/vsp.py differ
Repository: fanqiang | Owner: bannedbook | Language: Kotlin | Stars: 39,286 | Forks: 7,317
Description: 翻墙-科学上网
Repository key: bannedbook_fanqiang | Type: CONFIG_CHANGE | Comment: probably a config change since some update is done

Hash: 7053d8fd7ab5a2ebf9c1fa5afc09039f6a17f46b | Date: 2023-08-04 21:09:51 | Author: Kingkor Roy Tirtho
Commit message: chore: bump version and generate CHANGELOGS
IsMerge: false | Additions: 33 | Deletions: 3 | Total Changes: 36
--- CHANGELOG.md
@@ -2,30 +2,6 @@
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
-## [3.0.1](https://github.com/KRTirtho/spotube/compare/v3.0.0...v3.1.0) (2023-08-04)
-
-
-### Features
-
-* Force High Refresh Rate on some Android devices ([#607](https://github.com/KRTirtho/spotube/issues/607)) ([6dff099](https://github.com/KRTirtho/spotube/commit/6dff0996bdfee603acf242b1316f8793d625267c))
-* **translations:** add spanish translations ([#585](https://github.com/KRTirtho/spotube/issues/585)) ([042d7a4](https://github.com/KRTirtho/spotube/commit/042d7a4a10c78dd93a56a2f32d18a0fb74dbe697))
-* **translations:** add Simplified Chinese translation. ([#556](https://github.com/KRTirtho/spotube/issues/556)) ([26dbd52](https://github.com/KRTirtho/spotube/commit/26dbd523737d868114a47e82acd412cdae622b7c))
-
-
-### Bug Fixes
-
-* alternative track source textfield safe area ([b8c6d7e](https://github.com/KRTirtho/spotube/commit/b8c6d7eb6ae1c54bdc83a455850dfca0f27bd881))
-* avoid sponsor block for first few seconds to not break the stream ([d8cf2ae](https://github.com/KRTirtho/spotube/commit/d8cf2ae1315dc3848fe1ac12286faafe90fdbed7))
-* cache segments casting error ([dfd60bd](https://github.com/KRTirtho/spotube/commit/dfd60bd4cc0fe8fe90e0cbfd26331df505cde2aa))
-* duration is always zero in PlayerView ([4885dca](https://github.com/KRTirtho/spotube/commit/4885dca04f06658391d1063e6c5a009547391a6f))
-* flags not showing up and html in descriptions ([5a563ef](https://github.com/KRTirtho/spotube/commit/5a563ef4289423ceb5c44ba13f3cfda34b2d16dd))
-* **linux:** crash when no secret service provider found ([#608](https://github.com/KRTirtho/spotube/issues/608)) ([888a4b1](https://github.com/KRTirtho/spotube/commit/888a4b1162c25371d7f6e88fae3a2473cabf1434))
-* login dialog stays after login, mention sp_gaid in tutorial ([b492840](https://github.com/KRTirtho/spotube/commit/b4928405122ae5e5d4d4560f316f2a546a2fabe4))
-* **album_sync**: negative index exception in update palette ([#561](https://github.com/KRTirtho/spotube/issues/561)) ([0089d47](https://github.com/KRTirtho/spotube/commit/0089d471ae6d595e058061e3ac44caecdba12f61))
-* remove adaptive widgets ([#520](https://github.com/KRTirtho/spotube/issues/520)) ([e4cbdd3](https://github.com/KRTirtho/spotube/commit/e4cbdd37479a572198c1ca27fcbbba0232275513))
-* shuffle not working ([#562](https://github.com/KRTirtho/spotube/issues/562)) ([dc76634](https://github.com/KRTirtho/spotube/commit/dc76634a6e4ccdca0f09d63a2db82cce53d950d7))
-* track not skipping to next even when source is available ([0b7affd](https://github.com/KRTirtho/spotube/commit/0b7affdc058c028982266d5c93215697301846bd))
-
## [3.0.0](https://github.com/KRTirtho/spotube/compare/v2.7.1...v3.0.0) (2023-07-02)
--- CONTRIBUTION.md
@@ -131,7 +131,7 @@ Do the following:
```
- Fedora
```bash
- dnf install mpv mpv-devel libappindicator-gtk3 libappindicator-gtk3-devel libsecret libsecret-devel jsoncpp jsoncpp-devel libnotify libnotify-devel NetworkManager
+ dnf install mpv mpv-devel libappindicator-gtk3 libappindicator-gtk3-devel libsecret libsecret-devel jsoncpp jsoncpp-devel libnotify libnotify-devel
```
- Clone the Repo
- Create a `.env` in root of the project following the `.env.example` template
--- aur-struct/.SRCINFO
@@ -10,7 +10,6 @@ pkgbase = spotube-bin
depends = libsecret
depends = jsoncpp
depends = libnotify
- depends = networkmanager
source = https://github.com/KRTirtho/spotube/releases/download/v2.3.0/Spotube-linux-x86_64.tar.xz
md5sums = 8cd6a7385c5c75d203dccd762f1d63ec
--- aur-struct/PKGBUILD
@@ -8,7 +8,7 @@ arch=(x86_64)
url="https://github.com/KRTirtho/spotube/"
license=('BSD-4-Clause')
groups=()
-depends=('mpv' 'libappindicator-gtk3' 'libsecret' 'jsoncpp' 'libnotify' 'networkmanager')
+depends=('mpv' 'libappindicator-gtk3' 'libsecret' 'jsoncpp' 'libnotify')
makedepends=()
checkdepends=()
optdepends=()
--- linux/packaging/deb/make_config.yaml
@@ -16,7 +16,6 @@ dependencies:
- libsecret-1-0
- libnotify-bin
- libjsoncpp25
- - network-manager
essential: false
icon: assets/spotube-logo.png
--- linux/packaging/rpm/make_config.yaml
@@ -12,7 +12,6 @@ requires:
- jsoncpp
- libsecret
- libnotify
- - NetworkManager
display_name: Spotube
--- pubspec.yaml
@@ -3,10 +3,7 @@ description: Open source Spotify client that doesn't require Premium nor uses El
publish_to: "none"
-version: 3.0.1+20
-
-homepage: https://spotube.krtirtho.dev
-repository: https://github.com/KRTirtho/spotube
+version: 3.0.0+19
environment:
sdk: ">=3.0.0 <4.0.0"
Repository: spotube | Owner: krtirtho | Language: Dart | Stars: 35,895 | Forks: 1,491
Description: 🎧 Open source Spotify client that doesn't require Premium nor uses Electron! Available for both desktop & mobile!
Repository key: krtirtho_spotube | Type: CONFIG_CHANGE | Comment: obvious

Hash: 4cdd97210266b6bbfd69f2eaa45b273ca1f0b523 | Date: null | Author: Neal Wu
Commit message: Typo fix for mnist_tpu.py
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 0
--- mnist_tpu.py
@@ -41,7 +41,7 @@ tf.flags.DEFINE_string(
help="Name of the Cloud TPU for Cluster Resolvers. You must specify either "
"this flag or --master.")
-# Model specific paramenters
+# Model specific parameters
tf.flags.DEFINE_string(
"master", default=None,
help="GRPC URL of the master (e.g. grpc://ip.address.of.tpu:8470). You "
Repository: tensorflow_models.json | Owner: null | Language: null | Stars: null | Forks: null
Description: null
Repository key: tensorflow_models.json | Type: CONFIG_CHANGE | Comment: 5, obvious

Hash: 996d1c3242be5569bb4b579b2e3ad25a6d928dfb | Date: null | Author: Jaime Marquínez Ferrándiz
Commit message: Don't include the test/testdata directory in the youtube-dl.tar.gz The last releases included big files that increased the size of the compressed file.
IsMerge: false | Additions: 1 | Deletions: 0 | Total Changes: 1
@@ -71,6 +71,7 @@ youtube-dl.tar.gz: youtube-dl README.md README.txt youtube-dl.1 youtube-dl.bash-
--exclude '*~' \
--exclude '__pycache' \
--exclude '.git' \
+ --exclude 'testdata' \
-- \
bin devscripts test youtube_dl \
CHANGELOG LICENSE README.md README.txt \
Repository: ytdl-org_youtube-dl.json | Owner: null | Language: null | Stars: null | Forks: null
Description: null
Repository key: ytdl-org_youtube-dl.json | Type: PERF_IMPROVEMENT | Comment: 5, obvious

Hash: 8737fc66dd737082d56e0880b5941519defd9bc3 | Date: 2023-07-18 22:02:05 | Author: sjinzh
Commit message: rust : add codes for chapter_divide_and_conquer (#621)
IsMerge: false | Additions: 157 | Deletions: 0 | Total Changes: 157
@@ -334,20 +334,5 @@ path = "chapter_backtracking/preorder_traversal_iii_compact.rs"
name = "preorder_traversal_iii_template"
path = "chapter_backtracking/preorder_traversal_iii_template.rs"
-# Run Command: cargo run --bin binary_search_recur
-[[bin]]
-name = "binary_search_recur"
-path = "chapter_divide_and_conquer/binary_search_recur.rs"
-
-# Run Command: cargo run --bin hanota
-[[bin]]
-name = "hanota"
-path = "chapter_divide_and_conquer/hanota.rs"
-
-# Run Command: cargo run --bin build_tree
-[[bin]]
-name = "build_tree"
-path = "chapter_divide_and_conquer/build_tree.rs"
-
[dependencies]
rand = "0.8.5"
\ No newline at end of file
--- codes/rust/chapter_divide_and_conquer/binary_search_recur.rs
@@ -1,39 +0,0 @@
-/*
- * File: binary_search_recur.rs
- * Created Time: 2023-07-15
- * Author: sjinzh ([email protected])
- */
-
-/* 二分查找:问题 f(i, j) */
-fn dfs(nums: &[i32], target: i32, i: i32, j: i32) -> i32 {
- // 若区间为空,代表无目标元素,则返回 -1
- if i > j { return -1; }
- let m: i32 = (i + j) / 2;
- if nums[m as usize] < target {
- // 递归子问题 f(m+1, j)
- return dfs(nums, target, m + 1, j);
- } else if nums[m as usize] > target {
- // 递归子问题 f(i, m-1)
- return dfs(nums, target, i, m - 1);
- } else {
- // 找到目标元素,返回其索引
- return m;
- }
-}
-
-/* 二分查找 */
-fn binary_search(nums: &[i32], target: i32) -> i32 {
- let n = nums.len() as i32;
- // 求解问题 f(0, n-1)
- dfs(nums, target, 0, n - 1)
-}
-
-/* Driver Code */
-pub fn main() {
- let target = 6;
- let nums = [ 1, 3, 6, 8, 12, 15, 23, 26, 31, 35 ];
-
- // 二分查找(双闭区间)
- let index = binary_search(&nums, target);
- println!("目标元素 6 的索引 = {index}");
-}
--- codes/rust/chapter_divide_and_conquer/build_tree.rs
@@ -1,50 +0,0 @@
-/*
- * File: build_tree.rs
- * Created Time: 2023-07-15
- * Author: sjinzh ([email protected])
- */
-
- use std::{cell::RefCell, rc::Rc};
- use std::collections::HashMap;
- include!("../include/include.rs");
- use tree_node::TreeNode;
-
-/* 构建二叉树:分治 */
-fn dfs(preorder: &[i32], inorder: &[i32], hmap: &HashMap<i32, i32>, i: i32, l: i32, r: i32) -> Option<Rc<RefCell<TreeNode>>> {
- // 子树区间为空时终止
- if r - l < 0 { return None; }
- // 初始化根节点
- let root = TreeNode::new(preorder[i as usize]);
- // 查询 m ,从而划分左右子树
- let m = hmap.get(&preorder[i as usize]).unwrap();
- // 子问题:构建左子树
- root.borrow_mut().left = dfs(preorder, inorder, hmap, i + 1, l, m - 1);
- // 子问题:构建右子树
- root.borrow_mut().right = dfs(preorder, inorder, hmap, i + 1 + m - l, m + 1, r);
- // 返回根节点
- Some(root)
-}
-
-/* 构建二叉树 */
- fn build_tree(preorder: &[i32], inorder: &[i32]) -> Option<Rc<RefCell<TreeNode>>> {
- // 初始化哈希表,存储 inorder 元素到索引的映射
- let mut hmap: HashMap<i32, i32> = HashMap::new();
- for i in 0..inorder.len() {
- hmap.insert(inorder[i], i as i32);
- }
- let root = dfs(preorder, inorder, &hmap, 0, 0, inorder.len() as i32 - 1);
- root
-}
-
- /* Driver Code */
- fn main() {
- let preorder = [ 3, 9, 2, 1, 7 ];
- let inorder = [ 9, 3, 1, 2, 7 ];
- println!("中序遍历 = {:?}", preorder);
- println!("前序遍历 = {:?}", inorder);
-
- let root = build_tree(&preorder, &inorder);
- println!("构建的二叉树为:");
- print_util::print_tree(root.as_ref().unwrap());
- }
-
\ No newline at end of file
--- codes/rust/chapter_divide_and_conquer/hanota.rs
@@ -1,53 +0,0 @@
-/*
- * File: hanota.rs
- * Created Time: 2023-07-15
- * Author: sjinzh ([email protected])
- */
-
-/* 移动一个圆盘 */
-fn move_pan(src: &mut Vec<i32>, tar: &mut Vec<i32>) {
- // 从 src 顶部拿出一个圆盘
- let pan = src.remove(src.len() - 1);
- // 将圆盘放入 tar 顶部
- tar.push(pan);
-}
-
-/* 求解汉诺塔:问题 f(i) */
-fn dfs(i: i32, src: &mut Vec<i32>, buf: &mut Vec<i32>, tar: &mut Vec<i32>) {
- // 若 src 只剩下一个圆盘,则直接将其移到 tar
- if i == 1 {
- move_pan(src, tar);
- return;
- }
- // 子问题 f(i-1) :将 src 顶部 i-1 个圆盘借助 tar 移到 buf
- dfs(i - 1, src, tar, buf);
- // 子问题 f(1) :将 src 剩余一个圆盘移到 tar
- move_pan(src, tar);
- // 子问题 f(i-1) :将 buf 顶部 i-1 个圆盘借助 src 移到 tar
- dfs(i - 1, buf, src, tar);
-}
-
-/* 求解汉诺塔 */
-fn hanota(A: &mut Vec<i32>, B: &mut Vec<i32>, C: &mut Vec<i32>) {
- let n = A.len() as i32;
- // 将 A 顶部 n 个圆盘借助 B 移到 C
- dfs(n, A, B, C);
-}
-
-/* Driver Code */
-pub fn main() {
- let mut A = vec![5, 4, 3, 2, 1];
- let mut B = Vec::new();
- let mut C = Vec::new();
- println!("初始状态下:");
- println!("A = {:?}", A);
- println!("B = {:?}", B);
- println!("C = {:?}", C);
-
- hanota(&mut A, &mut B, &mut C);
-
- println!("圆盘移动完成后:");
- println!("A = {:?}", A);
- println!("B = {:?}", B);
- println!("C = {:?}", C);
-}
Repository: hello-algo | Owner: krahets | Language: Java | Stars: 109,696 | Forks: 13,651
Description: 《Hello 算法》:动画图解、一键运行的数据结构与算法教程。支持 Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, Dart 代码。简体版和繁体版同步更新,English version ongoing
Repository key: krahets_hello-algo | Type: NEW_FEAT | Comment: new code added

Hash: 5b283d6fde6ed668fc99a492d7b3ad08ed225259 | Date: 2024-02-22 20:49:47 | Author: Vinta Chen
Commit message: refine Database Drivers
IsMerge: false | Additions: 7 | Deletions: 9 | Total Changes: 16
--- README.md
@@ -382,23 +382,25 @@ Inspired by [awesome-php](https://github.com/ziadoz/awesome-php).
*Libraries for connecting and operating databases.*
* MySQL - [awesome-mysql](http://shlomi-noach.github.io/awesome-mysql/)
- * [mysqlclient](https://github.com/PyMySQL/mysqlclient) - MySQL connector with Python 3 support ([mysql-python](https://sourceforge.net/projects/mysql-python/) fork).
+ * [mysqlclient](https://github.com/PyMySQL/mysqlclient-python) - MySQL connector with Python 3 support ([mysql-python](https://sourceforge.net/projects/mysql-python/) fork).
* [PyMySQL](https://github.com/PyMySQL/PyMySQL) - A pure Python MySQL driver compatible to mysql-python.
* PostgreSQL - [awesome-postgres](https://github.com/dhamaniasad/awesome-postgres)
- * [psycopg](https://github.com/psycopg/psycopg) - The most popular PostgreSQL adapter for Python.
+ * [psycopg](https://www.psycopg.org/) - The most popular PostgreSQL adapter for Python.
+ * [queries](https://github.com/gmr/queries) - A wrapper of the psycopg2 library for interacting with PostgreSQL.
* SQlite - [awesome-sqlite](https://github.com/planetopendata/awesome-sqlite)
- * [sqlite3](https://docs.python.org/3/library/sqlite3.html) - (Python standard library) SQlite interface compliant with DB-API 2.0.
- * [sqlite-utils](https://github.com/simonw/sqlite-utils) - Python CLI utility and library for manipulating SQLite databases.
+ * [sqlite3](https://docs.python.org/3/library/sqlite3.html) - (Python standard library) SQlite interface compliant with DB-API 2.0
+ * [SuperSQLite](https://github.com/plasticityai/supersqlite) - A supercharged SQLite library built on top of [apsw](https://github.com/rogerbinns/apsw).
* Other Relational Databases
- * [pymssql](https://github.com/pymssql/pymssql) - A simple database interface to Microsoft SQL Server.
+ * [pymssql](https://pymssql.readthedocs.io/en/latest/) - A simple database interface to Microsoft SQL Server.
* [clickhouse-driver](https://github.com/mymarilyn/clickhouse-driver) - Python driver with native interface for ClickHouse.
* NoSQL Databases
* [cassandra-driver](https://github.com/datastax/python-driver) - The Python Driver for Apache Cassandra.
- * [happybase](https://github.com/python-happybase/happybase) - A developer-friendly library for Apache HBase.
+ * [happybase](https://github.com/wbolster/happybase) - A developer-friendly library for Apache HBase.
* [kafka-python](https://github.com/dpkp/kafka-python) - The Python client for Apache Kafka.
* [pymongo](https://github.com/mongodb/mongo-python-driver) - The official Python client for MongoDB.
+ * [redis-py](https://github.com/andymccurdy/redis-py) - The Python client for Redis.
+* Asynchronous Clients
* [motor](https://github.com/mongodb/motor) - The async Python driver for MongoDB.
- * [redis-py](https://github.com/redis/redis-py) - The Python client for Redis.
## Date and Time
|
awesome-python
|
vinta
|
Python
|
Python
| 236,071
| 25,368
|
An opinionated list of awesome Python frameworks, libraries, software and resources.
|
vinta_awesome-python
|
DOC_CHANGE
|
Obvious
|
cc55806a112137976b38ddfb5cbebd971e2c6e9f
| null |
Ritesh Kumar
|
closes #17
| false
| 1
| 1
| 0
|
--- index.js
@@ -91,7 +91,7 @@ class Layout extends React.Component {
<SplitPane
split={props.downPanelInRight ? 'vertical' : 'horizontal'}
primary="second"
- minSize={100}
+ minSize={props.downPanelInRight ? 200 : 100}
defaultSize={downPanelDefaultSize}
resizerChildren={props.downPanelInRight ? vsplit : hsplit}
onDragStarted={onDragStart}
|
storybookjs_storybook.json
| null | null | null | null | null | null |
storybookjs_storybook.json
|
BUG_FIX
|
4, irrelevant commit message, but the changes indicate a bug fix
|
ab492d9b059d57128ea7443a04d4d02e95a20f29
|
2023-03-24 15:41:23
|
Johannes Breuer
|
Add new entries for Misc & Journals Added "New Media & Society" to Journals and a RatSWD publication on Big Data in the Social Sciences as well as the reddit community CompSocial to Miscellaneous
| false
| 3
| 0
| 3
|
--- README.md
@@ -229,7 +229,6 @@ Koç University, Turkey
- [Journal of Computational Social Science](https://www.springer.com/journal/42001)
- [Journal of Quantitative Description: Digital Media](https://journalqd.org)
- [Nature Human Behavior](https://www.nature.com/nathumbehav/)
-- [New Media & Society](https://journals.sagepub.com/home/nms)
- [Social Media and Society](https://journals.sagepub.com/home/sms)
- [Social Science Computer Review](https://journals.sagepub.com/home/ssc)
@@ -293,8 +292,6 @@ Koç University, Turkey
- [Google Group Computational Social Science Network](https://groups.google.com/g/CSSNET)
- [Podcast about Computational Communication Science](https://anchor.fm/ccs-pod)
-- [RatSWD publication "Big data in social, behavioural, and economic sciences: Data access and research data management (Including an expert opinion on "Web scraping in independent academic research")"](https://www.konsortswd.de/en/latest/publication/big-data-in-social-behavioural-and-economic-sciences-data-access-and-research-data-management/)
-- [reddit community "CompSocial"](https://www.reddit.com/r/CompSocial/)
## Relevant Awesome Lists
|
awesome-computational-social-science
|
gesiscss
|
R
|
R
| 648
| 83
|
A list of awesome resources for Computational Social Science
|
gesiscss_awesome-computational-social-science
|
CONFIG_CHANGE
|
Very small changes
|
706adcaa01d1adde1099a3bf78db977b7aa9e427
|
2022-12-24 21:01:45
|
ChenRenJie
|
Fix chain timeout not working on socket connecting (#7567) * Fix Chain#withConnectTimeout can not effect socket connect timeout
* Fix Chain#withReadTimeout can not effect socket soTimeout
| false
| 8
| 3
| 11
|
--- okhttp/src/jvmMain/kotlin/okhttp3/internal/connection/ConnectPlan.kt
@@ -30,7 +30,6 @@ import okhttp3.CertificatePinner
import okhttp3.ConnectionSpec
import okhttp3.Handshake
import okhttp3.Handshake.Companion.handshake
-import okhttp3.Interceptor
import okhttp3.OkHttpClient
import okhttp3.Protocol
import okhttp3.Request
@@ -63,7 +62,6 @@ class ConnectPlan(
// Configuration and state scoped to the call.
private val client: OkHttpClient,
private val call: RealCall,
- private val chain: Interceptor.Chain,
private val routePlanner: RealRoutePlanner,
// Specifics to this plan.
@@ -108,7 +106,6 @@ class ConnectPlan(
return ConnectPlan(
client = client,
call = call,
- chain = chain,
routePlanner = routePlanner,
route = route,
routes = routes,
@@ -252,9 +249,9 @@ class ConnectPlan(
throw IOException("canceled")
}
- rawSocket.soTimeout = chain.readTimeoutMillis()
+ rawSocket.soTimeout = client.readTimeoutMillis
try {
- Platform.get().connectSocket(rawSocket, route.socketAddress, chain.connectTimeoutMillis())
+ Platform.get().connectSocket(rawSocket, route.socketAddress, client.connectTimeoutMillis)
} catch (e: ConnectException) {
throw ConnectException("Failed to connect to ${route.socketAddress}").apply {
initCause(e)
@@ -503,7 +500,6 @@ class ConnectPlan(
return ConnectPlan(
client = client,
call = call,
- chain = chain,
routePlanner = routePlanner,
route = route,
routes = routes,
--- okhttp/src/jvmMain/kotlin/okhttp3/internal/connection/RealRoutePlanner.kt
@@ -39,7 +39,7 @@ class RealRoutePlanner(
private val client: OkHttpClient,
override val address: Address,
private val call: RealCall,
- private val chain: RealInterceptorChain,
+ chain: RealInterceptorChain,
) : RoutePlanner {
private val doExtensiveHealthChecks = chain.request.method != "GET"
@@ -210,7 +210,6 @@ class RealRoutePlanner(
return ConnectPlan(
client = client,
call = call,
- chain = chain,
routePlanner = this,
route = route,
routes = routes,
|
okhttp
|
square
|
Kotlin
|
Kotlin
| 46,179
| 9,194
|
Square’s meticulous HTTP client for the JVM, Android, and GraalVM.
|
square_okhttp
|
CODE_IMPROVEMENT
|
Code change: type annotation added
|
b2311e55e758df763219e7068f18dbcfdf6d2317
|
2022-01-28 12:59:43
|
fatedier
|
add new sponsor logo (#2785)
| false
| 29
| 7
| 36
|
--- README.md
@@ -6,38 +6,27 @@
[README](README.md) | [中文文档](README_zh.md)
+## What is frp?
+
+frp is a fast reverse proxy to help you expose a local server behind a NAT or firewall to the Internet. As of now, it supports **TCP** and **UDP**, as well as **HTTP** and **HTTPS** protocols, where requests can be forwarded to internal services by domain name.
+
+frp also has a P2P connect mode.
+
<h3 align="center">Platinum Sponsors</h3>
<!--platinum sponsors start-->
<p align="center">
<a href="https://www.doppler.com/?utm_campaign=github_repo&utm_medium=referral&utm_content=frp&utm_source=github" target="_blank">
- <img width="450px" src="https://raw.githubusercontent.com/fatedier/frp/dev/doc/pic/sponsor_doppler.png">
+ <img width="400px" src="https://raw.githubusercontent.com/fatedier/frp/dev/doc/pic/sponsor_doppler.png">
</a>
</p>
<!--platinum sponsors end-->
-<h3 align="center">Gold Sponsors</h3>
-<!--gold sponsors start-->
-
-<p align="center">
- <a href="https://workos.com/?utm_campaign=github_repo&utm_medium=referral&utm_content=frp&utm_source=github" target="_blank">
- <img width="300px" src="https://raw.githubusercontent.com/fatedier/frp/dev/doc/pic/sponsor_workos.png">
- </a>
-</p>
-
-<!--gold sponsors end-->
-
<h3 align="center">Silver Sponsors</h3>
* Sakura Frp - 欢迎点击 "加入我们"
-## What is frp?
-
-frp is a fast reverse proxy to help you expose a local server behind a NAT or firewall to the Internet. As of now, it supports **TCP** and **UDP**, as well as **HTTP** and **HTTPS** protocols, where requests can be forwarded to internal services by domain name.
-
-frp also has a P2P connect mode.
-
## Table of Contents
<!-- vim-markdown-toc GFM -->
--- README_zh.md
@@ -18,17 +18,6 @@ frp 是一个专注于内网穿透的高性能的反向代理应用,支持 TCP
<!--platinum sponsors end-->
-<h3 align="center">Gold Sponsors</h3>
-<!--gold sponsors start-->
-
-<p align="center">
- <a href="https://workos.com/?utm_campaign=github_repo&utm_medium=referral&utm_content=frp&utm_source=github" target="_blank">
- <img width="300px" src="https://raw.githubusercontent.com/fatedier/frp/dev/doc/pic/sponsor_workos.png">
- </a>
-</p>
-
-<!--gold sponsors end-->
-
<h3 align="center">Silver Sponsors</h3>
* Sakura Frp - 欢迎点击 "加入我们"
--- doc/pic/sponsor_doppler.png
Binary files a/doc/pic/sponsor_doppler.png and b/doc/pic/sponsor_doppler.png differ
--- doc/pic/sponsor_workos.png
Binary files a/doc/pic/sponsor_workos.png and /dev/null differ
|
frp
|
fatedier
|
Go
|
Go
| 91,116
| 13,769
|
A fast reverse proxy to help you expose a local server behind a NAT or firewall to the internet.
|
fatedier_frp
|
CONFIG_CHANGE
|
image files added
|
73372ad35aafdbdde933e491a6c2b24198581849
|
2023-06-16 19:20:23
|
krahets
|
Update the summary of hashing chapter.
| false
| 14
| 10
| 24
|
--- docs/chapter_hashing/hash_algorithm.md
@@ -221,7 +221,7 @@ $$
## 数据结构的哈希值
-我们知道,哈希表的 `key` 可以是整数、小数或字符串等数据类型。编程语言通常会为这些数据类型提供内置的哈希算法,用于计算哈希表中的桶索引。以 Python 为例,我们可以调用 `hash()` 函数来计算各种数据类型的哈希值,包括:
+我们知道,哈希表的 `key` 可以是整数、小数或字符串等数据类型。编程语言通常会为这些数据类型提供内置的哈希算法 `hash()` ,用于计算哈希表中的桶索引。以 Python 为例:
- 整数和布尔量的哈希值就是其本身。
- 浮点数和字符串的哈希值计算较为复杂,有兴趣的同学请自行学习。
--- docs/chapter_hashing/summary.md
@@ -1,15 +1,11 @@
# 小结
-- 输入 `key` ,哈希表能够在 $O(1)$ 时间内查询到 `value` ,效率非常高。
-- 常见的哈希表操作包括查询、添加键值对、删除键值对和遍历哈希表等。
-- 哈希函数将 `key` 映射为数组索引,从而访问对应桶并获取 `value` 。
-- 两个不同的 `key` 可能在经过哈希函数后得到相同的数组索引,导致查询结果出错,这种现象被称为哈希冲突。
-- 哈希表容量越大,哈希冲突的概率就越低。因此可以通过扩容哈希表来缓解哈希冲突。与数组扩容类似,哈希表扩容操作的开销很大。
-- 负载因子定义为哈希表中元素数量除以桶数量,反映了哈希冲突的严重程度,常用作触发哈希表扩容的条件。
-- 链式地址通过将单个元素转化为链表,将所有冲突元素存储在同一个链表中。然而,链表过长会降低查询效率,可以进一步将链表转换为红黑树来提高效率。
-- 开放寻址通过多次探测来处理哈希冲突。线性探测使用固定步长,缺点是不能删除元素,且容易产生聚集。多次哈希使用多个哈希函数进行探测,相较线性探测更不易产生聚集,但多个哈希函数增加了计算量。
-- 不同编程语言采取了不同的哈希表实现。例如,Java 的 `HashMap` 使用链式地址,而 Python 的 `Dict` 采用开放寻址。
-- 在哈希表中,我们希望哈希算法具有确定性、高效率和均匀分布的特点。在密码学中,哈希算法还应该具备抗碰撞性和雪崩效应。
-- 哈希算法通常采用大质数作为模数,以最大化地保证哈希值的均匀分布,减少哈希冲突。
-- 常见的哈希算法包括 MD5, SHA-1, SHA-2, SHA3 等。MD5 常用语校验文件完整性,SHA-2 常用于安全应用与协议。
-- 编程语言通常会为数据类型提供内置哈希算法,用于计算哈希表中的桶索引。通常情况下,只有不可变对象是可哈希的。
+- 哈希表能够在 $O(1)$ 时间内将键 key 映射到值 value,效率非常高。
+- 常见的哈希表操作包括查询、添加与删除键值对、遍历键值对等。
+- 哈希函数将 key 映射为数组索引(桶索引),从而访问对应的值 value 。
+- 两个不同的 key 可能在经过哈希函数后得到相同的索引,导致查询结果出错,这种现象被称为哈希冲突。
+- 缓解哈希冲突的方法主要有扩容哈希表和优化哈希表的表示方法。
+- 负载因子定义为哈希表中元素数量除以桶数量,反映了哈希冲突的严重程度,常用作触发哈希表扩容的条件。与数组扩容类似,哈希表扩容操作也会产生较大的开销。
+- 链式地址通过将单个元素转化为链表,将所有冲突元素存储在同一个链表中,从而解决哈希冲突。然而,过长的链表会降低查询效率,可以通过将链表转换为 AVL 树或红黑树来改善。
+- 开放寻址通过多次探测来解决哈希冲突。线性探测使用固定步长,缺点是不能删除元素且容易产生聚集。多次哈希使用多个哈希函数进行探测,相对线性探测不易产生聚集,但多个哈希函数增加了计算量。
+- 不同编程语言采取了不同的哈希表实现策略。例如,Java 的 HashMap 使用链式地址,而 Python 的 Dict 采用开放寻址。
|
hello-algo
|
krahets
|
Java
|
Java
| 109,696
| 13,651
|
《Hello 算法》:动画图解、一键运行的数据结构与算法教程。支持 Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, Dart 代码。简体版和繁体版同步更新,English version ongoing
|
krahets_hello-algo
|
DOC_CHANGE
|
change in md files
|
d16ad0c6266d76e9182eb23dd629370ce026fe16
|
2024-10-30 06:42:41
|
Jozef Gaal
|
Translated missing Slovak translations (#1973)
| false
| 8
| 8
| 16
|
--- app/assets/i18n/_missing_translations_sk.json
@@ -6,26 +6,26 @@
"receiveTab": {
"quickSave": {
"off": "@:general.off",
- "favorites": "Obľúbené",
+ "favorites": "Favorites",
"on": "@:general.on"
}
},
"settingsTab": {
"network": {
- "useSystemName": "Použiť systémové meno",
- "generateRandomAlias": "Generovať náhodnú prezývku"
+ "useSystemName": "Use system name",
+ "generateRandomAlias": "Generate random alias"
}
},
"dialogs": {
"openFile": {
- "title": "Otvoriť súbor",
- "content": "Chcete otvoriť prijatý súbor?"
+ "title": "Open file",
+ "content": "Do you want to open the received file?"
},
"quickSaveFromFavoritesNotice": {
"content": [
- "Žiadosti o súbory sa teraz automaticky prijímajú zo zariadení vo vašom zozname obľúbených.",
- "Varovanie! V súčasnosti to nie je úplne bezpečné, pretože hacker, ktorý má odtlačok prsta akéhokoľvek zariadenia zo zoznamu obľúbených, vám môže posielať súbory bez obmedzenia.",
- "Táto možnosť je však stále bezpečnejšia ako povoliť všetkým používateľom v miestnej sieti posielať súbory bez obmedzenia. "
+ "File requests are now accepted automatically from devices in your favorites list.",
+ "Warning! Currently, this is not entirely secure, as a hacker who has the fingerprint of any device from your favorites list can send you files without restriction.",
+ "However, this option is still safer than allowing all users on the local network to send you files without restriction."
]
}
}
|
localsend
|
localsend
|
Dart
|
Dart
| 58,423
| 3,136
|
An open-source cross-platform alternative to AirDrop
|
localsend_localsend
|
CONFIG_CHANGE
|
Very small changes
|
e4995096efee842de7ce26c2f9614ec8e3d0179c
|
2024-10-23 19:43:16
|
Aarushi
|
fix(platform): Fix containerized connection issues with DB Manager (#8412) add missing DB manager host values
| false
| 3
| 0
| 3
|
--- autogpt_platform/docker-compose.platform.yml
@@ -66,7 +66,6 @@ services:
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- EXECUTIONMANAGER_HOST=executor
- - DBMANAGER_HOST=executor
- FRONTEND_BASE_URL=http://localhost:3000
- BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]
ports:
--- autogpt_platform/infra/helm/autogpt-server/values.dev.yaml
@@ -106,7 +106,6 @@ env:
SUPABASE_URL: "https://adfjtextkuilwuhzdjpf.supabase.co"
AGENTSERVER_HOST: "autogpt-server.dev-agpt.svc.cluster.local"
EXECUTIONMANAGER_HOST: "autogpt-server-executor.dev-agpt.svc.cluster.local"
- DBMANAGER_HOST: "autogpt-server-executor.dev-agpt.svc.cluster.local"
secrets:
ANTHROPIC_API_KEY: "AgBllA6KzTdyLs6Tc+HrwIeSjdsPQxdU/4qpqT64H4K3nTehS6kpCW1qtH6eBChs1v+m857sUgsrB9u8+P0aAa3DcgZ/gNg+G1GX6vAY2NJvP/2Q+Hiwi1cAn+R3ChHejG9P2C33hTa6+V9cpUI9xUWOwWLOIQZpLvAc7ltsi0ZJ06qFO0Zhj+H9K768h7U3XaivwywX7PT7BnUTiT6AQkAwD2misBkeSQZdsllOD0th3b2245yieqal9osZHlSlslI9c6EMpH0n+szSND7goyjgsik0Tb0xJU6kGggdcw9hl4x91rYDYNPs0hFES9HUxzfiAid6Y2rDUVBXoNg7K7pMR6/foIkl+gCg/1lqOS0FRlUVyAQGJEx6XphyX/SftgLaI7obaVnzjErrpLWY1ZRiD8VVZD40exf8FddGOXwPvxYHrrrPotlTDLONZMn4Fl46tJCTsoQfHCjco+sz7/nLMMnHx+l1D0eKBuGPVsKTtbWozhLCNuWEgcWb4kxJK5sd1g/GylD43g8hFW531Vbpk1J1rpf7Hurd/aTUjwSXmdxB2qXTT4HRG+Us6PnhMIuf/yxilTs4WNShY0zHhYgnQFSM3oCTL6XXG1dqdOwY2k6+k2wCQtpK45boVN5PpBrQuDuFdWb/jM5jH6L8ns0dMMlY3lHM459u7FEn8rum/xXdP/JvpFb+yct3Rgc54SOT5HuVUNAHzzmbWhY4RG4b3i21L2SlsVUwjKvu+PlN4MN5KPilvHe3yODXZu0Gp0ClzDNZQiKQU67H0uYr6eRccMDsHtMlPELqnjyQZ+OriydzB3qXidAkguKNmzPypz0LyTMnry7YpNRGyUw="
--- autogpt_platform/infra/helm/autogpt-server/values.prod.yaml
@@ -103,7 +103,6 @@ env:
SUPABASE_URL: "https://bgwpwdsxblryihinutbx.supabase.co"
AGENTSERVER_HOST: "autogpt-server.prod-agpt.svc.cluster.local"
EXECUTIONMANAGER_HOST: "autogpt-server-executor.prod-agpt.svc.cluster.local"
- DBMANAGER_HOST: "autogpt-server-executor.prod-agpt.svc.cluster.local"
secrets:
ANTHROPIC_API_KEY: "AgCf0CUyhYluMW13zvdScIzF50c1u4P3sUkKZjwe2lJGil/WrxN1r+GQGoLzjMn8ODANV7FiJN2+Y+ilVgpf0tVA9uEWLCL/OguNshRYWfNfU0PCgciXvz+Cy8xILfJW5SIZvZgDV5zMbzXeBomJYq+qFpr+PRyiIzA6ciHK/ZuItcGBB0FMdJ6w2gvAlLTFmAK0ekyXTzYidPEkBp+DA4jJXuzjXGd4U8iC4IcrSs/o0eaqfMQSOBRc7w/6SK+YDUnWypc2awBX4qNwqKbQRYAT59lihy/B0D4BhjjiUb2bAlzNWP0STsJONrOPbnHzuvipm1xpk+1bdYFpkqJAf9rk9GOPAMfB5f/kOdmaoj9jdQN55NIomSzub+KnSGt+m4G6YlEnUf2ZBZTKTeWO1jzk0gnzrdFZclPq/9Dd0qUBsZ/30KjbBRJyL9SexwxpfMoaf6dKJHcsOdOevaCpMQZaQ/AjcFZRtntw8mLALJzTZbTq7Gb6h25blwe1Oi6DrOuTrWT+OMHeUJcDQA3q1rJERa4xV0wLjYraCTerezhZgjMvfRD1Ykm5S+1U9hzsZUZZQS6OEEIS0BaOfYugt3DiFSNLrIUwVcYbl5geLoiMW6oSukEeb4s2AukRqKkMYz8/stjCgJB2NiarVi2NIaDvgaXWLgJxNxxovgtHyS4RR8WpRPdWJdjAs6RH13ve42a35S2m65jvUNg875GSO8Eo1izYH6q2LvJgGmlTfMworP6O2ryZO9tBjNS58UYxM8EqvtXLVktA0TYlK7wlF2NzA/waIMmiOiKJrb8YnQF28ePxYnmQSqqe2ZpwSiDBsDNrzfZvvTk9Ai81qu8="
|
autogpt
|
significant-gravitas
|
Python
|
Python
| 172,255
| 45,197
|
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
|
significant-gravitas_autogpt
|
BUG_FIX
|
Obvious
|
c583d32a74149a2ef3c38d2dcd64395ff8580d26
|
2024-07-03 12:26:28
|
jbengler
|
articles
| false
| 54
| 34
| 88
|
--- _pkgdown.yml
@@ -9,14 +9,14 @@ authors:
figures:
dpi: 300
fig.width: 5
- fig.height: 5
+ fig.height: 2.5
articles:
- title: Articles
navbar: ~
contents:
- - articles/Visualizing-data
- articles/Advanced-plotting
+ - articles/Visualizing-data
- articles/Color-schemes
- articles/Design-principles
--- vignettes/articles/Visualizing-data.Rmd
@@ -23,7 +23,7 @@ library(tidyplots)
# Data points
-Plotting the raw data points is probably the most bare bone way to visualize a dataset. The corresponding function in tidyplots is called `add_data_points()`.
+Plotting the raw data points is probably the most bare bone way to visualize a dataset. the corresponding function in tidyplots is called `add_data_points()`.
```{r}
animals %>%
@@ -47,7 +47,7 @@ animals %>%
add_data_points(alpha = 0.3)
```
-Or to change the plotting symbol to an open `shape`.
+Or by changing the plotting symbol to an open `shape`.
```{r}
animals %>%
@@ -81,9 +81,9 @@ study %>%
# Amounts
-For some datasets, it makes sense to `count` or `sum` up data points in order to arrive to a conclusion. As one example, let's have a look at the `spendings` dataset.
+For some datasets, it makes sense to `count` or `sum` up, data points in order to arrive to a conclusion. As one example, let's have a look at the `spendings` dataset.
-```{r, results='markup'}
+```{r results='markup'}
spendings
```
@@ -139,7 +139,7 @@ spendings %>%
remove_x_axis_ticks()
```
-Note that besides the x axis labels, I also removed the x axis ticks and x axis title to achieve a cleaner look.
+Note that besides the x axis labels, I also removed the x axis ticks and x axis title to achieve a clean look.
Of course you are free to play around with different graphical representations of the sum values. Here is an example of a lollipop plot constructed from a thin `bar` and a `dot`.
@@ -161,7 +161,7 @@ I also added the sum value as text label using the `add_sum_value()` function.
Heatmaps are a great way to plot a _continuous variable_ across to two _discrete variables_. To exemplify this, we will have a look at the `gene_expression` dataset.
-```{r, results='markup'}
+```{r results='markup'}
gene_expression %>%
dplyr::glimpse()
```
@@ -174,64 +174,44 @@ gene_expression %>%
add_heatmap()
```
-One thing to note here is that the y axis labeks are overlapping. So let's increase the height of the plot area from 50 to 100 mm.
+One thing to note here is that the y axis labeks are overlapping. So let's increase the height of the plot area from 50 to 80 mm.
-```{r, fig.asp=0.9}
+```{r fig.height=5}
gene_expression %>%
tidyplot(x = sample, y = external_gene_name, color = expression) %>%
add_heatmap() %>%
- adjust_plot_area_size(height = 100)
+ adjust_plot_area_size(height = 80)
```
The next thing to note is that some of the rows like _Map1a_ and _Kif1a_ show very high expression while others show much lower values. Let's apply a classical technique to reserve the variations in the color for differences within each row. This is done by calculating _row z scores_ for each row individually. Luckily, tidyplots does this for us when setting the parameter `scale = "row"` withing the `add_heatmap()` function call.
-```{r, fig.asp=0.9}
+```{r}
gene_expression %>%
tidyplot(x = sample, y = external_gene_name, color = expression) %>%
add_heatmap(scale = "row") %>%
- adjust_plot_area_size(height = 100)
+ adjust_plot_area_size(height = 80)
```
Now it much easier to appreciate the dynamics of individual genes across the samples on the x axis.
However, the rows appear to be mixed. Some having rather high expression in the "Eip" samples while others have high value in the "Hip" samples. Conveniently, in the dataset there is a variable called `direction`, which is either "up" or "down". Let's use this variable to sort our y axis.
-```{r, fig.asp=0.9}
+```{r}
gene_expression %>%
tidyplot(x = sample, y = external_gene_name, color = expression) %>%
add_heatmap(scale = "row") %>%
- adjust_plot_area_size(height = 100) %>%
+ adjust_plot_area_size(height = 80) %>%
sort_y_axis_labels(direction)
```
+This starts looking like a classical gene expression heatmap
+
# Central tendency
```{r}
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute()
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_seaside)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_candy)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_pastel)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_circle)
```
# Dispersion & uncertainty
@@ -475,6 +455,9 @@ energy_week %>%
```
+
+# Uncertainty
+
```{r}
study %>%
tidyplot(x = treatment, y = score, color = treatment) %>%
@@ -496,6 +479,9 @@ study %>%
```
+
+# x--y relationships
+
```{r}
energy_week %>%
--- vignettes/tidyplots.Rmd
@@ -137,7 +137,7 @@ Although there are many more `add_*()` functions available, I will stop here and
## Remove
-Besides adding plot elements, you might want to remove certain parts of the plot. This can be achieved by the `remove_*()` family of functions. For example, you might want to remove the color legend title, or in some rare cases even the entire y axis.
+Besides adding plot elements, you might want to remove certain parts of the plot. This can be archived by the `remove_*()` family of functions. For example, you might want to remove the color legend title, or in some rare cases even the entire y axis.
```{r}
study %>%
@@ -194,7 +194,7 @@ study %>%
Note that I removed the legend title by setting it to an empty string `legend_title = ""`. This is alternative to `remove_legend_title()`, however the result is not exactly the same. I am sure you will figure out the difference.
-When it comes to scientific plots, titles often contain special characters like Greek symbols, subscript or superscript. For this purpose, tidyplots supports _plotmath expressions_. Besides finding out how to use the [plotmath expression syntax](https://www.rdocumentation.org/packages/grDevices/versions/3.6.2/topics/plotmath), please note that in tidyplots all plotmath expression need to start and end with a `$` character. Moreover, you can not mix plotmath with plain text in one string, instead the entire string needs to be a plotmath expression if you want to use special characters.
+When it comes to scientific plots, titles often contain special characters like Greek characters, subscript or superscript. For this purpose, tidyplots supports _plotmath expressions_. Besides finding out how to use the [plotmath expression syntax](https://www.rdocumentation.org/packages/grDevices/versions/3.6.2/topics/plotmath), please note that in tidyplots all plotmath expression need to start and end with a `$` character. Moreover, you can not mix plotmath with plain text in one string, instead the entire string needs to be a plotmath expression if you want to use special characters.
```{r}
study %>%
@@ -230,11 +230,11 @@ study %>%
adjust_colors(new_colors = colors_discrete_seaside)
```
-**Rename, reorder, sort, and reverse**
+##### Rename, reorder, sort, and reverse
-A special group of adjust functions deals with the _data labels_ in your plot. These function are special because they need to modify the underlying data of the plot. Moreover, they do not start with `adjust_` but with `rename_`, `reorder_`, `sort_`, and `reverse_`.
+A special type of adjust function deals with the _data labels_ in your plot. These function are special because they need to modify the underlying data of the plot. Moreover, they do not start with `adjust_` but with `rename_`, `reorder_`, `sort_`, and `reverse_`.
-For example, to rename the data labels for the `treatment` variable on the x axis, you can do this.
+For example, to rename the data labels for the treatment variable on the x axis, you can do this.
```{r}
study %>%
@@ -250,7 +250,7 @@ study %>%
Note that we provide a _named character vector_ to make it clear which old label should be replace with which new label.
-The remaining functions, starting with `reorder_`, `sort_`, and `reverse_`, do not change the name of the label but their order in the plot.
+The functions starting with `reorder_`, `sort_`, and `reverse_` do not change the name of the label but their order in the graph.
For example, you can bring the treatment "D" to the front.
@@ -285,11 +285,11 @@ study %>%
reverse_x_axis_labels()
```
-Of course, there are many more `adjust_` functions that you can find in the [Function reference](https://jbengler.github.io/tidyplots/reference/index.html).
+Of course, there are many more `adjust_` funtions that you can find in the [Function reference](https://jbengler.github.io/tidyplots/reference/index.html).
## Themes
-Themes are a great way to modify the look an feel of your plot without changing the representation of the data. You can stay with the default tidyplots theme.
+Themes are a great way to modify the look an feel of you plot without changing the representation of the data. You can stay with the default tidyplots theme.
```{r}
study %>%
@@ -340,7 +340,7 @@ study %>%
## Output
-The classical way to output a plot is to write it to a `PDF` or `PNG` file. This can be easily done by piping the plot into the function `save_plot()`.
+The classical way to output a plot is to write it to a `PDF` or `PNG` file. This can be easily done by piping the plot into the function `save_plot()`. Conveniently, `save_plot()` also gives back the plot it received, allowing it to be used in the middle of a pipeline. If `save_plot()` is a the end of pipeline, the plot will still be rendered on screen, providing a visual confirmation of what was saved to file.
```{r eval=FALSE}
study %>%
@@ -351,25 +351,19 @@ study %>%
save_plot("my_plot.pdf")
```
-Conveniently, `save_plot()` also gives back the plot it received, allowing it to be used in the middle of a pipeline. If `save_plot()` is a the end of pipeline, the plot will still be rendered on screen, providing a visual confirmation of what was saved to file.
-
# What's more?
This getting started guide is meant to give a high level overview of the tidyplots workflow. To dive deeper into more specific aspects of tidyplots, here a couple of resources.
-## Reference
-
- [Function reference](https://jbengler.github.io/tidyplots/reference/index.html)
A great overview of all tidyplots functions
-## Articles
+- [Advanced plotting](https://jbengler.github.io/tidyplots/articles/Advanced-plotting.html)
+An article about advanced plotting techniques and workflows
- [Visualizing data](https://jbengler.github.io/tidyplots/articles/Visualizing-data.html)
An article with examples for common data visualizations
-- [Advanced plotting](https://jbengler.github.io/tidyplots/articles/Advanced-plotting.html)
-An article about advanced plotting techniques and workflows
-
- [Color schemes](https://jbengler.github.io/tidyplots/articles/Color-schemes.html)
An article about the use of color schemes in tidyplots
|
tidyplots
|
jbengler
|
R
|
R
| 495
| 18
|
Tidy Plots for Scientific Papers
|
jbengler_tidyplots
|
DOC_CHANGE
|
Obvious
|
e997631b25ea496bb42ded198e71fc0bc10d7645
|
2023-09-28 03:19:48
|
Tom Hacohen
|
Update spec according to changes in the README (#12) The changes were originally made in #7
| false
| 7
| 7
| 14
|
--- spec/standard-webhooks.md
@@ -12,19 +12,19 @@ License: The Apache License, Version 2.0.
## Introduction
-Webhooks are becoming increasingly popular and are used by many of the world's top companies for sending events to users of their APIs. However, the ecosystem is fragmented, with each webhook provider using different implementations and varying quality. Even high quality implementations vary, making them inherently incompatible. This fragmentation is a pain for the providers and consumers, stifling innovation.
+Webhooks are becoming increasingly popular, and are used by many of the world's top companies to notify users of their APIs of events. Implementations, however, are heavily fragmented, and every webhooks provider implements things differently and with varying quality. In addition, even the higher quality implementations are different to one another which means they are inherently incompatible. This fragmentation is a pain for the whole ecosystem, both providers and consumers, is wasteful, and is holding back innovation.
-For consumers, this means handling webhooks differently for every provider, relearning how to verify webhooks, and encountering gotchas with bespoke implementations. For providers, this means reinventing the wheel, redesigning for issues that have already been solved (security, forward compatibility, etc.).
+For consumers this means having to implement webhook handling differently for every provider, having to relearn how to verify webhooks, and encounter many gotchas with weird implementations. For providers this means reinventing the wheel every time, and making costly mistakes around issues that have already been solved elsewhere (security, forward compatibility, etc.). It also holds the ecosystem back as a whole, as these incompatibilities mean that no tools are being built to help senders send, consumers consume, and for everyone to innovate on top.
-We propose a simple solution: standardize webhooks across the industry. This design document outlines our proposal, a set of strict webhook guidelines based on the existing industry best practices. We call it "Standard Webhooks".
+The solution is simple: have a standard way of implementing webhooks. This design document aims to outline exactly that, a set of strict webhook guidelines based on the existing industry best practices; we call it "Standard Webhooks".
-We believe "Standard Webhooks" can do for webhooks what JWT did for API authentication. Adopting a common protocol that is consistent and supported by different implementations will solve the above issues, and will enable new tools and innovations in webhook ecosystem.
+We believe "Standard Webhooks" can do to webhooks what JWT did to API authentication. Having a common protocol that is consistent and supported by different implementations will solve the above issues, and will usher in an era of new tools and innovations in the world of webhooks.
-To achieve this, we have created an open source and community-driven set of tools and guidelines for sending webhooks.
+To achieve this, we have created a fully open source and community driven set of tools and guidelines for sending webhooks. Part of which is the document you are currently reading.
-## What are Webhooks?
+## What are webhooks?
-Webhooks are a common name for HTTP callbacks, and are a way for services to notify each other of events. Webhooks are part of a service's API, though you can think of them as a sort of a "reverse API". When a client wants to make a request to a service they make an API call, and when the service wants to notify the client of an event the service triggers a webhook ("a user has paid", "task has finished", etc.).
+Webhooks are a common name for HTTP callbacks, and are how services notify each other of events. Webhooks are part of a service's API, though you can think of them as a sort of a reverse API. When a client wants to make a request to a service they make an API call, and when the service wants to notify the client of an event the service triggers a webhook ("a user has paid", "task has finished", etc.).
Webhooks are server-to-server, in the sense that both the customer and the service in the above description, should be operating HTTP servers, one to receive the API calls and one to receive the webhooks.It's important to note that while webhooks usually co-exist with a traditional API, this is not a requirement, and some services send webhooks without offering a traditional API.
|
standard-webhooks
|
standard-webhooks
|
Elixir
|
Elixir
| 1,390
| 37
|
The Standard Webhooks specification
|
standard-webhooks_standard-webhooks
|
CODE_IMPROVEMENT
|
Obvious
|
9892ade4e04fe04ef2b284f5553cbfcbd9d5859e
|
2023-10-27 00:54:25
|
Leonard Hecker
|
COOKED_READ: Fix reference counting woes (#16187) This restores the original code from before 821ae3a where the `.GetMainBuffer()` call was accidentally removed. Closes #16158 ## Validation Steps Performed * Run this Python script: ```py import sys while True: sys.stdout.write("\033[?1049h") sys.stdout.flush() sys.stdin.readline() sys.stdout.write("\033[?1049l") ``` * Press enter repeatedly * Doesn't crash ✅ (cherry picked from commit 08f30330d15e22adf23ee04164fc2474f290348a) Service-Card-Id: 90861143 Service-Version: 1.19
| false
| 43
| 1
| 44
|
--- .github/actions/spelling/expect/expect.txt
@@ -301,7 +301,6 @@ coordnew
COPYCOLOR
CORESYSTEM
cotaskmem
-countof
CPG
cpinfo
CPINFOEX
--- src/host/ft_host/API_BufferTests.cpp
@@ -3,8 +3,6 @@
#include "precomp.h"
-#include "../../types/inc/IInputEvent.hpp"
-
extern "C" IMAGE_DOS_HEADER __ImageBase;
using namespace WEX::Logging;
@@ -19,7 +17,6 @@ class BufferTests
END_TEST_CLASS()
TEST_METHOD(TestSetConsoleActiveScreenBufferInvalid);
- TEST_METHOD(TestCookedReadBufferReferenceCount);
TEST_METHOD(TestCookedReadOnNonShareableScreenBuffer);
@@ -86,40 +83,6 @@ void BufferTests::TestCookedReadOnNonShareableScreenBuffer()
VERIFY_IS_TRUE(IsConsoleStillRunning());
}
-// This test ensures that COOKED_READ_DATA properly holds onto the screen buffer it is
-// reading from for the whole duration of the read. It's important that we hold a handle
-// to the main instead of the alt buffer even if this cooked read targets the latter,
-// because alt buffers are fake SCREEN_INFORMATION objects that are owned by the main buffer.
-void BufferTests::TestCookedReadBufferReferenceCount()
-{
- static constexpr int loops = 5;
-
- const auto in = GetStdInputHandle();
- const auto out = GetStdOutputHandle();
-
- DWORD inMode = 0;
- GetConsoleMode(out, &inMode);
- inMode |= ENABLE_VIRTUAL_TERMINAL_PROCESSING | ENABLE_PROCESSED_OUTPUT;
- SetConsoleMode(out, inMode);
-
- INPUT_RECORD newlines[loops];
- DWORD written = 0;
- std::fill_n(&newlines[0], loops, SynthesizeKeyEvent(true, 1, L'\r', 0, L'\r', 0));
- WriteConsoleInputW(in, &newlines[0], loops, &written);
-
- for (int i = 0; i < loops; ++i)
- {
- VERIFY_SUCCEEDED(WriteConsoleW(out, L"\033[?1049h", 8, nullptr, nullptr));
-
- wchar_t buffer[16];
- DWORD read = 0;
- VERIFY_SUCCEEDED(ReadConsoleW(in, &buffer[0], _countof(buffer), &read, nullptr));
- VERIFY_ARE_EQUAL(2u, read);
-
- VERIFY_SUCCEEDED(WriteConsoleW(out, L"\033[?1049l", 8, nullptr, nullptr));
- }
-}
-
void BufferTests::TestWritingInactiveScreenBuffer()
{
bool useVtOutput;
--- src/host/readDataCooked.cpp
@@ -62,11 +62,7 @@ COOKED_READ_DATA::COOKED_READ_DATA(_In_ InputBuffer* const pInputBuffer,
// We need to ensure that it stays alive for the duration of the read.
// Coincidentally this serves another important purpose: It checks whether we're allowed to read from
// the given buffer in the first place. If it's missing the FILE_SHARE_READ flag, we can't read from it.
- //
- // GH#16158: It's important that we hold a handle to the main instead of the alt buffer
- // even if this cooked read targets the latter, because alt buffers are fake
- // SCREEN_INFORMATION objects that are owned by the main buffer.
- THROW_IF_FAILED(_screenInfo.GetMainBuffer().AllocateIoHandle(ConsoleHandleData::HandleType::Output, GENERIC_WRITE, FILE_SHARE_READ | FILE_SHARE_WRITE, _tempHandle));
+ THROW_IF_FAILED(_screenInfo.AllocateIoHandle(ConsoleHandleData::HandleType::Output, GENERIC_WRITE, FILE_SHARE_READ | FILE_SHARE_WRITE, _tempHandle));
#endif
if (!initialData.empty())
|
terminal
|
microsoft
|
C++
|
C++
| 97,273
| 8,477
|
The new Windows Terminal and the original Windows console host, all in the same place!
|
microsoft_terminal
|
BUG_FIX
|
simplify decoder draining logic
|
1acb37939cbfadcc67db8d5ad06477b25d5448bd
|
2022-10-31 19:00:09
|
Tay Yang Shun
|
[portal][misc] refactor typeahead props
| false
| 73
| 72
| 145
|
--- apps/portal/src/components/shared/CitiesTypeahead.tsx
@@ -1,32 +1,28 @@
-import type { ComponentProps } from 'react';
import { useState } from 'react';
import type { TypeaheadOption } from '@tih/ui';
import { Typeahead } from '@tih/ui';
import { trpc } from '~/utils/trpc';
-type BaseProps = Pick<
- ComponentProps<typeof Typeahead>,
- | 'disabled'
- | 'errorMessage'
- | 'isLabelHidden'
- | 'placeholder'
- | 'required'
- | 'textSize'
->;
-
-type Props = BaseProps &
- Readonly<{
- label?: string;
- onSelect: (option: TypeaheadOption | null) => void;
- value?: TypeaheadOption | null;
- }>;
+type Props = Readonly<{
+ disabled?: boolean;
+ errorMessage?: string;
+ isLabelHidden?: boolean;
+ label?: string;
+ onSelect: (option: TypeaheadOption | null) => void;
+ placeholder?: string;
+ required?: boolean;
+ value?: TypeaheadOption | null;
+}>;
export default function CitiesTypeahead({
+ disabled,
label = 'City',
onSelect,
+ isLabelHidden,
+ placeholder,
+ required,
value,
- ...props
}: Props) {
const [query, setQuery] = useState('');
const cities = trpc.useQuery([
@@ -40,6 +36,8 @@ export default function CitiesTypeahead({
return (
<Typeahead
+ disabled={disabled}
+ isLabelHidden={isLabelHidden}
label={label}
noResultsMessage="No cities found"
nullable={true}
@@ -50,10 +48,12 @@ export default function CitiesTypeahead({
value: id,
})) ?? []
}
+ placeholder={placeholder}
+ required={required}
+ textSize="inherit"
value={value}
onQueryChange={setQuery}
onSelect={onSelect}
- {...props}
/>
);
}
--- apps/portal/src/components/shared/CompaniesTypeahead.tsx
@@ -1,30 +1,26 @@
-import type { ComponentProps } from 'react';
import { useState } from 'react';
import type { TypeaheadOption } from '@tih/ui';
import { Typeahead } from '@tih/ui';
import { trpc } from '~/utils/trpc';
-type BaseProps = Pick<
- ComponentProps<typeof Typeahead>,
- | 'disabled'
- | 'errorMessage'
- | 'isLabelHidden'
- | 'placeholder'
- | 'required'
- | 'textSize'
->;
-
-type Props = BaseProps &
- Readonly<{
- onSelect: (option: TypeaheadOption | null) => void;
- value?: TypeaheadOption | null;
- }>;
+type Props = Readonly<{
+ disabled?: boolean;
+ errorMessage?: string;
+ isLabelHidden?: boolean;
+ onSelect: (option: TypeaheadOption | null) => void;
+ placeholder?: string;
+ required?: boolean;
+ value?: TypeaheadOption | null;
+}>;
export default function CompaniesTypeahead({
+ disabled,
onSelect,
+ isLabelHidden,
+ placeholder,
+ required,
value,
- ...props
}: Props) {
const [query, setQuery] = useState('');
const companies = trpc.useQuery([
@@ -38,6 +34,8 @@ export default function CompaniesTypeahead({
return (
<Typeahead
+ disabled={disabled}
+ isLabelHidden={isLabelHidden}
label="Company"
noResultsMessage="No companies found"
nullable={true}
@@ -48,10 +46,12 @@ export default function CompaniesTypeahead({
value: id,
})) ?? []
}
+ placeholder={placeholder}
+ required={required}
+ textSize="inherit"
value={value}
onQueryChange={setQuery}
onSelect={onSelect}
- {...props}
/>
);
}
--- apps/portal/src/components/shared/CountriesTypeahead.tsx
@@ -1,30 +1,26 @@
-import type { ComponentProps } from 'react';
import { useState } from 'react';
import type { TypeaheadOption } from '@tih/ui';
import { Typeahead } from '@tih/ui';
import { trpc } from '~/utils/trpc';
-type BaseProps = Pick<
- ComponentProps<typeof Typeahead>,
- | 'disabled'
- | 'errorMessage'
- | 'isLabelHidden'
- | 'placeholder'
- | 'required'
- | 'textSize'
->;
-
-type Props = BaseProps &
- Readonly<{
- onSelect: (option: TypeaheadOption | null) => void;
- value?: TypeaheadOption | null;
- }>;
+type Props = Readonly<{
+ disabled?: boolean;
+ errorMessage?: string;
+ isLabelHidden?: boolean;
+ onSelect: (option: TypeaheadOption | null) => void;
+ placeholder?: string;
+ required?: boolean;
+ value?: TypeaheadOption | null;
+}>;
export default function CountriesTypeahead({
+ disabled,
onSelect,
+ isLabelHidden,
+ placeholder,
+ required,
value,
- ...props
}: Props) {
const [query, setQuery] = useState('');
const countries = trpc.useQuery([
@@ -38,6 +34,8 @@ export default function CountriesTypeahead({
return (
<Typeahead
+ disabled={disabled}
+ isLabelHidden={isLabelHidden}
label="Country"
noResultsMessage="No countries found"
nullable={true}
@@ -48,10 +46,12 @@ export default function CountriesTypeahead({
value: id,
})) ?? []
}
+ placeholder={placeholder}
+ required={required}
+ textSize="inherit"
value={value}
onQueryChange={setQuery}
onSelect={onSelect}
- {...props}
/>
);
}
--- apps/portal/src/components/shared/JobTitlesTypahead.tsx
@@ -1,30 +1,25 @@
-import type { ComponentProps } from 'react';
import { useState } from 'react';
import type { TypeaheadOption } from '@tih/ui';
import { Typeahead } from '@tih/ui';
import { JobTitleLabels } from './JobTitles';
-type BaseProps = Pick<
- ComponentProps<typeof Typeahead>,
- | 'disabled'
- | 'errorMessage'
- | 'isLabelHidden'
- | 'placeholder'
- | 'required'
- | 'textSize'
->;
-
-type Props = BaseProps &
- Readonly<{
- onSelect: (option: TypeaheadOption | null) => void;
- value?: TypeaheadOption | null;
- }>;
+type Props = Readonly<{
+ disabled?: boolean;
+ isLabelHidden?: boolean;
+ onSelect: (option: TypeaheadOption | null) => void;
+ placeholder?: string;
+ required?: boolean;
+ value?: TypeaheadOption | null;
+}>;
export default function JobTitlesTypeahead({
+ disabled,
onSelect,
+ isLabelHidden,
+ placeholder,
+ required,
value,
- ...props
}: Props) {
const [query, setQuery] = useState('');
const options = Object.entries(JobTitleLabels)
@@ -40,14 +35,18 @@ export default function JobTitlesTypeahead({
return (
<Typeahead
+ disabled={disabled}
+ isLabelHidden={isLabelHidden}
label="Job Title"
noResultsMessage="No available job titles."
nullable={true}
options={options}
+ placeholder={placeholder}
+ required={required}
+ textSize="inherit"
value={value}
onQueryChange={setQuery}
onSelect={onSelect}
- {...props}
/>
);
}
|
tech-interview-handbook
|
yangshun
|
TypeScript
|
TypeScript
| 122,353
| 15,039
|
💯 Curated coding interview preparation materials for busy software engineers
|
yangshun_tech-interview-handbook
|
CODE_IMPROVEMENT
|
Obvious
|
a8801428da25d681d85fd75f07779385f35bf46a
|
2024-06-29 02:42:33
|
Will Ceolin
|
Runtime: Use :conn for check_origin
| false
| 1
| 2
| 3
|
--- config/runtime.exs
@@ -93,10 +93,11 @@ if config_env() == :prod do
# Check `Plug.SSL` for all available options in `force_ssl`.
host = System.get_env("PHX_HOST")
port = String.to_integer(System.get_env("PORT") || "8080")
+ check_origin = ["https://*.#{System.get_env("PHX_HOST")}"]
config :zoonk, ZoonkWeb.Endpoint,
url: [host: host, port: 443, scheme: "https"],
- check_origin: :conn,
+ check_origin: check_origin,
http: [
# Enable IPv6 and bind on all interfaces.
# Set it to {0, 0, 0, 0, 0, 0, 0, 1} for local network only access.
|
uneebee
|
zoonk
|
Elixir
|
Elixir
| 1,339
| 83
|
Platform for creating interactive courses.
|
zoonk_uneebee
|
CONFIG_CHANGE
|
Obvious
|
c215c4d3d35d361f45bd057c2fc211783e60d73a
|
2025-03-20 00:23:53
|
Ilya Chernikov
|
K2 REPL: Generate correct properties from snippet variables Previously only jvm fields were generated and it broke the Kotlin reflection (see e.g. KT-75580). The commit also add a new infrastructure for detailed evaluation results inspection in the custom REPL tests, and a test for reflection access to the compiled and evaluated snippet. #KT-75580 fixed
| false
| 117
| 27
| 144
|
--- compiler/ir/backend.jvm/lower/src/org/jetbrains/kotlin/backend/jvm/lower/MoveCompanionObjectFieldsLowering.kt
@@ -32,7 +32,7 @@ import org.jetbrains.kotlin.resolve.deprecation.DeprecationResolver
@PhaseDescription(name = "MoveOrCopyCompanionObjectFields")
internal class MoveOrCopyCompanionObjectFieldsLowering(val context: JvmBackendContext) : ClassLoweringPass {
override fun lower(irClass: IrClass) {
- if (irClass.isNonCompanionObject && !irClass.isReplSnippet) {
+ if (irClass.isNonCompanionObject) {
irClass.handle()
} else {
(irClass.declarations.singleOrNull { it is IrClass && it.isCompanion } as IrClass?)?.handle()
--- compiler/ir/ir.tree/src/org/jetbrains/kotlin/ir/builders/declarations/declarationBuilders.kt
@@ -15,7 +15,6 @@ import org.jetbrains.kotlin.ir.declarations.impl.IrVariableImpl
import org.jetbrains.kotlin.ir.expressions.impl.IrGetFieldImpl
import org.jetbrains.kotlin.ir.expressions.impl.IrGetValueImpl
import org.jetbrains.kotlin.ir.expressions.impl.IrReturnImpl
-import org.jetbrains.kotlin.ir.expressions.impl.IrSetFieldImpl
import org.jetbrains.kotlin.ir.symbols.impl.*
import org.jetbrains.kotlin.ir.types.IrType
import org.jetbrains.kotlin.ir.util.copyTo
@@ -181,29 +180,6 @@ fun IrProperty.addDefaultGetter(parentClass: IrClass, builtIns: IrBuiltIns) {
}
}
-fun IrProperty.addDefaultSetter(parentClass: IrClass, builtIns: IrBuiltIns) {
- val field = backingField!!
- addSetter {
- origin = IrDeclarationOrigin.DEFAULT_PROPERTY_ACCESSOR
- visibility = this@addDefaultSetter.visibility
- returnType = builtIns.unitType
- }.also { setter ->
- setter.dispatchReceiverParameter = parentClass.thisReceiver!!.copyTo(setter)
- val irValueParameter = setter.addValueParameter("value", field.type)
- setter.body = factory.createBlockBody(
- UNDEFINED_OFFSET, UNDEFINED_OFFSET, listOf(
- IrSetFieldImpl(
- UNDEFINED_OFFSET, UNDEFINED_OFFSET,
- field.symbol,
- IrGetValueImpl(UNDEFINED_OFFSET, UNDEFINED_OFFSET, setter.dispatchReceiverParameter!!.type, setter.dispatchReceiverParameter!!.symbol),
- IrGetValueImpl(UNDEFINED_OFFSET, UNDEFINED_OFFSET, irValueParameter.type, irValueParameter.symbol),
- builtIns.unitType,
- )
- )
- )
- }
-}
-
inline fun IrProperty.addBackingField(builder: IrFieldBuilder.() -> Unit = {}): IrField =
IrFieldBuilder().run {
name = this@addBackingField.name
--- compiler/ir/ir.tree/src/org/jetbrains/kotlin/ir/util/IrUtils.kt
@@ -259,7 +259,6 @@ val IrClass.isClass get() = kind == ClassKind.CLASS
val IrClass.isObject get() = kind == ClassKind.OBJECT
val IrClass.isAnonymousObject get() = isClass && name == SpecialNames.NO_NAME_PROVIDED
val IrClass.isNonCompanionObject: Boolean get() = isObject && !isCompanion
-val IrClass.isReplSnippet: Boolean get() = origin == IrDeclarationOrigin.REPL_SNIPPET_CLASS
val IrDeclarationWithName.fqNameWhenAvailable: FqName?
get() {
--- plugins/scripting/scripting-compiler/src/org/jetbrains/kotlin/scripting/compiler/plugin/irLowerings/ReplSnippetLowering.kt
@@ -149,34 +149,23 @@ internal class ReplSnippetsToClassesLowering(val context: IrPluginContext) : Mod
} else {
when (statement) {
is IrVariable -> {
- irSnippetClass.addProperty {
- updateFrom(statement)
+ irSnippetClass.addField {
+ startOffset = statement.startOffset
+ endOffset = statement.endOffset
name = statement.name
- }.also { property ->
- property.backingField = context.irFactory.buildField {
- updateFrom(statement)
- origin = IrDeclarationOrigin.PROPERTY_BACKING_FIELD
- name = statement.name
- type = statement.type
- }.also { field ->
- statement.initializer?.let { initializer ->
- +IrSetFieldImpl(
- initializer.startOffset,
- initializer.endOffset,
- field.symbol,
- irGet(irSnippetClassThisReceiver),
- initializer,
- this.context.irBuiltIns.unitType
- )
- }
- field.parent = irSnippetClass
- field.annotations += statement.annotations
- valsToFields[statement.symbol] = field.symbol
- }
- property.addDefaultGetter(irSnippetClass, context.irBuiltIns)
- if (statement.isVar) {
- property.addDefaultSetter(irSnippetClass, context.irBuiltIns)
+ type = statement.type
+ }.also { field ->
+ statement.initializer?.let { initializer ->
+ +IrSetFieldImpl(
+ initializer.startOffset,
+ initializer.endOffset,
+ field.symbol,
+ irGet(irSnippetClassThisReceiver),
+ initializer,
+ this.context.irBuiltIns.unitType
+ )
}
+ valsToFields[statement.symbol] = field.symbol
}
}
is IrProperty,
--- plugins/scripting/scripting-compiler/tests/org/jetbrains/kotlin/scripting/compiler/test/CustomK2ReplTest.kt
@@ -11,8 +11,6 @@ import org.jetbrains.kotlin.scripting.compiler.plugin.impl.K2ReplCompiler
import org.jetbrains.kotlin.scripting.compiler.plugin.impl.K2ReplEvaluator
import org.jetbrains.kotlin.scripting.compiler.plugin.impl.withMessageCollectorAndDisposable
import java.io.File
-import kotlin.reflect.full.declaredMemberFunctions
-import kotlin.reflect.full.declaredMemberProperties
import kotlin.script.experimental.api.*
import kotlin.script.experimental.host.toScriptSource
import kotlin.script.experimental.impl.internalScriptingRunSuspend
@@ -25,7 +23,6 @@ class ReplReceiver1 {
val ok = "OK"
}
-@Suppress("unused") // Used in snippets
class TestReplReceiver1() { fun checkReceiver(block: ReplReceiver1.() -> Any) = block(ReplReceiver1()) }
@@ -33,7 +30,7 @@ class CustomK2ReplTest {
@Test
fun testSimple() {
- evalAndCheckSnippetsResultVals(
+ evalAndCheckSnippets(
sequenceOf(
"val x = 3",
"x + 4",
@@ -87,7 +84,7 @@ class CustomK2ReplTest {
@Test
fun testWithReceiverExtension() {
- evalAndCheckSnippetsResultVals(
+ evalAndCheckSnippets(
sequenceOf(
"val obj = org.jetbrains.kotlin.scripting.compiler.test.TestReplReceiver1()",
"obj.checkReceiver { ok }",
@@ -98,7 +95,7 @@ class CustomK2ReplTest {
@Test
fun testWithUpdatingDefaultImports() {
- evalAndCheckSnippetsResultVals(
+ evalAndCheckSnippets(
sequenceOf(
"kotlin.random.Random.nextInt(10)/10",
"Random.nextInt(10)/10",
@@ -117,39 +114,6 @@ class CustomK2ReplTest {
}
)
}
-
- @Test
- fun testBasicReflection() {
- evalAndCheckSnippets(
- sequenceOf(
- "var x = 3",
- "fun f() = x"
- ),
- baseCompilationConfiguration,
- baseEvaluationConfiguration,
- {
- it.onSuccess { s ->
- s.get().result.let { r ->
- @Suppress("UNCHECKED_CAST") val propx = r.scriptClass!!.declaredMemberProperties.first() as kotlin.reflect.KMutableProperty1<Any, Int>
- val x = propx.get(r.scriptInstance!!)
- assertEquals(3, x)
- propx.set(r.scriptInstance!!, 5)
- }
- it
- }
- },
- {
- it.onSuccess { s ->
- s.get().result.let { r ->
- val funf = r.scriptClass!!.declaredMemberFunctions.first()
- val fret = funf.call() as Int
- assertEquals(5, fret)
- }
- it
- }
- }
- )
- }
}
private val baseCompilationConfiguration: ScriptCompilationConfiguration =
@@ -162,30 +126,27 @@ private val baseCompilationConfiguration: ScriptCompilationConfiguration =
private val baseEvaluationConfiguration: ScriptEvaluationConfiguration = ScriptEvaluationConfiguration {}
-private fun compileEvalAndCheckSnippetsSequence(
+private fun compileAndEvalSnippets(
snippets: Sequence<String>,
compilationConfiguration: ScriptCompilationConfiguration,
- evaluationConfiguration: ScriptEvaluationConfiguration,
- expectedResultCheckers: Sequence<(ResultWithDiagnostics<LinkedSnippet<KJvmEvaluatedSnippet>>) -> ResultWithDiagnostics<LinkedSnippet<KJvmEvaluatedSnippet>>>
+ evaluationConfiguration: ScriptEvaluationConfiguration
): ResultWithDiagnostics<List<LinkedSnippet<KJvmEvaluatedSnippet>>> =
withMessageCollectorAndDisposable { messageCollector, disposable ->
val compiler = K2ReplCompiler(K2ReplCompiler.createCompilationState(messageCollector, disposable, compilationConfiguration))
val evaluator = K2ReplEvaluator()
val filenameExtension = compilationConfiguration[ScriptCompilationConfiguration.fileExtension] ?: "repl.kts"
var snippetNo = 1
- val checkersIterator = expectedResultCheckers.iterator()
@Suppress("DEPRECATION_ERROR")
internalScriptingRunSuspend {
snippets.asIterable().mapSuccess { snippet ->
- val checker = if (checkersIterator.hasNext()) checkersIterator.next() else null
compiler.compile(snippet.toScriptSource("s${snippetNo++}.$filenameExtension")).onSuccess {
- evaluator.eval(it, evaluationConfiguration).let { checker?.invoke(it) ?: it }
+ evaluator.eval(it, evaluationConfiguration)
}
}
}
}
-private fun checkEvaluatedSnippetsResultVals(
+private fun checkEvaluatedSnippets(
expectedResultVals: Sequence<Any?>,
evaluationResults: ResultWithDiagnostics<List<LinkedSnippet<KJvmEvaluatedSnippet>>>
) {
@@ -195,7 +156,6 @@ private fun checkEvaluatedSnippetsResultVals(
return
}
for (res in successResults) {
- if (!expectedIter.hasNext()) break
val expectedVal = expectedIter.next()
when (val resVal = res.get().result) {
is ResultValue.Unit -> assertTrue(expectedVal == null, "Unexpected evaluation result: Unit")
@@ -206,7 +166,7 @@ private fun checkEvaluatedSnippetsResultVals(
}
}
-private fun evalAndCheckSnippetsResultVals(
+private fun evalAndCheckSnippets(
snippets: Sequence<String>,
expectedResultVals: Sequence<Any?>,
compilationConfiguration: ScriptCompilationConfiguration = baseCompilationConfiguration,
@@ -215,22 +175,8 @@ private fun evalAndCheckSnippetsResultVals(
// this is K2-only tests
if (System.getProperty(SCRIPT_TEST_BASE_COMPILER_ARGUMENTS_PROPERTY)?.contains("-language-version 1.9") == true) return
- val evaluationResults = compileEvalAndCheckSnippetsSequence(snippets, compilationConfiguration, evaluationConfiguration, emptySequence())
- checkEvaluatedSnippetsResultVals(expectedResultVals, evaluationResults)
-}
-
-private fun evalAndCheckSnippets(
- snippets: Sequence<String>,
- compilationConfiguration: ScriptCompilationConfiguration,
- evaluationConfiguration: ScriptEvaluationConfiguration,
- vararg resultCheckers: (ResultWithDiagnostics<LinkedSnippet<KJvmEvaluatedSnippet>>) -> ResultWithDiagnostics<LinkedSnippet<KJvmEvaluatedSnippet>>
-) {
- // this is K2-only tests
- if (System.getProperty(SCRIPT_TEST_BASE_COMPILER_ARGUMENTS_PROPERTY)?.contains("-language-version 1.9") == true) return
-
- val results =
- compileEvalAndCheckSnippetsSequence(snippets, compilationConfiguration, evaluationConfiguration, resultCheckers.asSequence())
- checkEvaluatedSnippetsResultVals(emptySequence(), results)
+ val evaluationResults = compileAndEvalSnippets(snippets, compilationConfiguration, evaluationConfiguration)
+ checkEvaluatedSnippets(expectedResultVals, evaluationResults)
}
private fun evalAndCheckSnippetsWithReplReceiver1(
@@ -239,7 +185,7 @@ private fun evalAndCheckSnippetsWithReplReceiver1(
compilationConfiguration: ScriptCompilationConfiguration = baseCompilationConfiguration,
evaluationConfiguration: ScriptEvaluationConfiguration = baseEvaluationConfiguration
) {
- evalAndCheckSnippetsResultVals(
+ evalAndCheckSnippets(
snippets, expectedResultVals,
compilationConfiguration.with {
implicitReceivers(ReplReceiver1::class)
|
kotlin
|
jetbrains
|
Kotlin
|
Kotlin
| 50,115
| 5,861
|
The Kotlin Programming Language.
|
jetbrains_kotlin
|
BUG_FIX
|
fixed the field generation that broken Kotlin reflection
|
7c51e14c4004911ac17f8c230ca761c1206d82f2
|
2025-01-17 04:05:24
|
Costa Tsaousis
|
PULSE: network traffic (#19419) * pulse now tracks network traffic for web server, statsd, streaming and aclk * show gaps on the network traffic chart when aclk is not connected * fix contexts shutdown * log nodes info every 10 seconds
| false
| 273
| 54
| 327
|
--- CMakeLists.txt
@@ -1168,8 +1168,6 @@ set(DAEMON_FILES
src/daemon/config/netdata-conf-profile.c
src/daemon/config/netdata-conf-profile.h
src/daemon/pulse/pulse-daemon-memory-system.c
- src/daemon/pulse/pulse-network.c
- src/daemon/pulse/pulse-network.h
)
set(H2O_FILES
--- src/aclk/aclk.c
@@ -24,17 +24,12 @@ int aclk_pubacks_per_conn = 0; // How many PubAcks we got since MQTT conn est.
int aclk_rcvd_cloud_msgs = 0;
int aclk_connection_counter = 0;
-mqtt_wss_client mqttwss_client;
-
static bool aclk_connected = false;
static inline void aclk_set_connected(void) {
__atomic_store_n(&aclk_connected, true, __ATOMIC_RELAXED);
}
static inline void aclk_set_disconnected(void) {
__atomic_store_n(&aclk_connected, false, __ATOMIC_RELAXED);
-
- if(mqttwss_client)
- mqtt_wss_reset_stats(mqttwss_client);
}
inline bool aclk_online(void) {
@@ -69,12 +64,7 @@ float last_backoff_value = 0;
time_t aclk_block_until = 0;
-struct mqtt_wss_stats aclk_statistics(void) {
- if(mqttwss_client)
- return mqtt_wss_get_stats(mqttwss_client);
- else
- return (struct mqtt_wss_stats) { 0 };
-}
+mqtt_wss_client mqttwss_client;
struct aclk_shared_state aclk_shared_state = {
.mqtt_shutdown_msg_id = -1,
--- src/aclk/aclk.h
@@ -94,6 +94,4 @@ char *aclk_state_json(void);
void add_aclk_host_labels(void);
void aclk_queue_node_info(RRDHOST *host, bool immediate);
-struct mqtt_wss_stats aclk_statistics(void);
-
#endif /* ACLK_H */
--- src/aclk/mqtt_websockets/mqtt_wss_client.c
@@ -989,18 +989,12 @@ struct mqtt_wss_stats mqtt_wss_get_stats(mqtt_wss_client client)
struct mqtt_wss_stats current;
spinlock_lock(&client->stat_lock);
current = client->stats;
+ memset(&client->stats, 0, sizeof(client->stats));
spinlock_unlock(&client->stat_lock);
mqtt_ng_get_stats(client->mqtt, &current.mqtt);
return current;
}
-void mqtt_wss_reset_stats(mqtt_wss_client client)
-{
- spinlock_lock(&client->stat_lock);
- memset(&client->stats, 0, sizeof(client->stats));
- spinlock_unlock(&client->stat_lock);
-}
-
int mqtt_wss_set_topic_alias(mqtt_wss_client client, const char *topic)
{
return mqtt_ng_set_topic_alias(client->mqtt, topic);
--- src/aclk/mqtt_websockets/mqtt_wss_client.h
@@ -141,7 +141,6 @@ struct mqtt_wss_stats {
};
struct mqtt_wss_stats mqtt_wss_get_stats(mqtt_wss_client client);
-void mqtt_wss_reset_stats(mqtt_wss_client client);
#ifdef MQTT_WSS_DEBUG
#include <openssl/ssl.h>
--- src/collectors/statsd.plugin/statsd.c
@@ -965,8 +965,6 @@ static int statsd_rcv_callback(POLLINFO *pi, nd_poll_event_t *events) {
d->len += rc;
statsd.tcp_socket_reads++;
statsd.tcp_bytes_read += rc;
-
- pulse_statsd_received_bytes(rc);
}
if(likely(d->len > 0)) {
@@ -1018,15 +1016,12 @@ static int statsd_rcv_callback(POLLINFO *pi, nd_poll_event_t *events) {
statsd.udp_socket_reads++;
statsd.udp_packets_received += rc;
- size_t i, total_size = 0;
+ size_t i;
for (i = 0; i < (size_t)rc; ++i) {
size_t len = (size_t)d->msgs[i].msg_len;
statsd.udp_bytes_read += len;
- total_size += len;
statsd_process(d->msgs[i].msg_hdr.msg_iov->iov_base, len, 0);
}
-
- pulse_statsd_received_bytes(total_size);
}
} while (rc != -1);
@@ -1048,8 +1043,6 @@ static int statsd_rcv_callback(POLLINFO *pi, nd_poll_event_t *events) {
statsd.udp_packets_received++;
statsd.udp_bytes_read += rc;
statsd_process(d->buffer, (size_t) rc, 0);
-
- pulse_statsd_received_bytes(rc);
}
} while (rc != -1);
#endif
--- src/daemon/pulse/pulse-http-api.c
@@ -13,6 +13,8 @@ static struct web_statistics {
PAD64(uint64_t) web_requests;
PAD64(uint64_t) web_usec;
PAD64(uint64_t) web_usec_max;
+ PAD64(uint64_t) bytes_received;
+ PAD64(uint64_t) bytes_sent;
PAD64(uint64_t) content_size_uncompressed;
PAD64(uint64_t) content_size_compressed;
@@ -27,8 +29,8 @@ void pulse_web_client_disconnected(void) {
}
void pulse_web_request_completed(uint64_t dt,
- uint64_t bytes_received __maybe_unused,
- uint64_t bytes_sent __maybe_unused,
+ uint64_t bytes_received,
+ uint64_t bytes_sent,
uint64_t content_size,
uint64_t compressed_content_size) {
uint64_t old_web_usec_max = live_stats.web_usec_max;
@@ -37,8 +39,8 @@ void pulse_web_request_completed(uint64_t dt,
__atomic_fetch_add(&live_stats.web_requests, 1, __ATOMIC_RELAXED);
__atomic_fetch_add(&live_stats.web_usec, dt, __ATOMIC_RELAXED);
-// __atomic_fetch_add(&live_stats.bytes_received, bytes_received, __ATOMIC_RELAXED);
-// __atomic_fetch_add(&live_stats.bytes_sent, bytes_sent, __ATOMIC_RELAXED);
+ __atomic_fetch_add(&live_stats.bytes_received, bytes_received, __ATOMIC_RELAXED);
+ __atomic_fetch_add(&live_stats.bytes_sent, bytes_sent, __ATOMIC_RELAXED);
__atomic_fetch_add(&live_stats.content_size_uncompressed, content_size, __ATOMIC_RELAXED);
__atomic_fetch_add(&live_stats.content_size_compressed, compressed_content_size, __ATOMIC_RELAXED);
}
@@ -48,8 +50,8 @@ static inline void pulse_web_copy(struct web_statistics *gs, uint8_t options) {
gs->web_requests = __atomic_load_n(&live_stats.web_requests, __ATOMIC_RELAXED);
gs->web_usec = __atomic_load_n(&live_stats.web_usec, __ATOMIC_RELAXED);
gs->web_usec_max = __atomic_load_n(&live_stats.web_usec_max, __ATOMIC_RELAXED);
-// gs->bytes_received = __atomic_load_n(&live_stats.bytes_received, __ATOMIC_RELAXED);
-// gs->bytes_sent = __atomic_load_n(&live_stats.bytes_sent, __ATOMIC_RELAXED);
+ gs->bytes_received = __atomic_load_n(&live_stats.bytes_received, __ATOMIC_RELAXED);
+ gs->bytes_sent = __atomic_load_n(&live_stats.bytes_sent, __ATOMIC_RELAXED);
gs->content_size_uncompressed = __atomic_load_n(&live_stats.content_size_uncompressed, __ATOMIC_RELAXED);
gs->content_size_compressed = __atomic_load_n(&live_stats.content_size_compressed, __ATOMIC_RELAXED);
@@ -123,6 +125,38 @@ void pulse_web_do(bool extended) {
// ----------------------------------------------------------------
+ {
+ static RRDSET *st_bytes = NULL;
+ static RRDDIM *rd_in = NULL,
+ *rd_out = NULL;
+
+ if (unlikely(!st_bytes)) {
+ st_bytes = rrdset_create_localhost(
+ "netdata"
+ , "net"
+ , NULL
+ , "HTTP API"
+ , "netdata.http_api_traffic"
+ , "Netdata Web API Network Traffic"
+ , "kilobits/s"
+ , "netdata"
+ , "pulse"
+ , 130400
+ , localhost->rrd_update_every
+ , RRDSET_TYPE_AREA
+ );
+
+ rd_in = rrddim_add(st_bytes, "in", NULL, 8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
+ rd_out = rrddim_add(st_bytes, "out", NULL, -8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
+ }
+
+ rrddim_set_by_pointer(st_bytes, rd_in, (collected_number) gs.bytes_received);
+ rrddim_set_by_pointer(st_bytes, rd_out, (collected_number) gs.bytes_sent);
+ rrdset_done(st_bytes);
+ }
+
+ // ----------------------------------------------------------------
+
{
static unsigned long long old_web_requests = 0, old_web_usec = 0;
static collected_number average_response_time = -1;
--- src/daemon/pulse/pulse-network.c
@@ -1,189 +0,0 @@
-// SPDX-License-Identifier: GPL-3.0-or-later
-
-#define PULSE_INTERNALS 1
-#include "pulse-network.h"
-
-#define PULSE_NETWORK_CHART_TITLE "Netdata Network Traffic"
-#define PULSE_NETWORK_CHART_FAMILY "Network Traffic"
-#define PULSE_NETWORK_CHART_CONTEXT "netdata.network"
-#define PULSE_NETWORK_CHART_UNITS "kilobits/s"
-#define PULSE_NETWORK_CHART_PRIORITY 130150
-
-static struct network_statistics {
- bool extended;
- PAD64(uint64_t) api_bytes_received;
- PAD64(uint64_t) api_bytes_sent;
- PAD64(uint64_t) statsd_bytes_received;
- PAD64(uint64_t) statsd_bytes_sent;
- PAD64(uint64_t) stream_bytes_received;
- PAD64(uint64_t) stream_bytes_sent;
-} live_stats = { 0 };
-
-void pulse_web_server_received_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.api_bytes_received, bytes, __ATOMIC_RELAXED);
-}
-
-void pulse_web_server_sent_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.api_bytes_sent, bytes, __ATOMIC_RELAXED);
-}
-
-void pulse_statsd_received_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.statsd_bytes_received, bytes, __ATOMIC_RELAXED);
-}
-
-void pulse_statsd_sent_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.statsd_bytes_sent, bytes, __ATOMIC_RELAXED);
-}
-
-void pulse_stream_received_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.stream_bytes_received, bytes, __ATOMIC_RELAXED);
-}
-
-void pulse_stream_sent_bytes(size_t bytes) {
- __atomic_add_fetch(&live_stats.stream_bytes_sent, bytes, __ATOMIC_RELAXED);
-}
-
-static inline void pulse_network_copy(struct network_statistics *gs) {
- gs->api_bytes_received = __atomic_load_n(&live_stats.api_bytes_received, __ATOMIC_RELAXED);
- gs->api_bytes_sent = __atomic_load_n(&live_stats.api_bytes_sent, __ATOMIC_RELAXED);
-
- gs->statsd_bytes_received = __atomic_load_n(&live_stats.statsd_bytes_received, __ATOMIC_RELAXED);
- gs->statsd_bytes_sent = __atomic_load_n(&live_stats.statsd_bytes_sent, __ATOMIC_RELAXED);
-
- gs->stream_bytes_received = __atomic_load_n(&live_stats.stream_bytes_received, __ATOMIC_RELAXED);
- gs->stream_bytes_sent = __atomic_load_n(&live_stats.stream_bytes_sent, __ATOMIC_RELAXED);
-}
-
-void pulse_network_do(bool extended __maybe_unused) {
- static struct network_statistics gs;
- pulse_network_copy(&gs);
-
- if(gs.api_bytes_received || gs.api_bytes_sent) {
- static RRDSET *st_bytes = NULL;
- static RRDDIM *rd_in = NULL,
- *rd_out = NULL;
-
- if (unlikely(!st_bytes)) {
- st_bytes = rrdset_create_localhost(
- "netdata"
- , "network_api"
- , NULL
- , PULSE_NETWORK_CHART_FAMILY
- , PULSE_NETWORK_CHART_CONTEXT
- , PULSE_NETWORK_CHART_TITLE
- , PULSE_NETWORK_CHART_UNITS
- , "netdata"
- , "pulse"
- , PULSE_NETWORK_CHART_PRIORITY
- , localhost->rrd_update_every
- , RRDSET_TYPE_AREA
- );
-
- rrdlabels_add(st_bytes->rrdlabels, "endpoint", "web-server", RRDLABEL_SRC_AUTO);
-
- rd_in = rrddim_add(st_bytes, "in", NULL, 8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- rd_out = rrddim_add(st_bytes, "out", NULL, -8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- }
-
- rrddim_set_by_pointer(st_bytes, rd_in, (collected_number) gs.api_bytes_received);
- rrddim_set_by_pointer(st_bytes, rd_out, (collected_number) gs.api_bytes_sent);
- rrdset_done(st_bytes);
- }
-
- if(gs.statsd_bytes_received || gs.statsd_bytes_sent) {
- static RRDSET *st_bytes = NULL;
- static RRDDIM *rd_in = NULL,
- *rd_out = NULL;
-
- if (unlikely(!st_bytes)) {
- st_bytes = rrdset_create_localhost(
- "netdata"
- , "network_statsd"
- , NULL
- , PULSE_NETWORK_CHART_FAMILY
- , PULSE_NETWORK_CHART_CONTEXT
- , PULSE_NETWORK_CHART_TITLE
- , PULSE_NETWORK_CHART_UNITS
- , "netdata"
- , "pulse"
- , PULSE_NETWORK_CHART_PRIORITY
- , localhost->rrd_update_every
- , RRDSET_TYPE_AREA
- );
-
- rrdlabels_add(st_bytes->rrdlabels, "endpoint", "statsd", RRDLABEL_SRC_AUTO);
-
- rd_in = rrddim_add(st_bytes, "in", NULL, 8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- rd_out = rrddim_add(st_bytes, "out", NULL, -8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- }
-
- rrddim_set_by_pointer(st_bytes, rd_in, (collected_number) gs.statsd_bytes_received);
- rrddim_set_by_pointer(st_bytes, rd_out, (collected_number) gs.statsd_bytes_sent);
- rrdset_done(st_bytes);
- }
-
- if(gs.stream_bytes_received || gs.stream_bytes_sent) {
- static RRDSET *st_bytes = NULL;
- static RRDDIM *rd_in = NULL,
- *rd_out = NULL;
-
- if (unlikely(!st_bytes)) {
- st_bytes = rrdset_create_localhost(
- "netdata"
- , "network_streaming"
- , NULL
- , PULSE_NETWORK_CHART_FAMILY
- , PULSE_NETWORK_CHART_CONTEXT
- , PULSE_NETWORK_CHART_TITLE
- , PULSE_NETWORK_CHART_UNITS
- , "netdata"
- , "pulse"
- , PULSE_NETWORK_CHART_PRIORITY
- , localhost->rrd_update_every
- , RRDSET_TYPE_AREA
- );
-
- rrdlabels_add(st_bytes->rrdlabels, "endpoint", "streaming", RRDLABEL_SRC_AUTO);
-
- rd_in = rrddim_add(st_bytes, "in", NULL, 8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- rd_out = rrddim_add(st_bytes, "out", NULL, -8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- }
-
- rrddim_set_by_pointer(st_bytes, rd_in, (collected_number) gs.stream_bytes_received);
- rrddim_set_by_pointer(st_bytes, rd_out, (collected_number) gs.stream_bytes_sent);
- rrdset_done(st_bytes);
- }
-
- if(aclk_online()) {
- struct mqtt_wss_stats t = aclk_statistics();
- if (t.bytes_rx || t.bytes_tx) {
- static RRDSET *st_bytes = NULL;
- static RRDDIM *rd_in = NULL, *rd_out = NULL;
-
- if (unlikely(!st_bytes)) {
- st_bytes = rrdset_create_localhost(
- "netdata",
- "network_aclk",
- NULL,
- PULSE_NETWORK_CHART_FAMILY,
- PULSE_NETWORK_CHART_CONTEXT,
- PULSE_NETWORK_CHART_TITLE,
- PULSE_NETWORK_CHART_UNITS,
- "netdata",
- "pulse",
- PULSE_NETWORK_CHART_PRIORITY,
- localhost->rrd_update_every,
- RRDSET_TYPE_AREA);
-
- rrdlabels_add(st_bytes->rrdlabels, "endpoint", "aclk", RRDLABEL_SRC_AUTO);
-
- rd_in = rrddim_add(st_bytes, "in", NULL, 8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- rd_out = rrddim_add(st_bytes, "out", NULL, -8, BITS_IN_A_KILOBIT, RRD_ALGORITHM_INCREMENTAL);
- }
-
- rrddim_set_by_pointer(st_bytes, rd_in, (collected_number)t.bytes_rx);
- rrddim_set_by_pointer(st_bytes, rd_out, (collected_number)t.bytes_tx);
- rrdset_done(st_bytes);
- }
- }
-}
--- src/daemon/pulse/pulse-network.h
@@ -1,19 +0,0 @@
-// SPDX-License-Identifier: GPL-3.0-or-later
-
-#ifndef NETDATA_PULSE_NETWORK_H
-#define NETDATA_PULSE_NETWORK_H
-
-#include "daemon/common.h"
-
-void pulse_network_do(bool extended);
-
-void pulse_web_server_received_bytes(size_t bytes);
-void pulse_web_server_sent_bytes(size_t bytes);
-
-void pulse_statsd_received_bytes(size_t bytes);
-void pulse_statsd_sent_bytes(size_t bytes);
-
-void pulse_stream_received_bytes(size_t bytes);
-void pulse_stream_sent_bytes(size_t bytes);
-
-#endif //NETDATA_PULSE_NETWORK_H
--- src/daemon/pulse/pulse.c
@@ -18,9 +18,8 @@
#define WORKER_JOB_MALLOC_TRACE 12
#define WORKER_JOB_REGISTRY 13
#define WORKER_JOB_ARAL 14
-#define WORKER_JOB_NETWORK 15
-#if WORKER_UTILIZATION_MAX_JOB_TYPES < 16
+#if WORKER_UTILIZATION_MAX_JOB_TYPES < 15
#error "WORKER_UTILIZATION_MAX_JOB_TYPES has to be at least 14"
#endif
@@ -45,7 +44,6 @@ static void pulse_register_workers(void) {
worker_register_job_name(WORKER_JOB_MALLOC_TRACE, "malloc_trace");
worker_register_job_name(WORKER_JOB_REGISTRY, "registry");
worker_register_job_name(WORKER_JOB_ARAL, "aral");
- worker_register_job_name(WORKER_JOB_NETWORK, "network");
}
static void pulse_cleanup(void *pptr)
@@ -101,9 +99,6 @@ void *pulse_thread_main(void *ptr) {
worker_is_busy(WORKER_JOB_QUERIES);
pulse_queries_do(pulse_extended_enabled);
- worker_is_busy(WORKER_JOB_NETWORK);
- pulse_network_do(pulse_extended_enabled);
-
worker_is_busy(WORKER_JOB_ML);
pulse_ml_do(pulse_extended_enabled);
--- src/daemon/pulse/pulse.h
@@ -24,7 +24,6 @@ extern bool pulse_extended_enabled;
#include "pulse-workers.h"
#include "pulse-trace-allocations.h"
#include "pulse-aral.h"
-#include "pulse-network.h"
void *pulse_thread_main(void *ptr);
void *pulse_thread_sqlite3_main(void *ptr);
--- src/database/contexts/worker.c
@@ -407,7 +407,7 @@ static void rrdinstance_post_process_updates(RRDINSTANCE *ri, bool force, RRD_FL
if(dictionary_entries(ri->rrdmetrics) > 0) {
RRDMETRIC *rm;
dfe_start_read((DICTIONARY *)ri->rrdmetrics, rm) {
- if(unlikely(worker_jobs && !service_running(SERVICE_CONTEXT))) break;
+ if(unlikely(!service_running(SERVICE_CONTEXT))) break;
RRD_FLAGS reason_to_pass = reason;
if(rrd_flag_check(ri, RRD_FLAG_UPDATE_REASON_UPDATE_RETENTION))
@@ -516,7 +516,7 @@ static void rrdcontext_post_process_updates(RRDCONTEXT *rc, bool force, RRD_FLAG
if(dictionary_entries(rc->rrdinstances) > 0) {
RRDINSTANCE *ri;
dfe_start_reentrant(rc->rrdinstances, ri) {
- if(unlikely(worker_jobs && !service_running(SERVICE_CONTEXT))) break;
+ if(unlikely(!service_running(SERVICE_CONTEXT))) break;
RRD_FLAGS reason_to_pass = reason;
if(rrd_flag_check(rc, RRD_FLAG_UPDATE_REASON_UPDATE_RETENTION))
@@ -703,7 +703,7 @@ static void rrdcontext_dequeue_from_post_processing(RRDCONTEXT *rc) {
void rrdcontext_initial_processing_after_loading(RRDCONTEXT *rc) {
rrdcontext_dequeue_from_post_processing(rc);
- rrdcontext_post_process_updates(rc, false, RRD_FLAG_NONE, false);
+ rrdcontext_post_process_updates(rc, false, RRD_FLAG_NONE, true);
}
void rrdcontext_delete_after_loading(RRDHOST *host, RRDCONTEXT *rc) {
--- src/database/sqlite/sqlite_aclk_node.c
@@ -217,7 +217,7 @@ void aclk_check_node_info_and_collectors(void)
context_pp_post = "')";
}
- nd_log_limit_static_global_var(erl, 10, 100 * USEC_PER_MS);
+ nd_log_limit_static_thread_var(erl, 10, 100 * USEC_PER_MS);
nd_log_limit(&erl, NDLS_DAEMON, NDLP_INFO,
"NODES INFO: %zu nodes loading contexts%s%s%s, %zu receiving replication%s%s%s, %zu sending replication%s%s%s, %zu pending context post processing%s%s%s%s",
context_loading, context_loading_pre, context_loading_body, context_loading_post,
--- src/streaming/stream-receiver.c
@@ -167,7 +167,6 @@ static inline ssize_t receiver_read_uncompressed(struct receiver_state *r) {
r->thread.uncompressed.read_len += bytes;
r->thread.uncompressed.read_buffer[r->thread.uncompressed.read_len] = '\0';
- pulse_stream_received_bytes(bytes);
}
return bytes;
@@ -277,16 +276,15 @@ static inline ssize_t receiver_read_compressed(struct receiver_state *r) {
internal_fatal(r->thread.uncompressed.read_buffer[r->thread.uncompressed.read_len] != '\0',
"%s: read_buffer does not start with zero #2", __FUNCTION__ );
- ssize_t bytes = read_stream(r, r->thread.compressed.buf + r->thread.compressed.used,
+ ssize_t bytes_read = read_stream(r, r->thread.compressed.buf + r->thread.compressed.used,
r->thread.compressed.size - r->thread.compressed.used);
- if(bytes > 0) {
- r->thread.compressed.used += bytes;
- worker_set_metric(WORKER_RECEIVER_JOB_BYTES_READ, (NETDATA_DOUBLE)bytes);
- pulse_stream_received_bytes(bytes);
+ if(bytes_read > 0) {
+ r->thread.compressed.used += bytes_read;
+ worker_set_metric(WORKER_RECEIVER_JOB_BYTES_READ, (NETDATA_DOUBLE)bytes_read);
}
- return bytes;
+ return bytes_read;
}
// --------------------------------------------------------------------------------------------------------------------
@@ -699,7 +697,6 @@ bool stream_receiver_send_data(struct stream_thread *sth, struct receiver_state
ssize_t rc = write_stream(rpt, chunk, outstanding);
if (likely(rc > 0)) {
- pulse_stream_sent_bytes(rc);
rpt->thread.last_traffic_ut = now_ut;
stream_circular_buffer_del_unsafe(scb, rc, now_ut);
if (!stats->bytes_outstanding) {
--- src/streaming/stream-sender.c
@@ -622,7 +622,6 @@ bool stream_sender_send_data(struct stream_thread *sth, struct sender_state *s,
ssize_t rc = nd_sock_send_nowait(&s->sock, chunk, outstanding);
if (likely(rc > 0)) {
- pulse_stream_sent_bytes(rc);
stream_circular_buffer_del_unsafe(s->scb, rc, now_ut);
replication_sender_recalculate_buffer_used_ratio_unsafe(s);
s->thread.last_traffic_ut = now_ut;
@@ -713,7 +712,6 @@ bool stream_sender_receive_data(struct stream_thread *sth, struct sender_state *
s->thread.last_traffic_ut = now_ut;
sth->snd.bytes_received += rc;
- pulse_stream_received_bytes(rc);
worker_is_busy(WORKER_SENDER_JOB_EXECUTE);
stream_sender_execute_commands(s);
--- src/web/server/static/static-threaded.c
@@ -179,17 +179,13 @@ static int web_server_rcv_callback(POLLINFO *pi, nd_poll_event_t *events) {
bytes = web_client_receive(w);
if (likely(bytes > 0)) {
- pulse_web_server_received_bytes(bytes);
-
netdata_log_debug(D_WEB_CLIENT, "%llu: processing received data on fd %d.", w->id, fd);
worker_is_idle();
worker_is_busy(WORKER_JOB_PROCESS);
web_client_process_request_from_web_server(w);
if (unlikely(w->mode == HTTP_REQUEST_MODE_STREAM)) {
- ssize_t rc = web_client_send(w);
- if(rc > 0)
- pulse_web_server_sent_bytes(rc);
+ web_client_send(w);
}
else if(unlikely(w->fd == fd && web_client_has_wait_receive(w)))
*events |= ND_POLL_READ;
@@ -233,8 +229,6 @@ static int web_server_snd_callback(POLLINFO *pi, nd_poll_event_t *events) {
goto cleanup;
}
- pulse_web_server_sent_bytes(ret);
-
if(unlikely(w->fd == fd && web_client_has_wait_receive(w)))
*events |= ND_POLL_READ;
|
netdata
|
netdata
|
C
|
C
| 73,681
| 6,023
|
X-Ray Vision for your infrastructure!
|
netdata_netdata
|
NEW_FEAT
|
Despite the NEW_FEAT label, the diff shows the pulse network-traffic statistics (per-endpoint byte counters and their charts) being removed from the daemon, not added
|
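The removed pulse-network code follows a common telemetry pattern: hot paths bump per-endpoint byte counters (in the C original via `__atomic_add_fetch` with `__ATOMIC_RELAXED`), and a collector periodically copies a snapshot of all counters for charting (`pulse_network_copy`). A minimal Python sketch of that counter/snapshot shape, with a lock standing in for the relaxed atomics (the class and endpoint names here are illustrative, not netdata's API):

```python
import threading

class NetworkStats:
    """Sketch of the counter/snapshot pattern from the removed
    pulse-network.c: writers increment per-endpoint received/sent
    byte counters; a reader copies a consistent snapshot for charts."""

    def __init__(self):
        self._lock = threading.Lock()
        # endpoint -> [bytes_received, bytes_sent]
        self._counters = {"api": [0, 0], "statsd": [0, 0], "stream": [0, 0]}

    def add(self, endpoint, received=0, sent=0):
        # Hot path: bump the counters for one endpoint.
        with self._lock:
            c = self._counters[endpoint]
            c[0] += received
            c[1] += sent

    def snapshot(self):
        # Collector path: copy everything out in one pass.
        with self._lock:
            return {k: tuple(v) for k, v in self._counters.items()}

stats = NetworkStats()
stats.add("api", received=1024)
stats.add("api", sent=512)
print(stats.snapshot()["api"])  # (1024, 512)
```

The C version avoids the lock entirely: relaxed atomics are enough because each counter is independent and the chart only needs an approximately-consistent reading.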
2da69da0263189bc5622422902f2b5950a0656ec
|
2023-06-14 02:27:49
|
Christopher Helmerich
|
Update README.md
| false
| 1
| 1
| 2
|
--- README.md
@@ -1,2 +1,2 @@
# WarpFactory
-A numerical toolkit for analyzing warp drive spacetimes
+A numerical toolkit for analyzing warp drive spacetime
|
warpfactory
|
nerdswithattitudes
|
MATLAB
|
MATLAB
| 298
| 41
|
WarpFactory is a numerical toolkit for analyzing warp drive spacetimes.
|
nerdswithattitudes_warpfactory
|
CONFIG_CHANGE
|
Very small changes
|
8ec78688b80e8d1c92cc71a61a442e5c5c17c974
| null |
Jeremy Wu
|
Add empty actionParameters check (#148)
| false
| 1
| 1
| 0
|
--- QueryBuilder.cs
@@ -24,7 +24,7 @@ namespace Wox.Core.Plugin
{ // use non global plugin for query
actionKeyword = possibleActionKeyword;
actionParameters = terms.Skip(1).ToList();
- search = rawQuery.Substring(actionKeyword.Length + 1);
+ search = actionParameters.Count > 0 ? rawQuery.Substring(actionKeyword.Length + 1) : string.Empty;
}
else
{ // non action keyword
|
microsoft_PowerToys.json
| null | null | null | null | null | null |
microsoft_PowerToys.json
|
NEW_FEAT
|
Adds an empty actionParameters check so the substring call is skipped when the query contains only the action keyword
|
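The QueryBuilder fix guards a classic off-by-one: when the raw query is just the action keyword with no parameters, `rawQuery.Substring(actionKeyword.Length + 1)` indexes one past the end of the string and throws, so the patch falls back to an empty search term. A Python re-expression of the guarded logic (function and variable names are illustrative; the original is C#):

```python
def split_query(raw_query):
    """Sketch of the guard added in Wox's QueryBuilder: derive the
    action keyword, its parameters, and the search string, using an
    empty search when there are no parameters after the keyword."""
    terms = raw_query.split()
    action_keyword = terms[0]
    action_parameters = terms[1:]
    # Without the guard, slicing at len(keyword) + 1 on a
    # keyword-only query would be out of range in C#'s Substring.
    search = raw_query[len(action_keyword) + 1:] if action_parameters else ""
    return action_keyword, action_parameters, search

print(split_query("wiki python tips"))  # ('wiki', ['python', 'tips'], 'python tips')
print(split_query("wiki"))              # ('wiki', [], '')
```

Note that Python slicing past the end silently returns `""`, while .NET's `String.Substring` raises `ArgumentOutOfRangeException`, which is why the explicit `Count > 0` check matters in the original.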
88e36612cadac099fbde66876083355325572235
|
2024-10-23 03:54:51
|
Ben Pasquariello
|
Add files via upload
| false
| 0
| 0
| 0
|
--- Images/matlabonramp.jpg
Binary files a/Images/matlabonramp.jpg and /dev/null differ
|
awesome-matlab-students
|
mathworks
|
MATLAB
|
MATLAB
| 393
| 42
|
An awesome list of helpful resources for students learning MATLAB & Simulink. List includes tips & tricks, tutorials, videos, cheat sheets, and opportunities to learn MATLAB & Simulink.
|
mathworks_awesome-matlab-students
|
CONFIG_CHANGE
|
Very small changes
|
433c0de20432a33d7468c0d86842e457be6a70af
|
2025-04-03 14:35:44
|
Winter
|
update
| false
| 0
| 0
| 0
|
--- examples/simulation_global.mlx
Binary files a/examples/simulation_global.mlx and b/examples/simulation_global.mlx differ
--- examples/simulation_local.mlx
Binary files a/examples/simulation_local.mlx and b/examples/simulation_local.mlx differ
--- utils/env/map_creater.mlx
Binary files a/utils/env/map_creater.mlx and b/utils/env/map_creater.mlx differ
|
matlab_motion_planning
|
ai-winter
|
MATLAB
|
MATLAB
| 419
| 66
|
Motion planning and Navigation of AGV/AMR:matlab implementation of Dijkstra, A*, Theta*, JPS, D*, LPA*, D* Lite, RRT, RRT*, RRT-Connect, Informed RRT*, ACO, Voronoi, PID, LQR, MPC, APF, RPP, DWA, DDPG, Bezier, B-spline, Dubins, Reeds-Shepp etc.
|
ai-winter_matlab_motion_planning
|
CONFIG_CHANGE
|
Very small changes
|
0d8dfaf31243bbbf1b0c831e4f029b435c8e30c8
|
2024-09-26 21:27:17
|
Toran Bruce Richards
|
tweak(docs): Update setup.md with git instructions. (#8192) Update setup.md with git instructions.
| false
| 11
| 0
| 11
|
--- docs/content/server/setup.md
@@ -20,7 +20,6 @@ To setup the server, you need to have the following installed:
- [Node.js](https://nodejs.org/en/)
- [Docker](https://docs.docker.com/get-docker/)
-- [Git](https://git-scm.com/downloads)
### Checking if you have Node.js & NPM installed
@@ -60,16 +59,6 @@ docker-compose -v
Once you have Docker and Docker Compose installed, you can proceed to the next step.
-## Cloning the Repository
-The first step is cloning the AutoGPT repository to your computer.
-To do this, open a terminal window in a folder on your computer and run:
-```
-git clone https://github.com/Significant-Gravitas/AutoGPT.git
-```
-If you get stuck, follow [this guide](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository).
-
-Once that's complete you can close this terminal window.
-
## Running the backend services
To run the backend services, follow these steps:
|
autogpt
|
significant-gravitas
|
Python
|
Python
| 172,255
| 45,197
|
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
|
significant-gravitas_autogpt
|
DOC_CHANGE
|
Obvious
|
1b2526eaceeedcc4b9dc3c31e15c50eb11a7343d
|
2025-01-20 13:31:56
|
Claudio Cambra
|
macosx: Add VLCLibraryMusicGenreSubSegment Signed-off-by: Claudio Cambra <[email protected]>
| false
| 37
| 18
| 55
|
--- modules/gui/macosx/library/VLCLibrarySegment.h
@@ -39,7 +39,7 @@ typedef NS_ENUM(NSInteger, VLCLibrarySegmentType) {
VLCLibraryArtistsMusicSubSegment,
VLCLibraryAlbumsMusicSubSegment,
VLCLibrarySongsMusicSubSegment,
- VLCLibraryGenresMusicSubSegmentType,
+ VLCLibraryGenresMusicSubSegment,
VLCLibraryPlaylistsSegmentType,
VLCLibraryPlaylistsMusicOnlyPlaylistsSubSegmentType,
VLCLibraryPlaylistsVideoOnlyPlaylistsSubSegmentType,
--- modules/gui/macosx/library/VLCLibrarySegment.m
@@ -133,31 +133,6 @@ - (instancetype)init
@end
-@interface VLCLibraryMusicGenreSubSegment : VLCLibrarySegment
-@end
-
-@implementation VLCLibraryMusicGenreSubSegment
-
-- (instancetype)init
-{
- self = [super initWithSegmentType:VLCLibraryGenresMusicSubSegmentType];
- if (self) {
- self.internalDisplayString = _NS("Genres");
- if (@available(macOS 11.0, *)) {
- self.internalDisplayImage = [NSImage imageWithSystemSymbolName:@"guitars"
- accessibilityDescription:@"Music genres icon"];
- } else {
- self.internalDisplayImage = [NSImage imageNamed:@"sidebar-music"];
- self.internalDisplayImage.template = YES;
- }
- self.internalLibraryViewControllerClass = VLCLibraryAudioViewController.class;
- }
- return self;
-}
-
-@end
-
-
@interface VLCLibraryMusicSegment : VLCLibrarySegment
@end
@@ -180,7 +155,7 @@ - (instancetype)init
[VLCLibrarySegment segmentWithSegmentType:VLCLibraryArtistsMusicSubSegment],
[VLCLibrarySegment segmentWithSegmentType:VLCLibraryAlbumsMusicSubSegment],
[VLCLibrarySegment segmentWithSegmentType:VLCLibrarySongsMusicSubSegment],
- [[VLCLibraryMusicGenreSubSegment alloc] init],
+ [VLCLibrarySegment segmentWithSegmentType:VLCLibraryGenresMusicSubSegment],
];
}
return self;
@@ -531,6 +506,8 @@ - (NSString *)displayStringForType:(VLCLibrarySegmentType)segmentType
return _NS("Albums");
case VLCLibrarySongsMusicSubSegment:
return _NS("Songs");
+ case VLCLibraryGenresMusicSubSegment:
+ return _NS("Genres");
case VLCLibraryShowsVideoSubSegment:
return _NS("Shows");
case VLCLibraryExploreHeaderSegment:
@@ -550,6 +527,7 @@ - (NSImage *)oldIconImageForType:(VLCLibrarySegmentType)segmentType
case VLCLibraryArtistsMusicSubSegment:
case VLCLibraryAlbumsMusicSubSegment:
case VLCLibrarySongsMusicSubSegment:
+ case VLCLibraryGenresMusicSubSegment:
return [NSImage imageNamed:@"sidebar-music"];
case VLCLibraryShowsVideoSubSegment:
return [NSImage imageNamed:@"sidebar-movie"];
@@ -580,6 +558,9 @@ - (NSImage *)newIconImageForType:(VLCLibrarySegmentType)segmentType
case VLCLibrarySongsMusicSubSegment:
return [NSImage imageWithSystemSymbolName:@"music.note"
accessibilityDescription:@"Music songs icon"];
+ case VLCLibraryGenresMusicSubSegment:
+ return [NSImage imageWithSystemSymbolName:@"guitars"
+ accessibilityDescription:@"Music genres icon"];
case VLCLibraryShowsVideoSubSegment:
return [NSImage imageWithSystemSymbolName:@"tv"
accessibilityDescription:@"Shows icon"];
@@ -638,7 +619,7 @@ + (nullable Class)libraryViewControllerClassForSegmentType:(VLCLibrarySegmentTyp
case VLCLibraryArtistsMusicSubSegment:
case VLCLibraryAlbumsMusicSubSegment:
case VLCLibrarySongsMusicSubSegment:
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
return VLCLibraryAudioViewController.class;
case VLCLibraryPlaylistsSegmentType:
case VLCLibraryPlaylistsMusicOnlyPlaylistsSubSegmentType:
@@ -671,7 +652,7 @@ + (VLCLibraryAbstractSegmentViewController *)libraryViewControllerForSegmentType
case VLCLibraryArtistsMusicSubSegment:
case VLCLibraryAlbumsMusicSubSegment:
case VLCLibrarySongsMusicSubSegment:
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
return [[VLCLibraryAudioViewController alloc] initWithLibraryWindow:VLCMain.sharedInstance.libraryWindow];
case VLCLibraryPlaylistsSegmentType:
case VLCLibraryPlaylistsMusicOnlyPlaylistsSubSegmentType:
--- modules/gui/macosx/library/VLCLibraryWindow.m
@@ -237,7 +237,7 @@ - (void)updateGridVsListViewModeSegmentedControl
case VLCLibraryArtistsMusicSubSegment:
_currentSelectedViewModeSegment = preferences.artistLibraryViewMode;
break;
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
_currentSelectedViewModeSegment = preferences.genreLibraryViewMode;
break;
case VLCLibraryAlbumsMusicSubSegment:
@@ -295,7 +295,7 @@ - (void)setViewForSelectedSegment
case VLCLibraryArtistsMusicSubSegment:
case VLCLibraryAlbumsMusicSubSegment:
case VLCLibrarySongsMusicSubSegment:
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
[self showAudioLibrary];
break;
case VLCLibraryPlaylistsSegmentType:
@@ -357,7 +357,7 @@ - (IBAction)gridVsListSegmentedControlAction:(id)sender
case VLCLibraryArtistsMusicSubSegment:
preferences.artistLibraryViewMode = _currentSelectedViewModeSegment;
break;
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
preferences.genreLibraryViewMode = _currentSelectedViewModeSegment;
break;
case VLCLibraryAlbumsMusicSubSegment:
--- modules/gui/macosx/library/VLCLibraryWindowToolbarDelegate.m
@@ -168,7 +168,7 @@ - (void)layoutForSegment:(VLCLibrarySegmentType)segment
case VLCLibraryArtistsMusicSubSegment:
case VLCLibraryAlbumsMusicSubSegment:
case VLCLibrarySongsMusicSubSegment:
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
case VLCLibraryGroupsSegmentType:
case VLCLibraryGroupsGroupSubSegmentType:
case VLCLibraryPlaylistsSegmentType:
--- modules/gui/macosx/library/audio-library/VLCLibraryAudioViewController.m
@@ -350,7 +350,7 @@ - (VLCLibraryViewModeSegment)viewModeSegmentForCurrentLibrarySegment
switch (self.libraryWindow.librarySegmentType) {
case VLCLibraryArtistsMusicSubSegment:
return libraryWindowPrefs.artistLibraryViewMode;
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
return libraryWindowPrefs.genreLibraryViewMode;
case VLCLibrarySongsMusicSubSegment:
return libraryWindowPrefs.songsLibraryViewMode;
@@ -371,7 +371,7 @@ - (VLCAudioLibrarySegment)currentLibrarySegmentToAudioLibrarySegment
return VLCAudioLibraryAlbumsSegment;
case VLCLibrarySongsMusicSubSegment:
return VLCAudioLibrarySongsSegment;
- case VLCLibraryGenresMusicSubSegmentType:
+ case VLCLibraryGenresMusicSubSegment:
return VLCAudioLibraryGenresSegment;
default:
NSAssert(false, @"Non-audio or unknown library subsegment received");
@@ -498,7 +498,7 @@ - (void)presentLibraryItem:(id<VLCMediaLibraryItemProtocol>)libraryItem
} else if ([libraryItem isKindOfClass:VLCMediaLibraryArtist.class]) {
segmentType = VLCLibraryArtistsMusicSubSegment;
} else if ([libraryItem isKindOfClass:VLCMediaLibraryGenre.class]) {
- segmentType = VLCLibraryGenresMusicSubSegmentType;
+ segmentType = VLCLibraryGenresMusicSubSegment;
} else {
segmentType = VLCLibrarySongsMusicSubSegment;
}
@@ -522,7 +522,7 @@ - (void)libraryModelUpdated:(NSNotification *)aNotification
self.libraryWindow.librarySegmentType == VLCLibrarySongsMusicSubSegment ||
self.libraryWindow.librarySegmentType == VLCLibraryArtistsMusicSubSegment ||
self.libraryWindow.librarySegmentType == VLCLibraryAlbumsMusicSubSegment ||
- self.libraryWindow.librarySegmentType == VLCLibraryGenresMusicSubSegmentType) &&
+ self.libraryWindow.librarySegmentType == VLCLibraryGenresMusicSubSegment) &&
((audioCount == 0 && ![self.libraryTargetView.subviews containsObject:self.emptyLibraryView]) ||
(audioCount > 0 && ![self.libraryTargetView.subviews containsObject:_audioLibraryView])) &&
self.libraryWindow.videoViewController.view.hidden) {
|
vlc
| null |
C
|
C
| null | null |
Video player
|
_vlc
|
CODE_IMPROVEMENT
|
Code change: enum constant renamed (VLCLibraryGenresMusicSubSegmentType → VLCLibraryGenresMusicSubSegment) and the dedicated genre subsegment subclass folded into the shared segment factory and switch statements
|
6c072792381cee46cb1dd1e45ebdb97e55a4a1a4
|
2023-08-23 18:32:42
|
Flavius
|
Small fix to margin counter
| false
| 2
| 2
| 4
|
--- style.css
@@ -328,11 +328,11 @@ figure blockquote {
}
span {
- margin-right: 20px;
+ margin-right: 34px; /* Adjust as needed */
}
.count {
- font-weight: bold;
+ font-weight: bold; /* Making numbers bold for emphasis */
}
.Info{
|
manifesto
|
opentofu
|
HTML
|
HTML
| 36,134
| 1,083
|
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source license.
|
opentofu_manifesto
|
CODE_IMPROVEMENT
|
Obvious
|
c08182509dbf4913288b389577a2e07c5f010077
| null |
Camille Diez
|
Update README.md
| false
| 1
| 0
| 1
|
--- README.md
@@ -166,6 +166,7 @@ using the matched rule and runs it. Rules enabled by default are as follows:
* `git_pull` – sets upstream before executing previous `git pull`;
* `git_push` – adds `--set-upstream origin $branch` to previous failed `git push`;
* `git_stash` – stashes you local modifications before rebasing or switching branch;
+* `go_run` – appends `.go` extension when compiling/running Go programs
* `grep_recursive` – adds `-r` when you trying to grep directory;
* `has_exists_script` – prepends `./` when script/binary exists;
* `lein_not_task` – fixes wrong `lein` tasks like `lein rpl`;
|
nvbn_thefuck.json
| null | null | null | null | null | null |
nvbn_thefuck.json
|
CONFIG_CHANGE
|
Changes in readme only
|
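The `go_run` rule added to the README's rule list appends a `.go` extension when a Go file is compiled or run without one, since `go run hello` fails where `go run hello.go` succeeds. A rough sketch of the match/fix pair such a rule implements, operating on a plain command string (thefuck's real rules receive a richer command object; the names below are illustrative):

```python
def match(command):
    """True when the command is `go run <something>` missing the
    `.go` extension that the Go toolchain expects."""
    return (command.startswith("go run ")
            and not command.rstrip().endswith(".go"))

def get_new_command(command):
    """Append the missing .go extension."""
    return command.rstrip() + ".go"

cmd = "go run hello"
if match(cmd):
    cmd = get_new_command(cmd)
print(cmd)  # go run hello.go
```

Each rule in the project follows this same two-function contract: `match` decides whether the rule applies to the failed command, and `get_new_command` produces the corrected one.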
9620ec2de5f22d1662841d73ec75f2d7a7c9b36a
|
2024-06-27 23:12:53
|
Robert Felker
|
Added packages
| false
| 6
| 4
| 10
|
--- README.md
@@ -366,7 +366,7 @@ Meteo
### Clone
- [GitTouch](https://github.com/pd4d10/git-touch) [1524⭐] - Open source mobile client for GitHub, GitLab, Bitbucket and Gitea by [Rongjian Zhang](https://github.com/pd4d10)
-- [RustDesk](https://github.com/rustdesk/rustdesk) [67590⭐] - Open source virtual / remote desktop. TeamViewer alternative. Built with Rust by [RustDesk team](https://www.rustdesk.com/)
+- [RustDesk](https://github.com/rustdesk/rustdesk) [67589⭐] - Open source virtual / remote desktop. TeamViewer alternative. Built with Rust by [RustDesk team](https://www.rustdesk.com/)
### Machine Learning
@@ -417,9 +417,8 @@ Meteo
### Storage
- [Sqflite](https://github.com/tekartik/sqflite) [2819⭐] - SQLite flutter plugin by [Alexandre Roux](https://www.linkedin.com/in/alextekartik/)
-- [Drift](https://github.com/simolus3/drift) - Drift is an easy to use, reactive, typesafe persistence library for Dart & Flutter by [
+- [Moor](https://github.com/simolus3/moor) - Moor is an easy to use, reactive, typesafe persistence library for Dart & Flutter by [
Simon Binder](https://github.com/simolus3)
-- [ObjectBox](https://github.com/objectbox/objectbox-dart) - On-device database for fast cross-platform Dart object persistence by [ObjectBox](https://github.com/objectbox)
### Services
@@ -523,8 +522,8 @@ This section contains libraries that take an experimental or unorthodox approach
### Premium
-- [AppFlowy](https://github.com/AppFlowy-IO/appflowy) [50175⭐] - Open Source Notion Alternative. You are in charge of your data and customizations. Built with Flutter and Rust by [AppFlowy team](https://www.appflowy.io/)
-- [RustDesk](https://github.com/rustdesk/rustdesk) [67590⭐] - Open source virtual/remote desktop and TeamViewer alternative. Built with Flutter and Rust by [RustDesk team](https://www.rustdesk.com/).
+- [AppFlowy](https://github.com/AppFlowy-IO/appflowy) [50176⭐] - Open Source Notion Alternative. You are in charge of your data and customizations. Built with Flutter and Rust by [AppFlowy team](https://www.appflowy.io/)
+- [RustDesk](https://github.com/rustdesk/rustdesk) [67589⭐] - Open source virtual/remote desktop and TeamViewer alternative. Built with Flutter and Rust by [RustDesk team](https://www.rustdesk.com/).
- [Spotube](https://github.com/KRTirtho/spotube) - Open source Spotify client for desktop and mobile by [Kingkor Roy Tirtho](https://github.com/KRTirtho)
### Top
@@ -627,7 +626,6 @@ This section contains libraries that take an experimental or unorthodox approach
- [Interview Questions](https://github.com/whatsupcoders/Flutter-Interview-Questions) - List of helpful questions you can use to interview potential candidates by [Whatsupcoders](https://github.com/whatsupcoders/Whatsupcoders-flutter)
- [The International Flutter Starter Kit](https://medium.com/flutter-community/intl-flutter-starter-kit-18415e739fb6) - Guide by the experts by [Beyza Sunay Guler](https://twitter.com/BeyzaSunayGler1) & [Nawal Alhamwi](https://twitter.com/__nawalhmw)
-- [Roadmap.sh/flutter](https://roadmap.sh/flutter) - A community curated flutter developer learning roadmap from the 6th most starred GitHub project.
## Community
|
awesome-flutter
|
solido
|
Dart
|
Dart
| 54,974
| 6,726
|
An awesome list that curates the best Flutter libraries, tools, tutorials, articles and more.
|
solido_awesome-flutter
|
DOC_CHANGE
|
Changes in README only
|
65e4a7c8b35914e70bfc2b7cbc58c5a9f654db53
|
2023-01-06 19:07:36
|
Oleksii Trekhleb
|
Adding a simple cascading solution to generate a Power Set (#975) * Add a simple cascading version of generating a PowerSet.
* Update README.
* Update README.
* Update README.
| false
| 115
| 12
| 127
|
--- README.md
@@ -99,7 +99,7 @@ a set of rules that precisely define a sequence of operations.
* **Sets**
* `B` [Cartesian Product](src/algorithms/sets/cartesian-product) - product of multiple sets
* `B` [Fisher–Yates Shuffle](src/algorithms/sets/fisher-yates) - random permutation of a finite sequence
- * `A` [Power Set](src/algorithms/sets/power-set) - all subsets of a set (bitwise, backtracking, and cascading solutions)
+ * `A` [Power Set](src/algorithms/sets/power-set) - all subsets of a set (bitwise and backtracking solutions)
* `A` [Permutations](src/algorithms/sets/permutations) (with and without repetitions)
* `A` [Combinations](src/algorithms/sets/combinations) (with and without repetitions)
* `A` [Longest Common Subsequence](src/algorithms/sets/longest-common-subsequence) (LCS)
--- src/algorithms/sets/power-set/README.md
@@ -1,7 +1,7 @@
# Power Set
Power set of a set `S` is the set of all of the subsets of `S`, including the
-empty set and `S` itself. Power set of set `S` is denoted as `P(S)`.
+empty set and `S` itself. Power set of set `S` is denoted as `P(S)`.
For example for `{x, y, z}`, the subsets
are:
@@ -21,37 +21,37 @@ are:

-Here is how we may illustrate the elements of the power set of the set `{x, y, z}` ordered with respect to
+Here is how we may illustrate the elements of the power set of the set `{x, y, z}` ordered with respect to
inclusion:

**Number of Subsets**
-If `S` is a finite set with `|S| = n` elements, then the number of subsets
-of `S` is `|P(S)| = 2^n`. This fact, which is the motivation for the
+If `S` is a finite set with `|S| = n` elements, then the number of subsets
+of `S` is `|P(S)| = 2^n`. This fact, which is the motivation for the
notation `2^S`, may be demonstrated simply as follows:
-> First, order the elements of `S` in any manner. We write any subset of `S` in
-the format `{γ1, γ2, ..., γn}` where `γi , 1 ≤ i ≤ n`, can take the value
+> First, order the elements of `S` in any manner. We write any subset of `S` in
+the format `{γ1, γ2, ..., γn}` where `γi , 1 ≤ i ≤ n`, can take the value
of `0` or `1`. If `γi = 1`, the `i`-th element of `S` is in the subset;
-otherwise, the `i`-th element is not in the subset. Clearly the number of
+otherwise, the `i`-th element is not in the subset. Clearly the number of
distinct subsets that can be constructed this way is `2^n` as `γi ∈ {0, 1}`.
## Algorithms
### Bitwise Solution
-Each number in binary representation in a range from `0` to `2^n` does exactly
-what we need: it shows by its bits (`0` or `1`) whether to include related
-element from the set or not. For example, for the set `{1, 2, 3}` the binary
+Each number in binary representation in a range from `0` to `2^n` does exactly
+what we need: it shows by its bits (`0` or `1`) whether to include related
+element from the set or not. For example, for the set `{1, 2, 3}` the binary
number of `0b010` would mean that we need to include only `2` to the current set.
| | `abc` | Subset |
| :---: | :---: | :-----------: |
| `0` | `000` | `{}` |
| `1` | `001` | `{c}` |
-| `2` | `010` | `{b}` |
+| `2` | `010` | `{b}` |
| `3` | `011` | `{c, b}` |
| `4` | `100` | `{a}` |
| `5` | `101` | `{a, c}` |
@@ -68,44 +68,6 @@ element.
> See [btPowerSet.js](./btPowerSet.js) file for backtracking solution.
-### Cascading Solution
-
-This is, arguably, the simplest solution to generate a Power Set.
-
-We start with an empty set:
-
-```text
-powerSets = [[]]
-```
-
-Now, let's say:
-
-```text
-originalSet = [1, 2, 3]
-```
-
-Let's add the 1st element from the originalSet to all existing sets:
-
-```text
-[[]] ← 1 = [[], [1]]
-```
-
-Adding the 2nd element to all existing sets:
-
-```text
-[[], [1]] ← 2 = [[], [1], [2], [1, 2]]
-```
-
-Adding the 3nd element to all existing sets:
-
-```
-[[], [1], [2], [1, 2]] ← 3 = [[], [1], [2], [1, 2], [3], [1, 3], [2, 3], [1, 2, 3]]
-```
-
-And so on, for the rest of the elements from the `originalSet`. On every iteration the number of sets is doubled, so we'll get `2^n` sets.
-
-> See [caPowerSet.js](./caPowerSet.js) file for cascading solution.
-
## References
* [Wikipedia](https://en.wikipedia.org/wiki/Power_set)
--- src/algorithms/sets/power-set/__test__/caPowerSet.test.js
@@ -1,28 +0,0 @@
-import caPowerSet from '../caPowerSet';
-
-describe('caPowerSet', () => {
- it('should calculate power set of given set using cascading approach', () => {
- expect(caPowerSet([1])).toEqual([
- [],
- [1],
- ]);
-
- expect(caPowerSet([1, 2])).toEqual([
- [],
- [1],
- [2],
- [1, 2],
- ]);
-
- expect(caPowerSet([1, 2, 3])).toEqual([
- [],
- [1],
- [2],
- [1, 2],
- [3],
- [1, 3],
- [2, 3],
- [1, 2, 3],
- ]);
- });
-});
--- src/algorithms/sets/power-set/caPowerSet.js
@@ -1,37 +0,0 @@
-/**
- * Find power-set of a set using CASCADING approach.
- *
- * @param {*[]} originalSet
- * @return {*[][]}
- */
-export default function caPowerSet(originalSet) {
- // Let's start with an empty set.
- const sets = [[]];
-
- /*
- Now, let's say:
- originalSet = [1, 2, 3].
-
- Let's add the first element from the originalSet to all existing sets:
- [[]] ← 1 = [[], [1]]
-
- Adding the 2nd element to all existing sets:
- [[], [1]] ← 2 = [[], [1], [2], [1, 2]]
-
- Adding the 3nd element to all existing sets:
- [[], [1], [2], [1, 2]] ← 3 = [[], [1], [2], [1, 2], [3], [1, 3], [2, 3], [1, 2, 3]]
-
- And so on for the rest of the elements from originalSet.
- On every iteration the number of sets is doubled, so we'll get 2^n sets.
- */
- for (let numIdx = 0; numIdx < originalSet.length; numIdx += 1) {
- const existingSetsNum = sets.length;
-
- for (let setIdx = 0; setIdx < existingSetsNum; setIdx += 1) {
- const set = [...sets[setIdx], originalSet[numIdx]];
- sets.push(set);
- }
- }
-
- return sets;
-}
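For reference, the cascading approach deleted by this commit can be sketched as self-contained JavaScript (this is a reconstruction based on the removed file, not the commit's own content):

```javascript
// Cascading power-set: start from [[]] and, for each element of the input,
// append a copy of every existing subset with that element added.
// Each pass doubles the subset count, yielding 2^n subsets total.
function caPowerSet(originalSet) {
  const sets = [[]];

  for (const element of originalSet) {
    // Snapshot the count so we only extend subsets that existed before this pass.
    const existingSetsNum = sets.length;

    for (let setIdx = 0; setIdx < existingSetsNum; setIdx += 1) {
      sets.push([...sets[setIdx], element]);
    }
  }

  return sets;
}
```

For `[1, 2, 3]` this produces the eight subsets in the order shown in the deleted test file, starting with `[]` and ending with `[1, 2, 3]`.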
| javascript-algorithms | trekhleb | JavaScript | JavaScript | 190,336 | 30,518 | 📝 Algorithms and data structures implemented in JavaScript with explanations and links to further readings | trekhleb_javascript-algorithms | DOC_CHANGE | Obvious |
| d9c48370df338eedb0637f8f76e3e0e0cc99cf6d | 2024-11-22 04:08:37 | Kieran | [Enhancement] Adds ability to enable/disable sources (#481) * [Unrelated] updated module name for existing liveview module
* Updated toggle component and moved MP index table to a liveview
* [WIP] reverted MP index table; added source count to MP index
* Moved new live table to sources index
* Added 'enabled' boolean to sources
* Got 'enabled' logic working re: downloading pending media
* Updated sources context to do the right thing when a source is updated
* Docs and tests
* Updated slow indexing to maintain its old schedule if re-enabled
* Hooked up the enabled toggle to the sources page
* [Unrelated] added direct links to various tabs on the sources table
* More tests
* Removed unneeded guard in
* Removed outdated comment
| false | 624 | 144 | 768 |
--- assets/js/alpine_helpers.js
@@ -35,10 +35,3 @@ window.markVersionAsSeen = (versionString) => {
window.isVersionSeen = (versionString) => {
return localStorage.getItem('seenVersion') === versionString
}
-
-window.dispatchFor = (elementOrId, eventName, detail = {}) => {
- const element =
- typeof elementOrId === 'string' ? document.getElementById(elementOrId) : elementOrId
-
- element.dispatchEvent(new CustomEvent(eventName, { detail }))
-}
--- assets/js/app.js
@@ -39,7 +39,7 @@ let liveSocket = new LiveSocket(document.body.dataset.socketPath, Socket, {
}
},
hooks: {
- 'supress-enter-submission': {
+ supressEnterSubmission: {
mounted() {
this.el.addEventListener('keypress', (event) => {
if (event.key === 'Enter') {
@@ -47,29 +47,6 @@ let liveSocket = new LiveSocket(document.body.dataset.socketPath, Socket, {
}
})
}
- },
- 'formless-input': {
- mounted() {
- const subscribedEvents = this.el.dataset.subscribe.split(' ')
- const eventName = this.el.dataset.eventName || ''
- const identifier = this.el.dataset.identifier || ''
-
- subscribedEvents.forEach((domEvent) => {
- this.el.addEventListener(domEvent, () => {
- // This ensures that the event is pushed to the server after the input value has been updated
- // so that the server has the most up-to-date value
- setTimeout(() => {
- this.pushEvent('formless-input', {
- value: this.el.value,
- id: identifier,
- event: eventName,
- dom_id: this.el.id,
- dom_event: domEvent
- })
- }, 0)
- })
- })
- }
}
}
})
--- lib/pinchflat/fast_indexing/fast_indexing_helpers.ex
@@ -11,27 +11,14 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
alias Pinchflat.Repo
alias Pinchflat.Media
- alias Pinchflat.Tasks
alias Pinchflat.Sources.Source
alias Pinchflat.FastIndexing.YoutubeRss
alias Pinchflat.FastIndexing.YoutubeApi
alias Pinchflat.Downloading.DownloadingHelpers
- alias Pinchflat.FastIndexing.FastIndexingWorker
alias Pinchflat.Downloading.DownloadOptionBuilder
alias Pinchflat.YtDlp.Media, as: YtDlpMedia
- @doc """
- Kicks off a new fast indexing task for a source. This will delete any existing fast indexing
- tasks for the source before starting a new one.
-
- Returns {:ok, %Task{}}
- """
- def kickoff_indexing_task(%Source{} = source) do
- Tasks.delete_pending_tasks_for(source, "FastIndexingWorker", include_executing: true)
- FastIndexingWorker.kickoff_with_task(source)
- end
-
@doc """
Fetches new media IDs for a source from YT's API or RSS, indexes them, and kicks off downloading
tasks for any pending media items. See comments in `FastIndexingWorker` for more info on the
--- lib/pinchflat/profiles/profiles_query.ex
@@ -1,29 +0,0 @@
-defmodule Pinchflat.Profiles.ProfilesQuery do
- @moduledoc """
- Query helpers for the Profiles context.
-
- These methods are made to be one-ish liners used
- to compose queries. Each method should strive to do
- _one_ thing. These don't need to be tested as
- they are just building blocks for other functionality
- which, itself, will be tested.
- """
- import Ecto.Query, warn: false
-
- alias Pinchflat.Profiles.MediaProfile
-
- # This allows the module to be aliased and query methods to be used
- # all in one go
- # usage: use Pinchflat.Profiles.ProfilesQuery
- defmacro __using__(_opts) do
- quote do
- import Ecto.Query, warn: false
-
- alias unquote(__MODULE__)
- end
- end
-
- def new do
- MediaProfile
- end
-end
--- lib/pinchflat/slow_indexing/slow_indexing_helpers.ex
@@ -25,28 +25,13 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpers do
Starts tasks for indexing a source's media regardless of the source's indexing
frequency. It's assumed the caller will check for indexing frequency.
- Returns {:ok, %Task{}}
+ Returns {:ok, %Task{}}.
"""
def kickoff_indexing_task(%Source{} = source, job_args \\ %{}, job_opts \\ []) do
- job_offset_seconds = calculate_job_offset_seconds(source)
-
Tasks.delete_pending_tasks_for(source, "FastIndexingWorker")
Tasks.delete_pending_tasks_for(source, "MediaCollectionIndexingWorker", include_executing: true)
- MediaCollectionIndexingWorker.kickoff_with_task(source, job_args, job_opts ++ [schedule_in: job_offset_seconds])
- end
-
- @doc """
- A helper method to delete all indexing-related tasks for a source.
- Optionally, you can include executing tasks in the deletion process.
-
- Returns :ok
- """
- def delete_indexing_tasks(%Source{} = source, opts \\ []) do
- include_executing = Keyword.get(opts, :include_executing, false)
-
- Tasks.delete_pending_tasks_for(source, "FastIndexingWorker", include_executing: include_executing)
- Tasks.delete_pending_tasks_for(source, "MediaCollectionIndexingWorker", include_executing: include_executing)
+ MediaCollectionIndexingWorker.kickoff_with_task(source, job_args, job_opts)
end
@doc """
@@ -156,14 +141,4 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpers do
changeset
end
end
-
- # Find the difference between the current time and the last time the source was indexed
- defp calculate_job_offset_seconds(%Source{last_indexed_at: nil}), do: 0
-
- defp calculate_job_offset_seconds(source) do
- offset_seconds = DateTime.diff(DateTime.utc_now(), source.last_indexed_at, :second)
- index_frequency_seconds = source.index_frequency_minutes * 60
-
- max(0, index_frequency_seconds - offset_seconds)
- end
end
--- lib/pinchflat/sources/source.ex
@@ -15,7 +15,6 @@ defmodule Pinchflat.Sources.Source do
alias Pinchflat.Metadata.SourceMetadata
@allowed_fields ~w(
- enabled
collection_name
collection_id
collection_type
@@ -65,7 +64,6 @@ defmodule Pinchflat.Sources.Source do
)a
schema "sources" do
- field :enabled, :boolean, default: true
# This is _not_ used as the primary key or internally in the database
# relations. This is only used to prevent an enumeration attack on the streaming
# and RSS feed endpoints since those _must_ be public (ie: no basic auth)
--- lib/pinchflat/sources/sources.ex
@@ -15,8 +15,8 @@ defmodule Pinchflat.Sources do
alias Pinchflat.Metadata.SourceMetadata
alias Pinchflat.Utils.FilesystemUtils
alias Pinchflat.Downloading.DownloadingHelpers
+ alias Pinchflat.FastIndexing.FastIndexingWorker
alias Pinchflat.SlowIndexing.SlowIndexingHelpers
- alias Pinchflat.FastIndexing.FastIndexingHelpers
alias Pinchflat.Metadata.SourceMetadataStorageWorker
@doc """
@@ -255,40 +255,19 @@ defmodule Pinchflat.Sources do
end
end
- # If the source is new (ie: not persisted), do nothing
- defp maybe_handle_media_tasks(%{data: %{__meta__: %{state: state}}}, _source) when state != :loaded do
- :ok
- end
-
- # If the source is NOT new (ie: updated),
+ # If the source is NOT new (ie: updated) and the download_media flag has changed,
# enqueue or dequeue media download tasks as necessary.
defp maybe_handle_media_tasks(changeset, source) do
- current_changes = changeset.changes
- applied_changes = Ecto.Changeset.apply_changes(changeset)
-
- # We need both current_changes and applied_changes to determine
- # the course of action to take. For example, we only care if a source is supposed
- # to be `enabled` or not - we don't care if that information comes from the
- # current changes or if that's how it already was in the database.
- # Rephrased, we're essentially using it in place of `get_field/2`
- case {current_changes, applied_changes} do
- {%{download_media: true}, %{enabled: true}} ->
- DownloadingHelpers.enqueue_pending_download_tasks(source)
-
- {%{enabled: true}, %{download_media: true}} ->
+ case {changeset.data, changeset.changes} do
+ {%{__meta__: %{state: :loaded}}, %{download_media: true}} ->
DownloadingHelpers.enqueue_pending_download_tasks(source)
- {%{download_media: false}, _} ->
- DownloadingHelpers.dequeue_pending_download_tasks(source)
-
- {%{enabled: false}, _} ->
+ {%{__meta__: %{state: :loaded}}, %{download_media: false}} ->
DownloadingHelpers.dequeue_pending_download_tasks(source)
_ ->
- nil
+ :ok
end
-
- :ok
end
defp maybe_run_indexing_task(changeset, source) do
@@ -322,22 +301,13 @@ defmodule Pinchflat.Sources do
end
defp maybe_update_slow_indexing_task(changeset, source) do
- # See comment in `maybe_handle_media_tasks` as to why we need these
- current_changes = changeset.changes
- applied_changes = Ecto.Changeset.apply_changes(changeset)
-
- case {current_changes, applied_changes} do
- {%{index_frequency_minutes: mins}, %{enabled: true}} when mins > 0 ->
- SlowIndexingHelpers.kickoff_indexing_task(source)
-
- {%{enabled: true}, %{index_frequency_minutes: mins}} when mins > 0 ->
+ case changeset.changes do
+ %{index_frequency_minutes: mins} when mins > 0 ->
SlowIndexingHelpers.kickoff_indexing_task(source)
- {%{index_frequency_minutes: _}, _} ->
- SlowIndexingHelpers.delete_indexing_tasks(source, include_executing: true)
-
- {%{enabled: false}, _} ->
- SlowIndexingHelpers.delete_indexing_tasks(source, include_executing: true)
+ %{index_frequency_minutes: _} ->
+ Tasks.delete_pending_tasks_for(source, "FastIndexingWorker")
+ Tasks.delete_pending_tasks_for(source, "MediaCollectionIndexingWorker")
_ ->
:ok
@@ -345,25 +315,13 @@ defmodule Pinchflat.Sources do
end
defp maybe_update_fast_indexing_task(changeset, source) do
- # See comment in `maybe_handle_media_tasks` as to why we need these
- current_changes = changeset.changes
- applied_changes = Ecto.Changeset.apply_changes(changeset)
-
- # This technically could be simplified since `maybe_update_slow_indexing_task`
- # has some overlap re: deleting pending tasks, but I'm keeping it separate
- # for clarity and explicitness.
- case {current_changes, applied_changes} do
- {%{fast_index: true}, %{enabled: true}} ->
- FastIndexingHelpers.kickoff_indexing_task(source)
-
- {%{enabled: true}, %{fast_index: true}} ->
- FastIndexingHelpers.kickoff_indexing_task(source)
-
- {%{fast_index: false}, _} ->
- Tasks.delete_pending_tasks_for(source, "FastIndexingWorker", include_executing: true)
+ case changeset.changes do
+ %{fast_index: true} ->
+ Tasks.delete_pending_tasks_for(source, "FastIndexingWorker")
+ FastIndexingWorker.kickoff_with_task(source)
- {%{enabled: false}, _} ->
- Tasks.delete_pending_tasks_for(source, "FastIndexingWorker", include_executing: true)
+ %{fast_index: false} ->
+ Tasks.delete_pending_tasks_for(source, "FastIndexingWorker")
_ ->
:ok
--- lib/pinchflat_web/components/core_components.ex
@@ -340,15 +340,14 @@ defmodule PinchflatWeb.CoreComponents do
end)
~H"""
- <div x-data={"{ enabled: #{@checked} }"} class="" phx-update="ignore" id={"#{@id}-wrapper"}>
- <.label :if={@label} for={@id}>
+ <div x-data={"{ enabled: #{@checked}}"}>
+ <.label for={@id}>
<%= @label %>
<span :if={@label_suffix} class="text-xs text-bodydark"><%= @label_suffix %></span>
</.label>
- <div class="relative flex flex-col">
+ <div class="relative">
<input type="hidden" id={@id} name={@name} x-bind:value="enabled" {@rest} />
- <%!-- This triggers a `change` event on the hidden input when the toggle is clicked --%>
- <div class="inline-block cursor-pointer" @click={"enabled = !enabled; dispatchFor('#{@id}', 'change')"}>
+ <div class="inline-block cursor-pointer" @click="enabled = !enabled">
<div x-bind:class="enabled && '!bg-primary'" class="block h-8 w-14 rounded-full bg-black"></div>
<div
x-bind:class="enabled && '!right-1 !translate-x-full'"
--- lib/pinchflat_web/components/layouts/partials/upgrade_button_live.ex
@@ -3,7 +3,7 @@ defmodule Pinchflat.UpgradeButtonLive do
def render(assigns) do
~H"""
- <form id="upgradeForm" phx-change="check_matching_text" phx-hook="supress-enter-submission">
+ <form id="upgradeForm" phx-change="check_matching_text" phx-hook="supressEnterSubmission">
<.input type="text" name="unlock-pro-textbox" value="" />
</form>
--- lib/pinchflat_web/controllers/media_profiles/media_profile_controller.ex
@@ -1,31 +1,20 @@
defmodule PinchflatWeb.MediaProfiles.MediaProfileController do
use PinchflatWeb, :controller
use Pinchflat.Sources.SourcesQuery
- use Pinchflat.Profiles.ProfilesQuery
alias Pinchflat.Repo
alias Pinchflat.Profiles
- alias Pinchflat.Sources.Source
alias Pinchflat.Profiles.MediaProfile
alias Pinchflat.Profiles.MediaProfileDeletionWorker
def index(conn, _params) do
- media_profiles_query =
- from mp in MediaProfile,
- as: :media_profile,
- where: is_nil(mp.marked_for_deletion_at),
- order_by: [asc: mp.name],
- select: map(mp, ^MediaProfile.__schema__(:fields)),
- select_merge: %{
- source_count:
- subquery(
- from s in Source,
- where: s.media_profile_id == parent_as(:media_profile).id,
- select: count(s.id)
- )
- }
-
- render(conn, :index, media_profiles: Repo.all(media_profiles_query))
+ media_profiles =
+ MediaProfile
+ |> where([mp], is_nil(mp.marked_for_deletion_at))
+ |> order_by(asc: :name)
+ |> Repo.all()
+
+ render(conn, :index, media_profiles: media_profiles)
end
def new(conn, params) do
--- lib/pinchflat_web/controllers/media_profiles/media_profile_html/index.html.heex
@@ -10,6 +10,7 @@
</.link>
</nav>
</div>
+
<div class="rounded-sm border border-stroke bg-white shadow-default dark:border-strokedark dark:bg-boxdark">
<div class="max-w-full overflow-x-auto">
<div class="flex flex-col gap-10 min-w-max">
@@ -22,11 +23,6 @@
<:col :let={media_profile} label="Preferred Resolution">
<%= media_profile.preferred_resolution %>
</:col>
- <:col :let={media_profile} label="Sources">
- <.subtle_link href={~p"/media_profiles/#{media_profile.id}/#tab-sources"}>
- <.localized_number number={media_profile.source_count} />
- </.subtle_link>
- </:col>
<:col :let={media_profile} label="" class="flex justify-end">
<.icon_link href={~p"/media_profiles/#{media_profile.id}/edit"} icon="hero-pencil-square" class="mr-4" />
</:col>
--- lib/pinchflat_web/controllers/sources/source_controller.ex
@@ -1,11 +1,12 @@
defmodule PinchflatWeb.Sources.SourceController do
use PinchflatWeb, :controller
- use Pinchflat.Sources.SourcesQuery
+ use Pinchflat.Media.MediaQuery
alias Pinchflat.Repo
alias Pinchflat.Tasks
alias Pinchflat.Sources
alias Pinchflat.Sources.Source
+ alias Pinchflat.Media.MediaItem
alias Pinchflat.Profiles.MediaProfile
alias Pinchflat.Media.FileSyncingWorker
alias Pinchflat.Sources.SourceDeletionWorker
@@ -14,7 +15,33 @@ defmodule PinchflatWeb.Sources.SourceController do
alias Pinchflat.Metadata.SourceMetadataStorageWorker
def index(conn, _params) do
- render(conn, :index)
+ source_query =
+ from s in Source,
+ as: :source,
+ inner_join: mp in assoc(s, :media_profile),
+ where: is_nil(s.marked_for_deletion_at) and is_nil(mp.marked_for_deletion_at),
+ preload: [media_profile: mp],
+ order_by: [asc: s.custom_name],
+ select: map(s, ^Source.__schema__(:fields)),
+ select_merge: %{
+ downloaded_count:
+ subquery(
+ from m in MediaItem,
+ where: m.source_id == parent_as(:source).id,
+ where: ^MediaQuery.downloaded(),
+ select: count(m.id)
+ ),
+ pending_count:
+ subquery(
+ from m in MediaItem,
+ join: s in assoc(m, :source),
+ where: m.source_id == parent_as(:source).id,
+ where: ^MediaQuery.pending(),
+ select: count(m.id)
+ )
+ }
+
+ render(conn, :index, sources: Repo.all(source_query))
end
def new(conn, params) do
--- lib/pinchflat_web/controllers/sources/source_html/index.html.heex
@@ -12,7 +12,32 @@
<div class="rounded-sm border border-stroke bg-white shadow-default dark:border-strokedark dark:bg-boxdark">
<div class="max-w-full overflow-x-auto">
<div class="flex flex-col gap-10 min-w-max">
- <%= live_render(@conn, PinchflatWeb.Sources.IndexTableLive) %>
+ <.table rows={@sources} table_class="text-black dark:text-white">
+ <:col :let={source} label="Name">
+ <.subtle_link href={~p"/sources/#{source.id}"}>
+ <%= StringUtils.truncate(source.custom_name || source.collection_name, 35) %>
+ </.subtle_link>
+ </:col>
+ <:col :let={source} label="Type"><%= source.collection_type %></:col>
+ <:col :let={source} label="Pending"><.localized_number number={source.pending_count} /></:col>
+ <:col :let={source} label="Downloaded"><.localized_number number={source.downloaded_count} /></:col>
+ <:col :let={source} label="Retention">
+ <%= if source.retention_period_days && source.retention_period_days > 0 do %>
+ <.localized_number number={source.retention_period_days} />
+ <.pluralize count={source.retention_period_days} word="day" />
+ <% else %>
+ <span class="text-lg">∞</span>
+ <% end %>
+ </:col>
+ <:col :let={source} label="Media Profile">
+ <.subtle_link href={~p"/media_profiles/#{source.media_profile_id}"}>
+ <%= source.media_profile.name %>
+ </.subtle_link>
+ </:col>
+ <:col :let={source} label="" class="flex place-content-evenly">
+ <.icon_link href={~p"/sources/#{source.id}/edit"} icon="hero-pencil-square" class="mx-1" />
+ </:col>
+ </.table>
</div>
</div>
</div>
--- lib/pinchflat_web/controllers/sources/source_html/index_table_live.ex
@@ -1,103 +0,0 @@
-defmodule PinchflatWeb.Sources.IndexTableLive do
- use PinchflatWeb, :live_view
- use Pinchflat.Media.MediaQuery
- use Pinchflat.Sources.SourcesQuery
-
- alias Pinchflat.Repo
- alias Pinchflat.Sources
- alias Pinchflat.Sources.Source
- alias Pinchflat.Media.MediaItem
-
- def render(assigns) do
- ~H"""
- <.table rows={@sources} table_class="text-white">
- <:col :let={source} label="Name">
- <.subtle_link href={~p"/sources/#{source.id}"}>
- <%= StringUtils.truncate(source.custom_name || source.collection_name, 35) %>
- </.subtle_link>
- </:col>
- <:col :let={source} label="Pending">
- <.subtle_link href={~p"/sources/#{source.id}/#tab-pending"}>
- <.localized_number number={source.pending_count} />
- </.subtle_link>
- </:col>
- <:col :let={source} label="Downloaded">
- <.subtle_link href={~p"/sources/#{source.id}/#tab-downloaded"}>
- <.localized_number number={source.downloaded_count} />
- </.subtle_link>
- </:col>
- <:col :let={source} label="Retention">
- <%= if source.retention_period_days && source.retention_period_days > 0 do %>
- <.localized_number number={source.retention_period_days} />
- <.pluralize count={source.retention_period_days} word="day" />
- <% else %>
- <span class="text-lg">∞</span>
- <% end %>
- </:col>
- <:col :let={source} label="Media Profile">
- <.subtle_link href={~p"/media_profiles/#{source.media_profile_id}"}>
- <%= source.media_profile.name %>
- </.subtle_link>
- </:col>
- <:col :let={source} label="Enabled?">
- <.input
- name={"source[#{source.id}][enabled]"}
- value={source.enabled}
- id={"source_#{source.id}_enabled"}
- phx-hook="formless-input"
- data-subscribe="change"
- data-event-name="toggle_enabled"
- data-identifier={source.id}
- type="toggle"
- />
- </:col>
- <:col :let={source} label="" class="flex place-content-evenly">
- <.icon_link href={~p"/sources/#{source.id}/edit"} icon="hero-pencil-square" class="mx-1" />
- </:col>
- </.table>
- """
- end
-
- def mount(_params, _session, socket) do
- {:ok, assign(socket, %{sources: get_sources()})}
- end
-
- def handle_event("formless-input", %{"event" => "toggle_enabled"} = params, socket) do
- source = Sources.get_source!(params["id"])
- should_enable = params["value"] == "true"
-
- {:ok, _} = Sources.update_source(source, %{enabled: should_enable})
-
- {:noreply, assign(socket, %{sources: get_sources()})}
- end
-
- defp get_sources do
- query =
- from s in Source,
- as: :source,
- inner_join: mp in assoc(s, :media_profile),
- where: is_nil(s.marked_for_deletion_at) and is_nil(mp.marked_for_deletion_at),
- preload: [media_profile: mp],
- order_by: [asc: s.custom_name],
- select: map(s, ^Source.__schema__(:fields)),
- select_merge: %{
- downloaded_count:
- subquery(
- from m in MediaItem,
- where: m.source_id == parent_as(:source).id,
- where: ^MediaQuery.downloaded(),
- select: count(m.id)
- ),
- pending_count:
- subquery(
- from m in MediaItem,
- join: s in assoc(m, :source),
- where: m.source_id == parent_as(:source).id,
- where: ^MediaQuery.pending(),
- select: count(m.id)
- )
- }
-
- Repo.all(query)
- end
-end
--- lib/pinchflat_web/controllers/sources/source_html/media_item_table_live.ex
@@ -1,4 +1,4 @@
-defmodule PinchflatWeb.Sources.MediaItemTableLive do
+defmodule Pinchflat.Sources.MediaItemTableLive do
use PinchflatWeb, :live_view
use Pinchflat.Media.MediaQuery
--- lib/pinchflat_web/controllers/sources/source_html/show.html.heex
@@ -39,21 +39,21 @@
<:tab title="Pending" id="pending">
<%= live_render(
@conn,
- PinchflatWeb.Sources.MediaItemTableLive,
+ Pinchflat.Sources.MediaItemTableLive,
session: %{"source_id" => @source.id, "media_state" => "pending"}
) %>
</:tab>
<:tab title="Downloaded" id="downloaded">
<%= live_render(
@conn,
- PinchflatWeb.Sources.MediaItemTableLive,
+ Pinchflat.Sources.MediaItemTableLive,
session: %{"source_id" => @source.id, "media_state" => "downloaded"}
) %>
</:tab>
<:tab title="Other" id="other">
<%= live_render(
@conn,
- PinchflatWeb.Sources.MediaItemTableLive,
+ Pinchflat.Sources.MediaItemTableLive,
session: %{"source_id" => @source.id, "media_state" => "other"}
) %>
</:tab>
--- priv/repo/erd.png
Binary files a/priv/repo/erd.png and b/priv/repo/erd.png differ
--- priv/repo/migrations/20241120204407_add_enabled_to_sources.exs
@@ -1,9 +0,0 @@
-defmodule Pinchflat.Repo.Migrations.AddEnabledToSources do
- use Ecto.Migration
-
- def change do
- alter table(:sources) do
- add :enabled, :boolean, default: true, null: false
- end
- end
-end
--- test/pinchflat/fast_indexing/fast_indexing_helpers_test.exs
@@ -1,7 +1,6 @@
defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
use Pinchflat.DataCase
- import Pinchflat.TasksFixtures
import Pinchflat.MediaFixtures
import Pinchflat.SourcesFixtures
import Pinchflat.ProfilesFixtures
@@ -9,7 +8,6 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
alias Pinchflat.Tasks
alias Pinchflat.Settings
alias Pinchflat.Media.MediaItem
- alias Pinchflat.FastIndexing.FastIndexingWorker
alias Pinchflat.Downloading.MediaDownloadWorker
alias Pinchflat.FastIndexing.FastIndexingHelpers
@@ -21,23 +19,6 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
{:ok, [source: source_fixture()]}
end
- describe "kickoff_indexing_task/1" do
- test "deletes any existing fast indexing tasks", %{source: source} do
- {:ok, job} = Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
- task = task_fixture(source_id: source.id, job_id: job.id)
-
- assert Repo.reload!(task)
- assert {:ok, _} = FastIndexingHelpers.kickoff_indexing_task(source)
- assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
- end
-
- test "kicks off a new fast indexing task", %{source: source} do
- assert {:ok, _} = FastIndexingHelpers.kickoff_indexing_task(source)
- assert [worker] = all_enqueued(worker: FastIndexingWorker)
- assert worker.args["id"] == source.id
- end
- end
-
describe "kickoff_download_tasks_from_youtube_rss_feed/1" do
test "enqueues a new worker for each new media_id in the source's RSS feed", %{source: source} do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
--- test/pinchflat/slow_indexing/slow_indexing_helpers_test.exs
@@ -23,36 +23,6 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
assert_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
end
- test "schedules a job for the future based on when the source was last indexed" do
- source = source_fixture(index_frequency_minutes: 30, last_indexed_at: now_minus(5, :minutes))
-
- assert {:ok, _} = SlowIndexingHelpers.kickoff_indexing_task(source)
-
- [job] = all_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
-
- assert_in_delta DateTime.diff(job.scheduled_at, DateTime.utc_now(), :minute), 25, 1
- end
-
- test "schedules a job immediately if the source was indexed far in the past" do
- source = source_fixture(index_frequency_minutes: 30, last_indexed_at: now_minus(60, :minutes))
-
- assert {:ok, _} = SlowIndexingHelpers.kickoff_indexing_task(source)
-
- [job] = all_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
-
- assert_in_delta DateTime.diff(job.scheduled_at, DateTime.utc_now(), :second), 0, 1
- end
-
- test "schedules a job immediately if the source has never been indexed" do
- source = source_fixture(index_frequency_minutes: 30, last_indexed_at: nil)
-
- assert {:ok, _} = SlowIndexingHelpers.kickoff_indexing_task(source)
-
- [job] = all_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
-
- assert_in_delta DateTime.diff(job.scheduled_at, DateTime.utc_now(), :second), 0, 1
- end
-
test "creates and attaches a task" do
source = source_fixture(index_frequency_minutes: 1)
@@ -122,56 +92,6 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
end
end
- describe "delete_indexing_tasks/2" do
- setup do
- source = source_fixture()
-
- {:ok, %{source: source}}
- end
-
- test "deletes slow indexing tasks for the source", %{source: source} do
- {:ok, job} = Oban.insert(MediaCollectionIndexingWorker.new(%{"id" => source.id}))
- _task = task_fixture(source_id: source.id, job_id: job.id)
-
- assert_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
- assert :ok = SlowIndexingHelpers.delete_indexing_tasks(source)
- refute_enqueued(worker: MediaCollectionIndexingWorker)
- end
-
- test "deletes fast indexing tasks for the source", %{source: source} do
- {:ok, job} = Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
- _task = task_fixture(source_id: source.id, job_id: job.id)
-
- assert_enqueued(worker: FastIndexingWorker, args: %{"id" => source.id})
- assert :ok = SlowIndexingHelpers.delete_indexing_tasks(source)
- refute_enqueued(worker: FastIndexingWorker)
- end
-
- test "doesn't normally delete currently executing tasks", %{source: source} do
- {:ok, job} = Oban.insert(MediaCollectionIndexingWorker.new(%{"id" => source.id}))
- task = task_fixture(source_id: source.id, job_id: job.id)
-
- from(Oban.Job, where: [id: ^job.id], update: [set: [state: "executing"]])
- |> Repo.update_all([])
-
- assert Repo.reload!(task)
- assert :ok = SlowIndexingHelpers.delete_indexing_tasks(source)
- assert Repo.reload!(task)
- end
-
- test "can optionally delete currently executing tasks", %{source: source} do
- {:ok, job} = Oban.insert(MediaCollectionIndexingWorker.new(%{"id" => source.id}))
- task = task_fixture(source_id: source.id, job_id: job.id)
-
- from(Oban.Job, where: [id: ^job.id], update: [set: [state: "executing"]])
- |> Repo.update_all([])
-
- assert Repo.reload!(task)
- assert :ok = SlowIndexingHelpers.delete_indexing_tasks(source, include_executing: true)
- assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
- end
- end
-
describe "index_and_enqueue_download_for_media_items/1" do
setup do
stub(YtDlpRunnerMock, :run, fn _url, _opts, _ot, _addl_opts ->
--- test/pinchflat/sources_test.exs
@@ -418,100 +418,6 @@ defmodule Pinchflat.SourcesTest do
assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
end
- test "updates with invalid data returns error changeset" do
- source = source_fixture()
-
- assert {:error, %Ecto.Changeset{}} =
- Sources.update_source(source, @invalid_source_attrs)
-
- assert source == Sources.get_source!(source.id)
- end
-
- test "updating will kickoff a metadata storage worker if the original_url changes" do
- expect(YtDlpRunnerMock, :run, &playlist_mock/4)
- source = source_fixture()
- update_attrs = %{original_url: "https://www.youtube.com/channel/cba321"}
-
- assert {:ok, %Source{} = source} = Sources.update_source(source, update_attrs)
-
- assert_enqueued(worker: SourceMetadataStorageWorker, args: %{"id" => source.id})
- end
-
- test "updating will not kickoff a metadata storage worker other attrs change" do
- source = source_fixture()
- update_attrs = %{name: "some new name"}
-
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
-
- refute_enqueued(worker: SourceMetadataStorageWorker)
- end
- end
-
- describe "update_source/3 when testing media download tasks" do
- test "enabling the download_media attribute will schedule a download task" do
- source = source_fixture(download_media: false)
- media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{download_media: true}
-
- refute_enqueued(worker: MediaDownloadWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
- end
-
- test "disabling the download_media attribute will cancel the download task" do
- source = source_fixture(download_media: true, enabled: true)
- media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{download_media: false}
- DownloadingHelpers.enqueue_pending_download_tasks(source)
-
- assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaDownloadWorker)
- end
-
- test "enabling download_media will not schedule a task if the source is disabled" do
- source = source_fixture(download_media: false, enabled: false)
- _media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{download_media: true}
-
- refute_enqueued(worker: MediaDownloadWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaDownloadWorker)
- end
-
- test "disabling a source will cancel any pending download tasks" do
- source = source_fixture(download_media: true, enabled: true)
- media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{enabled: false}
- DownloadingHelpers.enqueue_pending_download_tasks(source)
-
- assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaDownloadWorker)
- end
-
- test "enabling a source will schedule a download task if download_media is true" do
- source = source_fixture(download_media: true, enabled: false)
- media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{enabled: true}
-
- refute_enqueued(worker: MediaDownloadWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
- end
-
- test "enabling a source will not schedule a download task if download_media is false" do
- source = source_fixture(download_media: false, enabled: false)
- _media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
- update_attrs = %{enabled: true}
-
- refute_enqueued(worker: MediaDownloadWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaDownloadWorker)
- end
- end
-
- describe "update_source/3 when testing slow indexing" do
test "updating the index frequency to >0 will re-schedule the indexing task" do
source = source_fixture()
update_attrs = %{index_frequency_minutes: 123}
@@ -556,47 +462,27 @@ defmodule Pinchflat.SourcesTest do
refute_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
end
- test "disabling a source will delete any pending tasks" do
- source = source_fixture()
- update_attrs = %{enabled: false}
-
- {:ok, job} = Oban.insert(MediaCollectionIndexingWorker.new(%{"id" => source.id}))
- task = task_fixture(source_id: source.id, job_id: job.id)
-
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
-
- assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
- end
-
- test "updating the index frequency will not create a task if the source is disabled" do
- source = source_fixture(enabled: false)
- update_attrs = %{index_frequency_minutes: 123}
-
- refute_enqueued(worker: MediaCollectionIndexingWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaCollectionIndexingWorker)
- end
-
- test "enabling a source will create a task if the index frequency is >0" do
- source = source_fixture(enabled: false, index_frequency_minutes: 123)
- update_attrs = %{enabled: true}
+ test "enabling the download_media attribute will schedule a download task" do
+ source = source_fixture(download_media: false)
+ media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
+ update_attrs = %{download_media: true}
- refute_enqueued(worker: MediaCollectionIndexingWorker)
+ refute_enqueued(worker: MediaDownloadWorker)
assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- assert_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
+ assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
end
- test "enabling a source will not create a task if the index frequency is 0" do
- source = source_fixture(enabled: false, index_frequency_minutes: 0)
- update_attrs = %{enabled: true}
+ test "disabling the download_media attribute will cancel the download task" do
+ source = source_fixture(download_media: true)
+ media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
+ update_attrs = %{download_media: false}
+ DownloadingHelpers.enqueue_pending_download_tasks(source)
- refute_enqueued(worker: MediaCollectionIndexingWorker)
+ assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: MediaCollectionIndexingWorker)
+ refute_enqueued(worker: MediaDownloadWorker)
end
- end
- describe "update_source/3 when testing fast indexing" do
test "enabling fast_index will schedule a fast indexing task" do
source = source_fixture(fast_index: false)
update_attrs = %{fast_index: true}
@@ -617,6 +503,15 @@ defmodule Pinchflat.SourcesTest do
refute_enqueued(worker: FastIndexingWorker)
end
+ test "updates with invalid data returns error changeset" do
+ source = source_fixture()
+
+ assert {:error, %Ecto.Changeset{}} =
+ Sources.update_source(source, @invalid_source_attrs)
+
+ assert source == Sources.get_source!(source.id)
+ end
+
test "fast_index forces the index frequency to be a default value" do
source = source_fixture(%{fast_index: true})
update_attrs = %{index_frequency_minutes: 0}
@@ -635,43 +530,23 @@ defmodule Pinchflat.SourcesTest do
assert source.index_frequency_minutes == 0
end
- test "disabling a source will delete any pending tasks" do
+ test "updating will kickoff a metadata storage worker if the original_url changes" do
+ expect(YtDlpRunnerMock, :run, &playlist_mock/4)
source = source_fixture()
- update_attrs = %{enabled: false}
-
- {:ok, job} = Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
- task = task_fixture(source_id: source.id, job_id: job.id)
-
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
-
- assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
- end
+ update_attrs = %{original_url: "https://www.youtube.com/channel/cba321"}
- test "updating fast indexing will not create a task if the source is disabled" do
- source = source_fixture(enabled: false, fast_index: false)
- update_attrs = %{fast_index: true}
+ assert {:ok, %Source{} = source} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: FastIndexingWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: FastIndexingWorker)
+ assert_enqueued(worker: SourceMetadataStorageWorker, args: %{"id" => source.id})
end
- test "enabling a source will create a task if fast_index is true" do
- source = source_fixture(enabled: false, fast_index: true)
- update_attrs = %{enabled: true}
+ test "updating will not kickoff a metadata storage worker other attrs change" do
+ source = source_fixture()
+ update_attrs = %{name: "some new name"}
- refute_enqueued(worker: FastIndexingWorker)
assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- assert_enqueued(worker: FastIndexingWorker, args: %{"id" => source.id})
- end
-
- test "enabling a source will not create a task if fast_index is false" do
- source = source_fixture(enabled: false, fast_index: false)
- update_attrs = %{enabled: true}
- refute_enqueued(worker: FastIndexingWorker)
- assert {:ok, %Source{}} = Sources.update_source(source, update_attrs)
- refute_enqueued(worker: FastIndexingWorker)
+ refute_enqueued(worker: SourceMetadataStorageWorker)
end
end
--- test/pinchflat/tasks_test.exs
@@ -247,7 +247,7 @@ defmodule Pinchflat.TasksTest do
assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
end
- test "deletion can optionally include executing tasks" do
+ test "deletion can optionall include executing tasks" do
source = source_fixture()
task = task_fixture(source_id: source.id)
--- test/pinchflat_web/controllers/source_controller_test.exs
@@ -34,10 +34,27 @@ defmodule PinchflatWeb.SourceControllerTest do
end
describe "index" do
- # Most of the tests are in `index_table_list_test.exs`
- test "returns 200", %{conn: conn} do
+ test "lists all sources", %{conn: conn} do
+ source = source_fixture()
conn = get(conn, ~p"/sources")
+
assert html_response(conn, 200) =~ "Sources"
+ assert html_response(conn, 200) =~ source.custom_name
+ end
+
+ test "omits sources that have marked_for_deletion_at set", %{conn: conn} do
+ source = source_fixture(marked_for_deletion_at: DateTime.utc_now())
+ conn = get(conn, ~p"/sources")
+
+ refute html_response(conn, 200) =~ source.custom_name
+ end
+
+ test "omits sources who's media profile has marked_for_deletion_at set", %{conn: conn} do
+ media_profile = media_profile_fixture(marked_for_deletion_at: DateTime.utc_now())
+ source = source_fixture(media_profile_id: media_profile.id)
+ conn = get(conn, ~p"/sources")
+
+ refute html_response(conn, 200) =~ source.custom_name
end
end
--- test/pinchflat_web/controllers/sources/index_table_live_test.exs
@@ -1,55 +0,0 @@
-defmodule PinchflatWeb.Sources.IndexTableLiveTest do
- use PinchflatWeb.ConnCase
-
- import Phoenix.LiveViewTest
- import Pinchflat.SourcesFixtures
- import Pinchflat.ProfilesFixtures
-
- alias Pinchflat.Sources.Source
- alias PinchflatWeb.Sources.IndexTableLive
-
- describe "initial rendering" do
- test "lists all sources", %{conn: conn} do
- source = source_fixture()
-
- {:ok, _view, html} = live_isolated(conn, IndexTableLive)
-
- assert html =~ source.custom_name
- end
-
- test "omits sources that have marked_for_deletion_at set", %{conn: conn} do
- source = source_fixture(marked_for_deletion_at: DateTime.utc_now())
-
- {:ok, _view, html} = live_isolated(conn, IndexTableLive)
-
- refute html =~ source.custom_name
- end
-
- test "omits sources who's media profile has marked_for_deletion_at set", %{conn: conn} do
- media_profile = media_profile_fixture(marked_for_deletion_at: DateTime.utc_now())
- source = source_fixture(media_profile_id: media_profile.id)
-
- {:ok, _view, html} = live_isolated(conn, IndexTableLive)
-
- refute html =~ source.custom_name
- end
- end
-
- describe "when a source is enabled or disabled" do
- test "updates the source's enabled status", %{conn: conn} do
- source = source_fixture(enabled: true)
- {:ok, view, _html} = live_isolated(conn, IndexTableLive)
-
- params = %{
- "event" => "toggle_enabled",
- "id" => source.id,
- "value" => "false"
- }
-
- # Send an event to the server directly
- render_change(view, "formless-input", params)
-
- assert %{enabled: false} = Repo.get!(Source, source.id)
- end
- end
-end
--- test/pinchflat_web/controllers/sources/media_item_table_live_test.exs
@@ -6,7 +6,7 @@ defmodule PinchflatWeb.Sources.MediaItemTableLiveTest do
import Pinchflat.SourcesFixtures
import Pinchflat.ProfilesFixtures
- alias PinchflatWeb.Sources.MediaItemTableLive
+ alias Pinchflat.Sources.MediaItemTableLive
setup do
source = source_fixture()
--- test/support/fixtures/sources_fixtures.ex
@@ -20,7 +20,6 @@ defmodule Pinchflat.SourcesFixtures do
Enum.into(
attrs,
%{
- enabled: true,
collection_name: "Source ##{:rand.uniform(1_000_000)}",
collection_id: Base.encode16(:crypto.hash(:md5, "#{:rand.uniform(1_000_000)}")),
collection_type: "channel",
| Repository Name: pinchflat | Owner: kieraneglin | Primary Language: Elixir | Language: Elixir | Stars: 2,779 | Forks: 59 | Description: Your next YouTube media manager | Repository: kieraneglin_pinchflat | type: NEW_FEAT | Comment: Introduce a new functionality |
| Hash: 79c5839f0ad73300a489500ebec501717da8bbcd | Date: 2023-10-02 08:41:39 | Author: bannedbook | commit_message: update | IsMerge: false | Additions: 0 | Deletions: 0 | Total Changes: 0 |
--- README.md
Binary files a/README.md and b/README.md differ
--- game/SStap和Netch免费游戏加速器教程.md
Binary files "a/game/SStap\345\222\214Netch\345\205\215\350\264\271\346\270\270\346\210\217\345\212\240\351\200\237\345\231\250\346\225\231\347\250\213.md" and "b/game/SStap\345\222\214Netch\345\205\215\350\264\271\346\270\270\346\210\217\345\212\240\351\200\237\345\231\250\346\225\231\347\250\213.md" differ
| Repository Name: fanqiang | Owner: bannedbook | Primary Language: Kotlin | Language: Kotlin | Stars: 39,286 | Forks: 7,317 | Description: 翻墙-科学上网 ("bypass the firewall – access the open internet") | Repository: bannedbook_fanqiang | type: CONFIG_CHANGE | Comment: Very small changes |
| Hash: 2e1320736bdcaaa59e5a0846f8d4a0aa7da56d5c | Date: 2025-04-06T09:30:23Z | Author: Chromium WPT Sync | commit_message: Import wpt@e3e6eee31d98d5b5dea1a3159febe139280c10ad https://github.com/web-platform-tests/wpt/compare/2518df1a5...e3e6eee31 Using wpt-import in Chromium 5e7b118d1f59ba417f377db246045f2193c83aa4. With Chromium commits locally applied on WPT: 77ab8bb837 "Reland "Reland "Ignore Letter Spacing in Cursive [...] Note to gardeners: This CL imports external tests and adds expectations for those tests; if this CL is large and causes a few new failures, please fix the failures by adding new lines to TestExpectations rather than reverting. See: https://chromium.googlesource.com/chromium/src/+/main/docs/testing/web_platform_tests.md NOAUTOREVERT=true [email protected] No-Export: true Cq-Include-Trybots: luci.chromium.try:linux-blink-rel Change-Id: I413ccf00f13cae5b63659683b7376af824aa81f9 Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6433257 Bot-Commit: Rubber Stamper <[email protected]> Commit-Queue: Rubber Stamper <[email protected]> Auto-Submit: WPT Autoroller <[email protected]> Cr-Commit-Position: refs/heads/main@{#1443180} | IsMerge: false | Additions: 446 | Deletions: 11 | Total Changes: 457 |
--- third_party/blink/web_tests/TestExpectations
@@ -2607,6 +2607,9 @@ crbug.com/383880384 [ Win ] external/wpt/css/css-properties-values-api/registere
[ Linux ] external/wpt/webdriver/tests/bidi/network/continue_response/cookies.py [ Pass Timeout ]
# ====== New tests from wpt-importer added here ======
+[ Mac14 ] external/wpt/html/cross-origin-opener-policy/iframe-popup-unsafe-none-to-unsafe-none.https.html?7-8 [ Crash ]
+[ Mac12 ] external/wpt/html/semantics/popovers/popover-focus-2.html [ Skip Timeout ]
+[ Mac12 ] virtual/html-anchor-attribute-disabled/external/wpt/html/semantics/popovers/popover-focus-2.html [ Skip Timeout ]
crbug.com/408039461 [ Win11-arm64 ] external/wpt/css/css-text/white-space/control-chars-01D.html [ Failure ]
crbug.com/408034113 [ Win11-arm64 ] external/wpt/service-workers/service-worker/fetch-mixed-content-to-inscope.https.html [ Crash ]
crbug.com/408039452 [ Win11-arm64 ] external/wpt/webaudio/the-audio-api/the-audioworklet-interface/audioworkletnode-constructor-options.https.html [ Timeout ]
--- third_party/blink/web_tests/external/WPT_BASE_MANIFEST_8.json
@@ -381583,19 +381583,19 @@
[]
],
"playback-destroy-persistent-license.js": [
- "8a6cacedb40ad424ffe3a4bd5bb3c6c2c1dfd831",
+ "ce3ede67b654ea3302a9d371b7d11e02d1dacc18",
[]
],
"playback-persistent-license-events.js": [
- "2d99f679f4cf489703a20b105a7e3952e40a3bef",
+ "7a381bb2f9d090424f0bf60d254c056fdac3f7d2",
[]
],
"playback-persistent-license.js": [
- "c7e56e3aeaf30c809ea3563548f6198925e0c2c2",
+ "9b1f9da4cdd997b3ca9f45a4e577090b264352de",
[]
],
"playback-retrieve-persistent-license.js": [
- "83cba34028e92a38b2079bcf78460a4b2fa29e31",
+ "ad9728e01adf66fe07822d39a2631567d5800e61",
[]
],
"playback-temporary-encrypted-clear-segmented-sources.js": [
@@ -398694,7 +398694,13 @@
"quirk-origin-check-expected.txt": [
"a5e4c87b5d4ed66359fe70245681c41f3cf2adec",
[]
- ]
+ ],
+ "resources": {
+ "quirk-stylesheet.css.txt": [
+ "d2ae382583192b0284aa44c038811e7254a9386b",
+ []
+ ]
+ }
}
},
"meta": {
@@ -402673,10 +402679,6 @@
"12efbb6b1e4934de6b32df60304cd40a44548301",
[]
],
- "popover-focus-2-expected.txt": [
- "2121245eb74b4d2a0c404f35d04de04eebb4b817",
- []
- ],
"popover-hidden-display-ref.html": [
"2dc0d558b634279a09703798e9a531748215a563",
[]
@@ -402711,7 +402713,7 @@
[]
],
"popover-utils.js": [
- "544ec843152cedacc6c6ebb5ecfa14f33fd5b2ff",
+ "6ab5a08898d53bbfec92be2664e68897600b035f",
[]
]
}
@@ -636268,6 +636270,13 @@
}
},
"stylesheet": {
+ "quirk-origin-check-positive.html": [
+ "02ab832cbe207c9bbfb1d16cbdc9a685b474372c",
+ [
+ null,
+ {}
+ ]
+ ],
"quirk-origin-check-recursive-import.html": [
"c0053f1f29ad8f23dbebdd41b38dbdd8537e9e27",
[
@@ -646908,7 +646917,27 @@
]
],
"popover-focus-2.html": [
- "6f361698f54e7ceee7aabaa08c5d0f13d190d8d4",
+ "cd03c6a8434f821183ebf64fc2ff2136f387f8b0",
+ [
+ null,
+ {
+ "testdriver": true,
+ "timeout": "long"
+ }
+ ]
+ ],
+ "popover-focus-3.html": [
+ "bcf49cb19e38e4811e5415b937c4b81e28980efa",
+ [
+ null,
+ {
+ "testdriver": true,
+ "timeout": "long"
+ }
+ ]
+ ],
+ "popover-focus-4.html": [
+ "81a2f26e3b3ffc8976fe3214aa70762a834d1f34",
[
null,
{
--- third_party/blink/web_tests/external/wpt/html/links/stylesheet/quirk-origin-check-positive.html
@@ -0,0 +1,18 @@
+<!-- quirks -->
+<title>Same-origin stylesheet with non-CSS MIME type quirk</title>
+<link rel="help" href="https://html.spec.whatwg.org/multipage/#link-type-stylesheet">
+<script src="/resources/testharness.js"></script>
+<script src="/resources/testharnessreport.js"></script>
+<link rel="stylesheet" href="resources/quirk-stylesheet.css.txt">
+<p class="test">This text should be green.</p>
+<script>
+setup({ single_test: true });
+onload = () => {
+ assert_equals(
+ getComputedStyle(document.querySelector('.test')).color,
+ 'rgb(0, 128, 0)',
+ "Same-origin stylesheet with non-CSS MIME type should be applied in quirks mode"
+ );
+ done();
+};
+</script>
\ No newline at end of file
--- third_party/blink/web_tests/external/wpt/html/links/stylesheet/resources/quirk-stylesheet.css.txt
@@ -0,0 +1 @@
+.test { color: green; }
\ No newline at end of file
--- third_party/blink/web_tests/platform/mac-mac14-arm64/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/cast.https.any_npu-expected.txt
@@ -0,0 +1,73 @@
+This is a testharness.js-based test.
+[FAIL] cast float32 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14-arm64/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/dequantizeLinear.https.any_npu-expected.txt
@@ -0,0 +1,17 @@
+This is a testharness.js-based test.
+[FAIL] dequantizeLinear uint4 1D tensor with even input size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 1D tensor with odd input size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 4D constant tensor broadcasting zeroPoint
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 3D input with block_size = [1, 1, 2]
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear int4 1D tensor with even size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear int4 1D tensor with odd size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] per-tensor dequantizeLinear for int4 4D constant
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14-arm64/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/cast.https.any_gpu-expected.txt
@@ -0,0 +1,73 @@
+This is a testharness.js-based test.
+[FAIL] cast float32 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14-arm64/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/dequantizeLinear.https.any_gpu-expected.txt
@@ -0,0 +1,17 @@
+This is a testharness.js-based test.
+[FAIL] dequantizeLinear uint4 1D tensor with even input size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 1D tensor with odd input size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 4D constant tensor broadcasting zeroPoint
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear uint4 3D input with block_size = [1, 1, 2]
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear int4 1D tensor with even size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] dequantizeLinear int4 1D tensor with odd size
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+[FAIL] per-tensor dequantizeLinear for int4 4D constant
+ promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type int4 for argument input, must be one of [int32, uint32, int8, uint8]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14-arm64/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/gru.https.any_gpu-expected.txt
@@ -0,0 +1,4 @@
+This is a testharness.js-based test.
+All subtests passed and are omitted for brevity.
+See https://chromium.googlesource.com/chromium/src/+/HEAD/docs/testing/writing_web_tests.md#Text-Test-Baselines for details.
+Harness: the test ran to completion.
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/cast.https.any_npu-expected.txt
@@ -0,0 +1,77 @@
+This is a testharness.js-based test.
+[FAIL] cast float32 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int8 0D constant tensor to int32
+ assert_true: assert_array_approx_equals_ulp: test cast int32 actual 0 should be close enough to expected 17 by the acceptable 0 ULP distance, but they have 17 ULP distance expected true got false
+[FAIL] cast int8 1D constant tensor to int32
+ assert_true: assert_array_approx_equals_ulp: test cast int32 actual 0 should be close enough to expected 123 by the acceptable 0 ULP distance, but they have 123 ULP distance expected true got false
+[FAIL] cast int8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/dequantizeLinear.https.any_npu-expected.txt
@@ -1,4 +1,6 @@
This is a testharness.js-based test.
+[FAIL] dequantizeLinear int8 4D constant tensor with block_size = [3, 2]
+ assert_true: assert_array_approx_equals_ulp: test dequantizeLinear float32 actual 0 should be close enough to expected -35.00859069824219 by the acceptable 1 ULP distance, but they have 1108084940 ULP distance expected true got false
[FAIL] dequantizeLinear uint4 1D tensor with even input size
promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
[FAIL] dequantizeLinear uint4 1D tensor with odd input size
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/identity.https.any.serviceworker_npu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/identity.https.any_npu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/reduce_product.https.any_npu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-on-npu/external/wpt/webnn/conformance_tests/reduce_sum.https.any_npu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/cast.https.any_gpu-expected.txt
@@ -0,0 +1,77 @@
+This is a testharness.js-based test.
+[FAIL] cast float32 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast float16 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, output 'output' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast uint32 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint32 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int64 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int64 must be one of [float32,float16,int32]."
+[FAIL] cast int8 0D constant tensor to int32
+ assert_true: assert_array_approx_equals_ulp: test cast int32 actual 0 should be close enough to expected 17 by the acceptable 0 ULP distance, but they have 17 ULP distance expected true got false
+[FAIL] cast int8 1D constant tensor to int32
+ assert_true: assert_array_approx_equals_ulp: test cast int32 actual 0 should be close enough to expected 123 by the acceptable 0 ULP distance, but they have 123 ULP distance expected true got false
+[FAIL] cast int8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast int8 4D tensor to uint8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type int8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to float16
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to uint32
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int64
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+[FAIL] cast uint8 4D tensor to int8
+ promise_test: Unhandled rejection with value: object "TypeError: Unsupported data type, input 'input' data type uint8 must be one of [float32,float16,int32]."
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/dequantizeLinear.https.any_gpu-expected.txt
@@ -1,4 +1,6 @@
This is a testharness.js-based test.
+[FAIL] dequantizeLinear int8 4D constant tensor with block_size = [3, 2]
+ assert_true: assert_array_approx_equals_ulp: test dequantizeLinear float32 actual 0 should be close enough to expected -35.00859069824219 by the acceptable 1 ULP distance, but they have 1108084940 ULP distance expected true got false
[FAIL] dequantizeLinear uint4 1D tensor with even input size
promise_test: Unhandled rejection with value: object "TypeError: Failed to execute 'dequantizeLinear' on 'MLGraphBuilder': Unsupported data type uint4 for argument input, must be one of [int32, uint32, int8, uint8]."
[FAIL] dequantizeLinear uint4 1D tensor with odd input size
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/gru.https.any_gpu-expected.txt
@@ -0,0 +1,5 @@
+This is a testharness.js-based test.
+[FAIL] gru float32 tensors steps=2 with options.bias, options.recurrentBias, options.direction='both' and options.returnSequence=true
+ assert_true: assert_array_approx_equals_ulp: test gru float32 actual -2.213313102722168 should be close enough to expected -2.213315725326538 by the acceptable 6 ULP distance, but they have 11 ULP distance expected true got false
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/identity.https.any.serviceworker_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/identity.https.any.sharedworker_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/identity.https.any.worker_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/reduce_max.https.any_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/reduce_mean.https.any_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/reduce_min.https.any_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/mac-mac14/virtual/webnn-service-with-gpu/external/wpt/webnn/conformance_tests/reduce_sum.https.any_gpu-expected.txt
@@ -0,0 +1,3 @@
+This is a testharness.js-based test.
+Harness: the test ran to completion.
+
--- third_party/blink/web_tests/platform/win11-arm64/external/wpt/partitioned-popins/partitioned-popins.partitions.tentative.https.window_include=variant-5-test-expected.txt
@@ -0,0 +1,4 @@
+This is a testharness.js-based test.
+Harness Error. harness_status.status = 1 , harness_status.message = Timeout while running cleanup for test named "Verify Partitioned Popins have access to the proper cookie/storage partitions - Cross-site frame opens alternative-host popin.".
+Harness: the test ran to completion.
+
| chromium | null | C | C | null | null | Browser | _chromium | NEW_FEAT | Large additions with few deletions |
| 2af1733f384f4ae750660c4993b3e2883401b6d1 | 2023-03-23 14:56:50 | schochastics | added book (closes #28) | false | 1 | 0 | 1 |
--- README.md
@@ -60,7 +60,6 @@ chronologically.
Methods](https://uk.sagepub.com/en-gb/eur/the-sage-handbook-of-social-media-research-methods/book272098)
edited by Anabel Quan-Haase and Luke Sloan (2022)
- [Research Handbook on Digital Sociology](https://www.e-elgar.com/shop/gbp/research-handbook-on-digital-sociology-9781789906752.html) edited by Jan Skopek (2023)
-- [Handbook of Computational Social Science for Policy](https://link.springer.com/book/10.1007/978-3-031-16624-2) by Eleonora Bertoni, Matteo Fontana, Lorenzo Gabrielli, Serena Signorelli, Michele Vespe (2023)
## Conferences
| awesome-computational-social-science | gesiscss | R | R | 648 | 83 | A list of awesome resources for Computational Social Science | gesiscss_awesome-computational-social-science | DOC_CHANGE | changes in readme |
| 073eaa9a26f2e45676af6491fcc857a1ea734df1 | 2024-09-18 07:01:14 | 2dust | Bug fix | false | 0 | 1 | 1 |
--- v2rayN/v2rayN.Desktop/Views/SubEditWindow.axaml
@@ -150,6 +150,7 @@
Grid.Row="6"
Grid.Column="1"
Width="200"
+ VerticalAlignment=" "
Classes="Margin8"
ToolTip.Tip="{x:Static resx:ResUI.LvConvertTargetTip}" />
| v2rayn | 2dust | C# | C# | 75,986 | 12,289 | A GUI client for Windows, Linux and macOS, support Xray and sing-box and others | 2dust_v2rayn | BUG_FIX | obvious |
| 488b38dfa466132a70659c96ad82f7ea1cf23b8b | 2024-07-02 09:39:49 | dependabot[bot] |
Build(deps-dev): Bump eslint-plugin-unicorn from 52.0.0 to 54.0.0 (#40565) * Build(deps-dev): Bump eslint-plugin-unicorn from 52.0.0 to 54.0.0
Bumps [eslint-plugin-unicorn](https://github.com/sindresorhus/eslint-plugin-unicorn) from 52.0.0 to 54.0.0.
- [Release notes](https://github.com/sindresorhus/eslint-plugin-unicorn/releases)
- [Commits](https://github.com/sindresorhus/eslint-plugin-unicorn/compare/v52.0.0...v54.0.0)
---
updated-dependencies:
- dependency-name: eslint-plugin-unicorn
dependency-type: direct:development
update-type: version-update:semver-major
...
Signed-off-by: dependabot[bot] <[email protected]>
* Update .eslintrc.json
---------
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: XhmikosR <[email protected]>
| false | 82 | 37 | 119 |
--- .eslintrc.json
@@ -98,9 +98,7 @@
"unicorn/prefer-module": "off",
"unicorn/prefer-query-selector": "off",
"unicorn/prefer-spread": "off",
- "unicorn/prefer-string-raw": "off",
"unicorn/prefer-string-replace-all": "off",
- "unicorn/prefer-structured-clone": "off",
"unicorn/prevent-abbreviations": "off"
},
"overrides": [
--- package-lock.json
@@ -39,7 +39,7 @@
"eslint-plugin-html": "^8.1.1",
"eslint-plugin-import": "^2.29.1",
"eslint-plugin-markdown": "^5.0.0",
- "eslint-plugin-unicorn": "^54.0.0",
+ "eslint-plugin-unicorn": "^52.0.0",
"find-unused-sass-variables": "^6.0.0",
"globby": "^14.0.1",
"hammer-simulator": "0.0.1",
@@ -3271,9 +3271,9 @@
}
},
"node_modules/acorn": {
- "version": "8.12.0",
- "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.12.0.tgz",
- "integrity": "sha512-RTvkC4w+KNXrM39/lWCUaG0IbRkWdCv7W/IOW9oU6SawyxulvkQy5HQPVTKxEjczcUvapcrw3cFx/60VN/NRNw==",
+ "version": "8.11.3",
+ "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.11.3.tgz",
+ "integrity": "sha512-Y9rRfJG5jcKOE0CLisYbojUjIrIEE7AGMzA/Sm4BslANhbS+cDMpgBdcPT91oJ7OuJ9hYJBx59RjbhxVnrF8Xg==",
"dev": true,
"bin": {
"acorn": "bin/acorn"
@@ -5415,17 +5415,17 @@
}
},
"node_modules/eslint-plugin-unicorn": {
- "version": "54.0.0",
- "resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-54.0.0.tgz",
- "integrity": "sha512-XxYLRiYtAWiAjPv6z4JREby1TAE2byBC7wlh0V4vWDCpccOSU1KovWV//jqPXF6bq3WKxqX9rdjoRQ1EhdmNdQ==",
+ "version": "52.0.0",
+ "resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-52.0.0.tgz",
+ "integrity": "sha512-1Yzm7/m+0R4djH0tjDjfVei/ju2w3AzUGjG6q8JnuNIL5xIwsflyCooW5sfBvQp2pMYQFSWWCFONsjCax1EHng==",
"dev": true,
"dependencies": {
- "@babel/helper-validator-identifier": "^7.24.5",
+ "@babel/helper-validator-identifier": "^7.22.20",
"@eslint-community/eslint-utils": "^4.4.0",
- "@eslint/eslintrc": "^3.0.2",
+ "@eslint/eslintrc": "^2.1.4",
"ci-info": "^4.0.0",
"clean-regexp": "^1.0.0",
- "core-js-compat": "^3.37.0",
+ "core-js-compat": "^3.34.0",
"esquery": "^1.5.0",
"indent-string": "^4.0.0",
"is-builtin-module": "^3.2.1",
@@ -5434,11 +5434,11 @@
"read-pkg-up": "^7.0.1",
"regexp-tree": "^0.1.27",
"regjsparser": "^0.10.0",
- "semver": "^7.6.1",
+ "semver": "^7.5.4",
"strip-indent": "^3.0.0"
},
"engines": {
- "node": ">=18.18"
+ "node": ">=16"
},
"funding": {
"url": "https://github.com/sindresorhus/eslint-plugin-unicorn?sponsor=1"
@@ -5447,70 +5447,6 @@
"eslint": ">=8.56.0"
}
},
- "node_modules/eslint-plugin-unicorn/node_modules/@eslint/eslintrc": {
- "version": "3.1.0",
- "resolved": "https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-3.1.0.tgz",
- "integrity": "sha512-4Bfj15dVJdoy3RfZmmo86RK1Fwzn6SstsvK9JS+BaVKqC6QQQQyXekNaC+g+LKNgkQ+2VhGAzm6hO40AhMR3zQ==",
- "dev": true,
- "dependencies": {
- "ajv": "^6.12.4",
- "debug": "^4.3.2",
- "espree": "^10.0.1",
- "globals": "^14.0.0",
- "ignore": "^5.2.0",
- "import-fresh": "^3.2.1",
- "js-yaml": "^4.1.0",
- "minimatch": "^3.1.2",
- "strip-json-comments": "^3.1.1"
- },
- "engines": {
- "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
- },
- "funding": {
- "url": "https://opencollective.com/eslint"
- }
- },
- "node_modules/eslint-plugin-unicorn/node_modules/eslint-visitor-keys": {
- "version": "4.0.0",
- "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-4.0.0.tgz",
- "integrity": "sha512-OtIRv/2GyiF6o/d8K7MYKKbXrOUBIK6SfkIRM4Z0dY3w+LiQ0vy3F57m0Z71bjbyeiWFiHJ8brqnmE6H6/jEuw==",
- "dev": true,
- "engines": {
- "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
- },
- "funding": {
- "url": "https://opencollective.com/eslint"
- }
- },
- "node_modules/eslint-plugin-unicorn/node_modules/espree": {
- "version": "10.1.0",
- "resolved": "https://registry.npmjs.org/espree/-/espree-10.1.0.tgz",
- "integrity": "sha512-M1M6CpiE6ffoigIOWYO9UDP8TMUw9kqb21tf+08IgDYjCsOvCuDt4jQcZmoYxx+w7zlKw9/N0KXfto+I8/FrXA==",
- "dev": true,
- "dependencies": {
- "acorn": "^8.12.0",
- "acorn-jsx": "^5.3.2",
- "eslint-visitor-keys": "^4.0.0"
- },
- "engines": {
- "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
- },
- "funding": {
- "url": "https://opencollective.com/eslint"
- }
- },
- "node_modules/eslint-plugin-unicorn/node_modules/globals": {
- "version": "14.0.0",
- "resolved": "https://registry.npmjs.org/globals/-/globals-14.0.0.tgz",
- "integrity": "sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ==",
- "dev": true,
- "engines": {
- "node": ">=18"
- },
- "funding": {
- "url": "https://github.com/sponsors/sindresorhus"
- }
- },
"node_modules/eslint-plugin-unicorn/node_modules/jsesc": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/jsesc/-/jsesc-3.0.2.tgz",
@@ -5523,11 +5459,26 @@
"node": ">=6"
}
},
+ "node_modules/eslint-plugin-unicorn/node_modules/lru-cache": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz",
+ "integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==",
+ "dev": true,
+ "dependencies": {
+ "yallist": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
"node_modules/eslint-plugin-unicorn/node_modules/semver": {
- "version": "7.6.2",
- "resolved": "https://registry.npmjs.org/semver/-/semver-7.6.2.tgz",
- "integrity": "sha512-FNAIBWCx9qcRhoHcgcJ0gvU7SN1lYU2ZXuSfl04bSC5OpvDHFyJCjdNHomPXxjQlCBU67YW64PzY7/VIEH7F2w==",
+ "version": "7.6.0",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-7.6.0.tgz",
+ "integrity": "sha512-EnwXhrlwXMk9gKu5/flx5sv/an57AkRplG3hTK68W7FRDN+k+OWBj65M7719OkA82XLBxrcX0KSHj+X5COhOVg==",
"dev": true,
+ "dependencies": {
+ "lru-cache": "^6.0.0"
+ },
"bin": {
"semver": "bin/semver.js"
},
@@ -5535,6 +5486,12 @@
"node": ">=10"
}
},
+ "node_modules/eslint-plugin-unicorn/node_modules/yallist": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
+ "integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
+ "dev": true
+ },
"node_modules/eslint-scope": {
"version": "7.2.2",
"resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-7.2.2.tgz",
--- package.json
@@ -123,7 +123,7 @@
"eslint-plugin-html": "^8.1.1",
"eslint-plugin-import": "^2.29.1",
"eslint-plugin-markdown": "^5.0.0",
- "eslint-plugin-unicorn": "^54.0.0",
+ "eslint-plugin-unicorn": "^52.0.0",
"find-unused-sass-variables": "^6.0.0",
"globby": "^14.0.1",
"hammer-simulator": "0.0.1",
|
bootstrap
|
twbs
|
JavaScript
|
JavaScript
| 171,693
| 79,045
|
The most popular HTML, CSS, and JavaScript framework for developing responsive, mobile first projects on the web.
|
twbs_bootstrap
|
BUG_FIX
|
Code change: bug removal
|
6400f2fefb543c22f419b9bde39138a9c86f99b8
|
2022-07-08 18:03:45
|
macro
|
Update UmsMemberServiceImpl.java
| false
| 2
| 2
| 4
|
--- mall-portal/src/main/java/com/macro/mall/portal/service/impl/UmsMemberServiceImpl.java
@@ -1,6 +1,5 @@
package com.macro.mall.portal.service.impl;
-import cn.hutool.core.util.StrUtil;
import com.macro.mall.common.exception.Asserts;
import com.macro.mall.mapper.UmsMemberLevelMapper;
import com.macro.mall.mapper.UmsMemberMapper;
@@ -27,6 +26,7 @@ import org.springframework.security.core.userdetails.UsernameNotFoundException;
import org.springframework.security.crypto.password.PasswordEncoder;
import org.springframework.stereotype.Service;
import org.springframework.util.CollectionUtils;
+import org.springframework.util.StringUtils;
import java.util.Date;
import java.util.List;
@@ -186,7 +186,7 @@ public class UmsMemberServiceImpl implements UmsMemberService {
//对输入的验证码进行校验
private boolean verifyAuthCode(String authCode, String telephone){
- if(StrUtil.isEmpty(authCode)){
+ if(StringUtils.isEmpty(authCode)){
return false;
}
String realAuthCode = memberCacheService.getAuthCode(telephone);
|
mall
|
macrozheng
|
Java
|
Java
| 79,319
| 29,052
|
The mall project is an e-commerce system comprising a storefront and an admin management system, built on Spring Boot + MyBatis and deployed with Docker containers. The storefront includes home portal, product recommendation, product search, product display, shopping cart, order flow, member center, customer service, and help center modules. The admin system includes product management, order management, member management, promotion management, operations management, content management, statistical reports, financial management, permission management, and settings modules.
|
macrozheng_mall
|
BUG_FIX
|
Obvious
|
54bcadf56c1a509f44c78430a5db94e48370b44a
|
2023-08-16 00:05:29
|
Josh Padnick
|
Add Nullstone Implements https://github.com/opentffoundation/manifesto/pull/50
| false
| 5
| 0
| 5
|
--- index.html
@@ -293,11 +293,6 @@
<td>Company</td>
<td>Development; open-source community efforts</td>
</tr>
- <tr>
- <td><a href="https://nullstone.io">Nullstone</a></td>
- <td>Company</td>
- <td>Development; open-source community efforts</td>
- </tr>
<!-- Projects go below here -->
|
manifesto
|
opentofu
|
HTML
|
HTML
| 36,134
| 1,083
|
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source license.
|
opentofu_manifesto
|
DOC_CHANGE
|
Obvious
|
54c6b4e808608596cd4968c505f9b229c7fbcdd8
|
2024-06-10 18:20:48
|
Suoqin Jin
|
Delete man/projectData.Rd
| false
| 0
| 33
| 33
|
--- man/projectData.Rd
@@ -0,0 +1,33 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/utilities.R
+\name{projectData}
+\alias{projectData}
+\title{Project gene expression data onto a protein-protein interaction network}
+\usage{
+projectData(
+ object,
+ adjMatrix,
+ alpha = 0.5,
+ normalizeAdjMatrix = c("rows", "columns")
+)
+}
+\arguments{
+\item{object}{CellChat object}
+
+\item{adjMatrix}{adjacency matrix of protein-protein interaction network to use}
+
+\item{alpha}{numeric in [0,1] alpha = 0: no smoothing; a larger value alpha results in increasing levels of smoothing.}
+
+\item{normalizeAdjMatrix}{how to normalize the adjacency matrix
+possible values are 'rows' (in-degree)
+and 'columns' (out-degree)}
+}
+\value{
+a projected gene expression matrix
+}
+\description{
+A diffusion process is used to smooth genes’ expression values based on their neighbors’ defined in a high-confidence experimentally validated protein-protein network.
+}
+\details{
+This function is useful when analyzing single-cell data with shallow sequencing depth because the projection reduces the dropout effects of signaling genes, in particular for possible zero expression of subunits of ligands/receptors
+}
|
cellchat
|
jinworks
|
R
|
R
| 367
| 61
|
R toolkit for inference, visualization and analysis of cell-cell communication from single-cell and spatially resolved transcriptomics
|
jinworks_cellchat
|
CODE_IMPROVEMENT
|
Looks like a redundant file was removed
|
cee4c244267a8a3234687e29b422c9e2473cedfc
|
2025-02-11 06:27:33
|
Michael de Hoog
|
Migrate from base-org to base (#410)
| false
| 11
| 11
| 22
|
--- .github/workflows/docker.yml
@@ -9,7 +9,7 @@ on:
env:
REGISTRY: ghcr.io
- NAMESPACE: ghcr.io/base
+ NAMESPACE: ghcr.io/base-org
GETH_DEPRECATED_IMAGE_NAME: node
GETH_IMAGE_NAME: node-geth
RETH_IMAGE_NAME: node-reth
--- CONTRIBUTING.md
@@ -50,7 +50,7 @@ be locked to prevent further discussion.
All support requests must be made via [our support team][3].
-[1]: https://github.com/base/node/issues
+[1]: https://github.com/base-org/node/issues
[2]: https://medium.com/brigade-engineering/the-secrets-to-great-commit-messages-106fc0a92a25
[3]: https://support.coinbase.com/customer/en/portal/articles/2288496-how-can-i-contact-coinbase-support-
--- README.md
@@ -8,11 +8,11 @@ This repository contains the relevant Docker builds to run your own node on the
<!-- Badge row 1 - status -->
-[](https://github.com/base/node/graphs/contributors)
-[](https://github.com/base/node/graphs/commit-activity)
-[](https://github.com/base/node/stargazers)
-
-[](https://github.com/base/node/blob/main/LICENSE)
+[](https://github.com/base-org/node/graphs/contributors)
+[](https://github.com/base-org/node/graphs/commit-activity)
+[](https://github.com/base-org/node/stargazers)
+
+[](https://github.com/base-org/node/blob/main/LICENSE)
<!-- Badge row 2 - links and profiles -->
@@ -24,8 +24,8 @@ This repository contains the relevant Docker builds to run your own node on the
<!-- Badge row 3 - detailed status -->
-[](https://github.com/base/node/pulls)
-[](https://github.com/base/node/issues)
+[](https://github.com/base-org/node/pulls)
+[](https://github.com/base-org/node/issues)
### Hardware requirements
@@ -40,7 +40,7 @@ We recommend you have this hardware configuration to run a node:
### Troubleshooting
-If you encounter problems with your node, please open a [GitHub issue](https://github.com/base/node/issues/new/choose) or reach out on our [Discord](https://discord.gg/buildonbase):
+If you encounter problems with your node, please open a [GitHub issue](https://github.com/base-org/node/issues/new/choose) or reach out on our [Discord](https://discord.gg/buildonbase):
- Once you've joined, in the Discord app go to `server menu` > `Linked Roles` > `connect GitHub` and connect your GitHub account so you can gain access to our developer channels
- Report your issue in `#🛟|developer-support` or `🛠|node-operators`
@@ -96,7 +96,7 @@ Note that you'll need to override some of the default configuration that assumes
Example:
```
-docker run --env-file .env.sepolia -e OP_NODE_L2_ENGINE_RPC=ws://localhost:8551 -e OP_NODE_RPC_PORT=7545 ghcr.io/base/node:latest
+docker run --env-file .env.sepolia -e OP_NODE_L2_ENGINE_RPC=ws://localhost:8551 -e OP_NODE_RPC_PORT=7545 ghcr.io/base-org/node:latest
```
### Snapshots
|
node
|
base
|
Shell
|
Shell
| 68,555
| 2,658
|
Everything required to run your own Base node
|
base_node
|
CODE_IMPROVEMENT
|
Code change: repository namespace/URL references updated
|
00ce2a47dc704a5aa27316032322e0e41cc6c03a
|
2023-06-13 23:11:35
|
PowerShell Team Bot
|
Update the cgmanifest (#19792)
| false
| 1
| 1
| 2
|
--- tools/cgmanifest.json
@@ -56,7 +56,7 @@
"Type": "nuget",
"Nuget": {
"Name": "JsonSchema.Net",
- "Version": "4.1.5"
+ "Version": "4.1.2"
}
},
"DevelopmentDependency": false
|
powershell
|
powershell
|
C#
|
C#
| 46,656
| 7,522
|
PowerShell for every system!
|
powershell_powershell
|
CONFIG_CHANGE
|
Very small changes
|
47c83bafc932354acc8eb4b71ff5961a60fb7313
|
2025-03-14 15:27:26
|
Edward Hsing
|
Update README.md
| false
| 8
| 0
| 8
|
--- README.md
@@ -60,11 +60,3 @@ All donations are handled by The Hack Foundation and used strictly for nonprofit
We might introduce more domain options and free hosting in the future to help as many people as possible!
**We can’t wait to see what you build!**
-
----
-
-### 🚨 Abuse Reporting
-We take domain name abuse seriously and are committed to maintaining a safer and more open internet. Every report is carefully reviewed, and response times may vary from a few hours to several days, depending on the complexity of the case.
-
-Email: [email protected]
-Report Form: [Abuse Report Form](https://docs.google.com/forms/d/e/1FAIpQLSdCuhUBFynK4d2YZXptEhV4QHei9-FAk2WhKovrnZRx01lSIQ/viewform)
|
freedomain
|
digitalplatdev
|
HTML
|
HTML
| 41,142
| 933
|
DigitalPlat FreeDomain: Free Domain For Everyone
|
digitalplatdev_freedomain
|
DOC_CHANGE
|
Obvious
|
43e8ecb118d7ef62caf39309c63f3fbdd2ac8cb8
| null |
Cheng Zhao
|
win: screen module is now browser only.
| false
| 1
| 1
| 0
|
--- screen.coffee
@@ -1,5 +1,5 @@
module.exports =
- if process.platform is 'linux' and process.type is 'renderer'
+ if process.platform in ['linux', 'win32'] and process.type is 'renderer'
# On Linux we could not access screen in renderer process.
require('remote').require 'screen'
else
|
electron_electron.json
| null | null | null | null | null | null |
electron_electron.json
|
NEW_FEAT
|
4, new feature: the screen module is now browser-only
|
bcf36ade9f763b875bd781ad26cdb0549349c5f8
| null |
sekky0905
|
Remove sudo setting from travis.yml (#1816)
| false
| 0
| 1
| -1
|
--- .travis.yml
@@ -1,5 +1,4 @@
language: go
-sudo: false
matrix:
fast_finish: true
|
gin-gonic_gin.json
| null | null | null | null | null | null |
gin-gonic_gin.json
|
BUG_FIX
|
3, most probably the sudo setting was causing issues
|
9ea752c8878776d66ebc2debf5cf3448702b5540
|
2022-09-17 10:33:00
|
Collider LI
|
Plugin: Add uninstall and reveal in Finder
| false
| 56
| 9
| 65
|
--- iina/Base.lproj/Localizable.strings
@@ -236,8 +236,6 @@
"alert.assrt_token_prompt.message" = "Please register in the opened webpage. Get your token in your user home page and enter it below:";
"alert.assrt_token_invalid" = "Your Assrt API token seems to be invalid.";
-"alert.plugin_uninstall.title" = "Are you sure to uninstall %@?";
-"alert.plugin_uninstall.message" = "The plugin will be deleted from your disk.";
"alert.duplicated_plugin_id" = "Duplicated plugin identifiers detected. Two or more plugins have the same identifier \"%@\". This may cause abnormal behaviors. Please find them in Preferences - Plugins and report to their authors. If a plugin is installed locally, you can also manually change its identifier in the plugin's Info.json file.";
"playlist.total_length" = "%@ in total";
--- iina/JavascriptPlugin.swift
@@ -249,7 +249,6 @@ class JavascriptPlugin: NSObject {
}
func remove() {
- JavascriptPlugin.plugins.removeAll { $0 == self }
try? FileManager.default.removeItem(at: root)
}
--- iina/PrefPluginViewController.swift
@@ -327,7 +327,7 @@ class PrefPluginViewController: NSViewController, PreferenceWindowEmbeddable {
message = error.localizedDescription
}
DispatchQueue.main.sync {
- Utility.showAlert("plugin.install-error", arguments: [message], sheetWindow: self.view.window!)
+ Utility.showAlert("plugin.install-error", comment: nil, arguments: [message], style: .critical, sheetWindow: self.view.window!)
}
}
}
@@ -337,22 +337,6 @@ class PrefPluginViewController: NSViewController, PreferenceWindowEmbeddable {
view.window!.endSheet(sender.window!)
}
- @IBAction func uninstallPlugin(_ sender: Any) {
- guard let currentPlugin = currentPlugin else { return }
- Utility.quickAskPanel("plugin_uninstall", titleArgs: [currentPlugin.name], sheetWindow: view.window!) { response in
- if response == .alertFirstButtonReturn {
- currentPlugin.remove()
- self.clearPluginPage()
- self.tableView.reloadData()
- }
- }
- }
-
- @IBAction func revealPlugin(_ sender: Any) {
- guard let currentPlugin = currentPlugin else { return }
- NSWorkspace.shared.activateFileViewerSelecting([currentPlugin.root])
- }
-
private func clearPluginPage() {
pluginInfoContentView.isHidden = true
}
--- iina/PrefPluginViewController.xib
@@ -163,9 +163,6 @@
<behavior key="behavior" pushIn="YES" lightByBackground="YES" lightByGray="YES"/>
<font key="font" metaFont="menu" size="11"/>
</buttonCell>
- <connections>
- <action selector="uninstallPlugin:" target="-2" id="49D-wk-Htc"/>
- </connections>
</button>
<tabView drawsBackground="NO" type="noTabsNoBorder" translatesAutoresizingMaskIntoConstraints="NO" id="Cly-9N-1cz">
<rect key="frame" x="0.0" y="0.0" width="539" height="316"/>
@@ -328,7 +325,7 @@
<textField horizontalHuggingPriority="251" verticalHuggingPriority="750" translatesAutoresizingMaskIntoConstraints="NO" id="uPX-dW-Wcu">
<rect key="frame" x="10" y="110" width="29" height="14"/>
<textFieldCell key="cell" lineBreakMode="clipping" title="Help" id="Ka5-V3-g5E">
- <font key="font" metaFont="menu" size="11"/>
+ <font key="font" metaFont="controlContent" size="11"/>
<color key="textColor" name="secondaryLabelColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
@@ -437,22 +434,11 @@
<action selector="tabSwitched:" target="-2" id="kXO-mk-C1a"/>
</connections>
</segmentedControl>
- <button verticalHuggingPriority="750" translatesAutoresizingMaskIntoConstraints="NO" id="sl8-zI-7f5">
- <rect key="frame" x="361" y="368" width="98" height="17"/>
- <buttonCell key="cell" type="roundRect" title="Reveal in Finder" bezelStyle="roundedRect" alignment="center" controlSize="small" state="on" borderStyle="border" imageScaling="proportionallyDown" inset="2" id="drh-6w-3z1">
- <behavior key="behavior" pushIn="YES" lightByBackground="YES" lightByGray="YES"/>
- <font key="font" metaFont="menu" size="11"/>
- </buttonCell>
- <connections>
- <action selector="revealPlugin:" target="-2" id="qhh-x5-Xcs"/>
- </connections>
- </button>
</subviews>
<constraints>
<constraint firstItem="XdS-aQ-bSd" firstAttribute="leading" secondItem="RQj-8e-E8x" secondAttribute="trailing" constant="8" id="0xI-bW-5pv"/>
<constraint firstItem="304-WM-rM6" firstAttribute="centerX" secondItem="dBW-JH-AvA" secondAttribute="centerX" id="2UO-q1-qSN"/>
<constraint firstItem="XdS-aQ-bSd" firstAttribute="firstBaseline" secondItem="RQj-8e-E8x" secondAttribute="firstBaseline" id="3hj-zc-HFP"/>
- <constraint firstItem="sl8-zI-7f5" firstAttribute="firstBaseline" secondItem="kmI-AD-0GL" secondAttribute="firstBaseline" id="5AW-rI-lN4"/>
<constraint firstItem="8U3-yT-6HJ" firstAttribute="top" secondItem="RQj-8e-E8x" secondAttribute="bottom" constant="4" id="8wC-vh-EFn"/>
<constraint firstItem="8U3-yT-6HJ" firstAttribute="leading" secondItem="dBW-JH-AvA" secondAttribute="leading" constant="12" id="CZq-gN-OVO"/>
<constraint firstAttribute="trailing" secondItem="Cly-9N-1cz" secondAttribute="trailing" id="DNg-ch-ZAL"/>
@@ -461,15 +447,13 @@
<constraint firstItem="RQj-8e-E8x" firstAttribute="top" secondItem="dBW-JH-AvA" secondAttribute="top" constant="12" id="Jar-IH-wWd"/>
<constraint firstItem="O3h-Pq-T36" firstAttribute="leading" secondItem="dBW-JH-AvA" secondAttribute="leading" id="KLG-jn-Xia"/>
<constraint firstItem="Cly-9N-1cz" firstAttribute="leading" secondItem="dBW-JH-AvA" secondAttribute="leading" id="NOH-nT-C8U"/>
- <constraint firstItem="sl8-zI-7f5" firstAttribute="leading" relation="greaterThanOrEqual" secondItem="XdS-aQ-bSd" secondAttribute="trailing" constant="8" id="Rmd-wb-HN8"/>
- <constraint firstItem="sl8-zI-7f5" firstAttribute="firstBaseline" secondItem="XdS-aQ-bSd" secondAttribute="firstBaseline" id="Tl2-GZ-nho"/>
+ <constraint firstItem="kmI-AD-0GL" firstAttribute="firstBaseline" secondItem="XdS-aQ-bSd" secondAttribute="firstBaseline" id="Olc-2v-4Yy"/>
<constraint firstAttribute="trailing" secondItem="O3h-Pq-T36" secondAttribute="trailing" id="Wps-be-0tz"/>
<constraint firstAttribute="bottom" secondItem="Cly-9N-1cz" secondAttribute="bottom" id="Z32-Gf-2AU"/>
<constraint firstItem="Cly-9N-1cz" firstAttribute="top" secondItem="O3h-Pq-T36" secondAttribute="bottom" constant="12" id="Znr-F9-cAE"/>
<constraint firstItem="Cly-9N-1cz" firstAttribute="top" secondItem="304-WM-rM6" secondAttribute="bottom" constant="4" id="kkL-mx-xXX"/>
<constraint firstAttribute="trailing" secondItem="8U3-yT-6HJ" secondAttribute="trailing" constant="12" id="qkn-3v-k2e"/>
<constraint firstItem="RQj-8e-E8x" firstAttribute="leading" secondItem="dBW-JH-AvA" secondAttribute="leading" constant="12" id="rKW-Qy-53M"/>
- <constraint firstItem="kmI-AD-0GL" firstAttribute="leading" secondItem="sl8-zI-7f5" secondAttribute="trailing" constant="8" id="yc9-vk-Vt8"/>
</constraints>
</view>
</subviews>
@@ -597,7 +581,7 @@
<textField verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" translatesAutoresizingMaskIntoConstraints="NO" id="sOM-Pv-QX2">
<rect key="frame" x="18" y="225" width="444" height="28"/>
<textFieldCell key="cell" selectable="YES" id="Nmf-79-hFX">
- <font key="font" metaFont="menu" size="11"/>
+ <font key="font" metaFont="controlContent" size="11"/>
<string key="title">Please enter the full URL of the GitHub repository or a string in the "user/repo" format. You can also choose one from the default plug-ins list.</string>
<color key="textColor" name="secondaryLabelColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
@@ -655,7 +639,7 @@ DQ
<tableColumns>
<tableColumn identifier="URL" width="331" minWidth="40" maxWidth="1000" id="V2A-RK-hOY">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border">
- <font key="font" metaFont="menu" size="11"/>
+ <font key="font" metaFont="controlContent" size="11"/>
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="headerColor" catalog="System" colorSpace="catalog"/>
</tableHeaderCell>
@@ -692,7 +676,7 @@ DQ
</tableColumn>
<tableColumn identifier="Installed" width="99" minWidth="40" maxWidth="1000" id="94B-li-kN0">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border">
- <font key="font" metaFont="menu" size="11"/>
+ <font key="font" metaFont="controlContent" size="11"/>
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="headerColor" catalog="System" colorSpace="catalog"/>
</tableHeaderCell>
--- iina/Utility.swift
@@ -91,22 +91,12 @@ class Utility {
- Returns: Whether user dismissed the panel by clicking OK, discardable when using sheet.
*/
@discardableResult
- static func quickAskPanel(_ key: String, titleComment: String? = nil, messageComment: String? = nil, titleArgs: [CVarArg]? = nil, messageArgs: [CVarArg]? = nil, sheetWindow: NSWindow? = nil, callback: ((NSApplication.ModalResponse) -> Void)? = nil) -> Bool {
+ static func quickAskPanel(_ key: String, titleComment: String? = nil, messageComment: String? = nil, sheetWindow: NSWindow? = nil, callback: ((NSApplication.ModalResponse) -> Void)? = nil) -> Bool {
let panel = NSAlert()
let titleKey = "alert." + key + ".title"
let messageKey = "alert." + key + ".message"
- let titleFormat = NSLocalizedString(titleKey, comment: titleComment ?? titleKey)
- let messageFormat = NSLocalizedString(messageKey, comment: messageComment ?? messageKey)
- if let args = titleArgs {
- panel.messageText = String(format: titleFormat, arguments: args)
- } else {
- panel.messageText = titleFormat
- }
- if let args = messageArgs {
- panel.informativeText = String(format: messageFormat, arguments: args)
- } else {
- panel.informativeText = messageFormat
- }
+ panel.messageText = NSLocalizedString(titleKey, comment: titleComment ?? titleKey)
+ panel.informativeText = NSLocalizedString(messageKey, comment: messageComment ?? messageKey)
panel.addButton(withTitle: NSLocalizedString("general.ok", comment: "OK"))
panel.addButton(withTitle: NSLocalizedString("general.cancel", comment: "Cancel"))
--- iina/en.lproj/Localizable.strings
@@ -236,8 +236,6 @@
"alert.assrt_token_prompt.message" = "Please register in the opened webpage. Get your token in your user home page and enter it below:";
"alert.assrt_token_invalid" = "Your Assrt API token seems to be invalid.";
-"alert.plugin_uninstall.title" = "Are you sure to uninstall %@?";
-"alert.plugin_uninstall.message" = "The plugin will be deleted from your disk.";
"alert.duplicated_plugin_id" = "Duplicated plugin identifiers detected. Two or more plugins have the same identifier \"%@\". This may cause abnormal behaviors. Please find them in Preferences - Plugins and report to their authors. If a plugin is installed locally, you can also manually change its identifier in the plugin's Info.json file.";
"playlist.total_length" = "%@ in total";
|
iina
|
iina
|
Swift
|
Swift
| 39,591
| 2,605
|
The modern video player for macOS.
|
iina_iina
|
NEW_FEAT
|
obvious
|
a8a259d066cb48e8341cc64acd1a37a55012270b
|
2023-05-28 15:34:09
|
Evan Luo
|
feat: updated french language localization (#1478) * lang: update french localization
* lang: fix optimization date
* lang: fix line error
* lang: sync with master branch
| false
| 109
| 109
| 218
|
--- Stats/Supporting Files/fr.lproj/Localizable.strings
@@ -26,15 +26,15 @@
"Open Battery settings" = "Ouvrir les paramètres de la batterie";
"Bluetooth" = "Bluetooth";
"Open Bluetooth settings" = "Ouvrir les paramètres Bluetooth";
-"Clock" = "Horloge";
-"Open Clock settings" = "Ouvrir les paramètres de l'horloge";
+"Clock" = "Clock";
+"Open Clock settings" = "Open clock settings";
// Words
"Unknown" = "Inconnu";
"Version" = "Version";
"Processor" = "Processeur";
"Memory" = "Mémoire";
-"Graphics" = "Graphiques";
+"Graphics" = "Graphisme";
"Close" = "Fermer";
"Download" = "Télécharger";
"Install" = "Installer";
@@ -47,7 +47,7 @@
"None" = "Aucun";
"Dots" = "Points";
"Arrows" = "Flèches";
-"Characters" = "Caractères";
+"Characters" = "Lettres";
"Short" = "Court";
"Long" = "Long";
"Statistics" = "Statistiques";
@@ -55,10 +55,10 @@
"Min" = "Min";
"Reset" = "Réinitialiser";
"Alignment" = "Alignement";
-"Left alignment" = "Alignement à gauche";
-"Center alignment" = "Alignement au centre";
-"Right alignment" = "Alignement à droite";
-"Dashboard" = "Tableau de bord";
+"Left alignment" = "Gauche";
+"Center alignment" = "Centre";
+"Right alignment" = "Droite";
+"Dashboard" = "Dashboard";
"Enabled" = "Activé";
"Disabled" = "Désactivé";
"Silent" = "Silencieux";
@@ -67,11 +67,11 @@
"Scaling" = "Échelle";
"Linear" = "Linéaire";
"Square" = "Carré";
-"Cube" = "Cube";
+"Cube" = "Cubique";
"Logarithmic" = "Logarithmique";
"Cores" = "Cœurs";
"Settings" = "Paramètres";
-"Name" = "Nom";
+"Name" = "Name";
"Format" = "Format";
// Setup
@@ -86,61 +86,61 @@
"welcome_message" = "Merci d'utiliser Stats, un moniteur de système macOS gratuit et open source pour votre barre des menus.";
"Start the application automatically when starting your Mac" = "Démarrer l'application automatiquement lors du démarrage de votre Mac";
"Do not start the application automatically when starting your Mac" = "Ne pas démarrer l'application automatiquement lors du démarrage de votre Mac";
-"Do everything silently in the background (recommended)" = "Effectuer toutes les opérations discrètement en arrière-plan (recommandé)";
+"Do everything silently in the background (recommended)" = "Effectuer toutes les opérations en silence en arrière-plan (recommandé)";
"Check for a new version on startup" = "Vérifier s'il existe une nouvelle version au démarrage";
"Check for a new version every day (once a day)" = "Vérifier s'il existe une nouvelle version tous les jours (une fois par jour)";
"Check for a new version every week (once a week)" = "Vérifier s'il existe une nouvelle version toutes les semaines (une fois par semaine)";
"Check for a new version every month (once a month)" = "Vérifier s'il existe une nouvelle version tous les mois (une fois par mois)";
"Never check for updates (not recommended)" = "Ne jamais vérifier les mises à jour (non recommandé)";
"The configuration is completed" = "La configuration est terminée";
-"finish_setup_message" = "Tout est prêt ! \n Stats est un outil open source, gratuit et le restera toujours. \n Si vous appréciez ce projet, vous pouvez le soutenir, cela sera toujours apprécié !";
+"finish_setup_message" = "Tout est prêt! \n Stats est un outil open source, il est gratuit et le restera toujours. \n Si vous l'appréciez, vous pouvez soutenir le projet, c'est toujours apprécié!";
// Alerts
"New version available" = "Nouvelle version disponible";
"Click to install the new version of Stats" = "Cliquez pour installer la nouvelle version de Stats";
-"Successfully updated" = "Mise à jour effectuée avec succès";
-"Stats was updated to v" = "Stats a été mis à jour en version %0";
-"Reset settings text" = "Tous les paramètres de l'application seront réinitialisés et l'application redémarrera. Êtes-vous sûr de vouloir continuer ?";
+"Successfully updated" = "Mise à jour terminée avec succès";
+"Stats was updated to v" = "Stats a été mis à jour en v%0";
+"Reset settings text" = "Tous les paramètres de l'application seront réinitialisés et l'application sera redémarrée. Êtes-vous sûr de vouloir continuer ?";
// Settings
-"Open Activity Monitor" = "Ouvrir le Moniteur d'activité";
+"Open Activity Monitor" = "Ouvrir le moniteur d'activité";
"Report a bug" = "Signaler un bug";
"Support the application" = "Soutenir l'application";
"Close application" = "Fermer l'application";
"Open application settings" = "Ouvrir les paramètres de l'application";
-"Open dashboard" = "Ouvrir le tableau de bord";
+"Open dashboard" = "Ouvrir la dashboard";
// Application settings
"Update application" = "Mettre à jour l'application";
-"Check for updates" = "Vérifier les mises à jour";
+"Check for updates" = "Rechercher des mises à jour";
"At start" = "Au démarrage";
"Once per day" = "Une fois par jour";
"Once per week" = "Une fois par semaine";
"Once per month" = "Une fois par mois";
"Never" = "Jamais";
-"Check for update" = "Vérifier les mises à jour";
-"Show icon in dock" = "Afficher l'icône dans le Dock";
-"Start at login" = "Démarrer à la connexion";
-"Build number" = "Numéro de version";
+"Check for update" = "Vérifier la mise à jour";
+"Show icon in dock" = "Afficher l'icône dans le dock";
+"Start at login" = "Démarrer au lancement";
+"Build number" = "numéro de build";
"Reset settings" = "Réinitialiser les paramètres";
-"Pause the Stats" = "Mettre Stats en pause";
+"Pause the Stats" = "Mettre en pause Stats";
"Resume the Stats" = "Reprendre Stats";
"Combined modules" = "Modules combinés";
"Spacing" = "Espacement";
// Dashboard
"Serial number" = "Numéro de série";
-"Uptime" = "Temps de fonctionnement";
-"Number of cores" = "Nombre de cœurs : %0";
-"Number of threads" = "Nombre de threads : %0";
-"Number of e-cores" = "Nombre de cœurs d'efficacité énergétique : %0";
-"Number of p-cores" = "Nombre de cœurs de performance : %0";
+"Uptime" = "Disponibilité";
+"Number of cores" = "%0 cœurs";
+"Number of threads" = "%0 threads";
+"Number of e-cores" = "%0 cœurs à haute efficacité énergétique";
+"Number of p-cores" = "%0 cœurs de performance";
// Update
-"The latest version of Stats installed" = "La dernière version de Stats est installée";
-"Downloading..." = "Téléchargement en cours...";
-"Current version: " = "Version actuelle : ";
-"Latest version: " = "Dernière version : ";
+"The latest version of Stats installed" = "La dernière version de Stats installée";
+"Downloading..." = "Téléchargement...";
+"Current version: " = "Version actuelle: ";
+"Latest version: " = "Dernière version: ";
// Widgets
"Color" = "Couleur";
@@ -150,11 +150,11 @@
"Value" = "Valeur";
"Colorize" = "Coloriser";
"Colorize value" = "Coloriser la valeur";
-"Additional information" = "Informations supplémentaires";
+"Additional information" = "Informations complémentaires";
"Reverse values order" = "Inverser l'ordre des valeurs";
"Base" = "Base";
"Display mode" = "Mode d'affichage";
-"One row" = "Une seule ligne";
+"One row" = "Une ligne";
"Two rows" = "Deux lignes";
"Mini widget" = "Mini";
"Line chart widget" = "Graphique en ligne";
@@ -169,11 +169,11 @@
"Tachometer widget" = "Tachometer";
"State widget" = "Widget d'état";
"Show symbols" = "Afficher les symboles";
-"Label widget" = "Widget d'étiquette";
+"Label widget" = "Libellé";
"Number of reads in the chart" = "Nombre de lectures sur le graphique";
-"Color of download" = "Couleur du téléchargement";
-"Color of upload" = "Couleur du téléversement";
-"Monospaced font" = "Police à espacement fixe";
+"Color of download" = "Couleur de téléchargement";
+"Color of upload" = "Couleur de téléversement";
+"Monospaced font" = "Monospaced font";
// Module Kit
"Open module settings" = "Ouvrir les paramètres du module";
@@ -184,12 +184,12 @@
"Details" = "Détails";
"Top processes" = "Processus principaux";
"Pictogram" = "Pictogramme";
-"Module settings" = "Paramètres du module";
-"Widget settings" = "Paramètres du widget";
+"Module settings" = "Paramètres du Module";
+"Widget settings" = "Paramètres du Widget";
"Popup settings" = "Paramètres de la fenêtre contextuelle";
-"Merge widgets" = "Fusionner les widgets";
+"Merge widgets" = "Combiner les widgets";
"No available widgets to configure" = "Aucun widget disponible à configurer";
-"No options to configure for the popup in this module" = "Aucune option à configurer pour la fenêtre contextuelle dans ce module";
+"No options to configure for the popup in this module" = "Il n'y a pas d'options à configurer pour la fenêtre contextuelle dans ce module";
// Modules
"Number of top processes" = "Nombre de processus principaux";
@@ -198,30 +198,30 @@
"Chart color" = "Couleur du graphique";
// CPU
-"CPU usage" = "Utilisation du CPU";
+"CPU usage" = "Usage CPU";
"CPU temperature" = "Température du CPU";
"CPU frequency" = "Fréquence du CPU";
"System" = "Système";
"User" = "Utilisateur";
-"Idle" = "Inactif";
-"Show usage per core" = "Afficher l'utilisation par cœur";
-"Show hyper-threading cores" = "Afficher les cœurs avec hyper-threading";
+"Idle" = "Libre";
+"Show usage per core" = "Afficher l'utilisation par coeur";
+"Show hyper-threading cores" = "Afficher les coeurs hyper-threading";
"Split the value (System/User)" = "Diviser la valeur (Système/Utilisateur)";
-"Scheduler limit" = "Limite de planification";
+"Scheduler limit" = "Limite du planificateur";
"Speed limit" = "Limite de vitesse";
"Average load" = "Charge moyenne";
"1 minute" = "1 minute";
"5 minutes" = "5 minutes";
"15 minutes" = "15 minutes";
"CPU usage threshold" = "Seuil d'utilisation du CPU";
-"CPU usage is" = "L'utilisation du CPU est de %0";
-"Efficiency cores" = "Cœurs d'efficacité énergétique";
+"CPU usage is" = "La usage CPU est de %0";
+"Efficiency cores" = "Cœurs à haute efficacité énergétique";
"Performance cores" = "Cœurs de performance";
"System color" = "Couleur du système";
"User color" = "Couleur de l'utilisateur";
-"Idle color" = "Couleur de l'inactivité";
-"Cluster grouping" = "Regroupement par cluster";
-"Efficiency cores color" = "Couleur des cœurs d'efficacité énergétique";
+"Idle color" = "Couleur d'inactivité";
+"Cluster grouping" = "Regrouper en fonction des clusters";
+"Efficiency cores color" = "Couleur des cœurs à haute efficacité énergétique";
"Performance cores color" = "Couleur des cœurs de performance";
// GPU
@@ -237,8 +237,8 @@
"Active" = "Actif";
"Non active" = "Inactif";
"Fan speed" = "Vitesse du ventilateur";
-"Core clock" = "Fréquence du cœur";
-"Memory clock" = "Fréquence de la mémoire";
+"Core clock" = "Fréquence CPU";
+"Memory clock" = "Fréquence RAM";
"Utilization" = "Utilisation";
"Render utilization" = "Utilisation du rendu";
"Tiler utilization" = "Utilisation du tiler";
@@ -251,85 +251,85 @@
"Total" = "Total";
"Used" = "Utilisée";
"App" = "Application";
-"Wired" = "Câblée";
+"Wired" = "Réservée";
"Compressed" = "Compressée";
-"Free" = "Inactive";
+"Free" = "Libre";
"Swap" = "Swap";
-"Split the value (App/Wired/Compressed)" = "Diviser la valeur (Application/Câblée/Compressée)";
-"RAM utilization threshold" = "Seuil d'utilisation de la RAM";
+"Split the value (App/Wired/Compressed)" = "Diviser la valeur (Application/Réservée/Compressée)";
+"RAM utilization threshold" = "Seuil d'utilisation du RAM";
"RAM utilization is" = "L'utilisation de la RAM est de %0";
"App color" = "Couleur des applications";
"Wired color" = "Couleur de la mémoire câblée";
"Compressed color" = "Couleur de la mémoire compressée";
-"Free color" = "Couleur de la mémoire inactive";
+"Free color" = "Couleur de l'inactivité";
// Disk
"Show removable disks" = "Afficher les disques amovibles";
-"Used disk memory" = "Espace disque utilisé : %0 sur %1";
-"Free disk memory" = "Espace disque libre : %0 sur %1";
+"Used disk memory" = "%0 utilisé sur %1";
+"Free disk memory" = "%0 libre sur %1";
"Disk to show" = "Disque à afficher";
"Open disk" = "Ouvrir le disque";
"Switch view" = "Changer de vue";
-"Disk utilization threshold" = "Seuil d'utilisation du disque";
+"Disk utilization threshold" = "Seuil d'utilisation de disque";
"Disk utilization is" = "L'utilisation du disque est de %0";
-"Read color" = "Couleur de lecture";
-"Write color" = "Couleur d'écriture";
-"Disk usage" = "Utilisation du disque";
+"Read color" = "Read color";
+"Write color" = "Write color";
+"Disk usage" = "Disk usage";
// Sensors
"Temperature unit" = "Unité de température";
"Celsius" = "Celsius";
"Fahrenheit" = "Fahrenheit";
-"Save the fan speed" = "Enregistrer la vitesse du ventilateur";
+"Save the fan speed" = "Sauvegarder la vitesse des ventilateurs";
"Fan" = "Ventilateur";
"HID sensors" = "Capteurs HID";
"Synchronize fan's control" = "Synchroniser le contrôle des ventilateurs";
"Current" = "Courant";
"Energy" = "Énergie";
"Show unknown sensors" = "Afficher les capteurs inconnus";
-"Install fan helper" = "Installer l'assistant des ventilateurs";
-"Uninstall fan helper" = "Désinstaller l'assistant des ventilateurs";
-"Fan value" = "Vitesse du ventilateur";
+"Install fan helper" = "Installer l'assistant de ventilateur";
+"Uninstall fan helper" = "Désinstaller l'aide de ventilateur";
+"Fan value" = "La valeur du ventilateur";
// Network
-"Uploading" = "Envoi";
-"Downloading" = "Téléchargement";
+"Uploading" = "Upload";
+"Downloading" = "Download";
"Public IP" = "IP publique";
"Local IP" = "IP locale";
"Interface" = "Interface";
"Physical address" = "Adresse physique";
-"Refresh" = "Actualiser";
+"Refresh" = "Rafraîchir";
"Click to copy public IP address" = "Cliquez pour copier l'adresse IP publique";
"Click to copy local IP address" = "Cliquez pour copier l'adresse IP locale";
-"Click to copy wifi name" = "Cliquez pour copier le nom du réseau Wi-Fi";
-"Click to copy mac address" = "Cliquez pour copier l'adresse MAC";
+"Click to copy wifi name" = "Cliquez pour copier le nom du wifi";
+"Click to copy mac address" = "Cliquez pour copier l'adresse mac";
"No connection" = "Aucune connexion";
"Network interface" = "Interface réseau";
-"Total download" = "Total des téléchargements";
-"Total upload" = "Total des envois";
+"Total download" = "Total download";
+"Total upload" = "Total upload";
"Reader type" = "Type de lecteur";
-"Interface based" = "Basé sur l'interface";
-"Processes based" = "Basé sur les processus";
+"Interface based" = "Interface";
+"Processes based" = "Processus";
"Reset data usage" = "Réinitialiser l'utilisation des données";
"VPN mode" = "Mode VPN";
"Standard" = "Standard";
"Security" = "Sécurité";
"Channel" = "Canal";
"Common scale" = "Échelle commune";
-"Autodetection" = "Détection automatique";
+"Autodetection" = "Autodétection";
"Widget activation threshold" = "Seuil d'activation du widget";
-"Internet connection" = "Connexion Internet";
-"Active state color" = "Couleur de l'état actif";
-"Nonactive state color" = "Couleur de l'état inactif";
-"Connectivity host (ICMP)" = "Hôte de connectivité (ICMP)";
-"Leave empty to disable the check" = "Laissez vide pour désactiver la vérification";
+"Internet connection" = "Connexion internet";
+"Active state color" = "Color d'état actif";
+"Nonactive state color" = "Couleur d'état non actif";
+"Connectivity host (ICMP)" = "Connectivity host (ICMP)";
+"Leave empty to disable the check" = "Laissez vide pour désactiver la vérification.";
"Transparent pictogram when no activity" = "Pictogramme transparent en l'absence d'activité";
-"Connectivity history" = "Historique de la connectivité";
-"Auto-refresh public IP address" = "Actualiser automatiquement l'adresse IP publique";
-"Every hour" = "Toutes les heures";
-"Every 12 hours" = "Toutes les 12 heures";
-"Every 24 hours" = "Toutes les 24 heures";
-"Network activity" = "L'activité réseau";
+"Connectivity history" = "Connectivity history";
+"Auto-refresh public IP address" = "Auto-refresh public IP address";
+"Every hour" = "Every hour";
+"Every 12 hours" = "Every 12 hours";
+"Every 24 hours" = "Every 24 hours";
+"Network activity" = "Network activity";
// Battery
"Level" = "Niveau";
@@ -346,16 +346,16 @@
"Power" = "Courant";
"Is charging" = "En charge";
"Time to discharge" = "Temps restant";
-"Time to charge" = "Temps de charge";
-"Calculating" = "Calcul en cours";
+"Time to charge" = "Temps pour charger";
+"Calculating" = "Calcul";
"Fully charged" = "Complètement chargée";
"Not connected" = "Non connecté";
"Low level notification" = "Notification de niveau faible";
-"High level notification" = "Notification de niveau élevé";
+"High level notification" = "Notification de haut niveau";
"Low battery" = "Batterie faible";
-"High battery" = "Batterie élevée";
+"High battery" = "Batterie haute";
"Battery remaining" = "%0% restant";
-"Battery remaining to full charge" = "%0% jusqu'à charge complète";
+"Battery remaining to full charge" = "%0% pour charger complètement";
"Percentage" = "Pourcentage";
"Percentage and time" = "Pourcentage et temps";
"Time and percentage" = "Temps et pourcentage";
@@ -363,31 +363,31 @@
"Hide additional information when full" = "Masquer les informations supplémentaires lorsque la batterie est pleine";
"Last charge" = "Dernière charge";
"Capacity" = "Capacité";
-"current / maximum / designed" = "actuelle / maximale / prévue";
-"Low power mode" = "Mode d'économie d'énergie";
-"Percentage inside the icon" = "Pourcentage à l'intérieur de l'icône";
-"Colorize battery" = "Colorer la batterie";
+"current / maximum / designed" = "actuel / maximum / prévu";
+"Low power mode" = "Mode d’économie d’énergie";
+"Percentage inside the icon" = "Pourcentage dans l'icône";
+"Colorize battery" = "Colorize battery";
// Bluetooth
-"Battery to show" = "Batterie à afficher";
-"No Bluetooth devices are available" = "Aucun appareil Bluetooth n'est disponible";
+"Battery to show" = "Batterie à montrer";
+"No Bluetooth devices are available" = "Aucun appareil Bluetooth n’est disponible";
// Clock
-"Time zone" = "Fuseau horaire";
+"Time zone" = "Time zone";
"Local" = "Local";
// Colors
"Based on utilization" = "Basé sur l'utilisation";
"Based on pressure" = "Basé sur la pression";
-"Based on cluster" = "Basé sur le cluster";
-"System accent" = "Couleur d'accentuation du système";
-"Monochrome accent" = "Accentuation monochrome";
+"Based on cluster" = "Based on cluster";
+"System accent" = "Couleur système";
+"Monochrome accent" = "Monochrome";
"Clear" = "Transparent";
"White" = "Blanc";
"Black" = "Noir";
"Gray" = "Gris";
"Second gray" = "Deuxième gris";
-"Dark gray" = "Gris foncé";
+"Dark gray" = "Gris sombre";
"Light gray" = "Gris clair";
"Red" = "Rouge";
"Second red" = "Deuxième rouge";
@@ -400,9 +400,9 @@
"Orange" = "Orange";
"Second orange" = "Deuxième orange";
"Purple" = "Violet";
-"Second purple" = "Deuxième violet";
-"Brown" = "Brun";
-"Second brown" = "Deuxième brun";
+"Second purple" = "Deuxième puVioletrple";
+"Brown" = "Marron";
+"Second brown" = "Deuxième marron";
"Cyan" = "Cyan";
"Magenta" = "Magenta";
"Pink" = "Rose";
|
stats
|
exelban
|
Swift
|
Swift
| 29,655
| 950
|
macOS system monitor in your menu bar
|
exelban_stats
|
NEW_FEAT
|
added some translations
|
c22c45e6af516b1a37c425577043cc3f03db0223
|
2024-05-12 00:57:45
|
Kittisak Phetrungnapha
|
Fix testThatDatesCanBeEncodedAsFormatted test fails due to current calendar mismatches with expected result (#3858) ### Goals :soccer:
Currently, the test function `testThatDatesCanBeEncodedAsFormatted` fails
when run on a simulator that does not use the Gregorian calendar, because
the assertion expects a date in Gregorian format. The goal is for the
tests to pass under every simulator calendar setting.
### Implementation Details :construction:
Force the `DateFormatter` in the test function
`testThatDatesCanBeEncodedAsFormatted` to use the Gregorian calendar so
that the tests pass under every simulator calendar setting.
| false
| 1
| 0
| 1
|
--- Tests/ParameterEncoderTests.swift
@@ -730,7 +730,6 @@ final class URLEncodedFormEncoderTests: BaseTestCase {
let dateFormatter = DateFormatter()
dateFormatter.dateFormat = "yyyy-MM-dd HH:mm:ss.SSSS"
dateFormatter.timeZone = TimeZone(secondsFromGMT: 0)
- dateFormatter.calendar = Calendar(identifier: .gregorian)
let encoder = URLEncodedFormEncoder(dateEncoding: .formatted(dateFormatter))
let parameters = ["date": Date(timeIntervalSinceReferenceDate: 123.456)]
|
alamofire
|
alamofire
|
Swift
|
Swift
| 41,720
| 7,598
|
Elegant HTTP Networking in Swift
|
alamofire_alamofire
|
BUG_FIX
|
obvious
|
2a9f4c04e54294b668e0a2ae11c1930c2e57b248
|
2024-11-23 03:49:20
|
Jordan Brown
|
[compiler] Infer deps configuration (#31616) Adds a way to configure how we insert deps for experimental purposes.
```
[
{
module: 'react',
imported: 'useEffect',
numRequiredArgs: 1,
},
{
module: 'MyExperimentalEffectHooks',
imported: 'useExperimentalEffect',
numRequiredArgs: 2,
},
]
```
would insert dependencies for calls of `useEffect` imported from `react`
if they have 1 argument and calls of `useExperimentalEffect` from
`MyExperimentalEffectHooks` if they have 2 arguments. The pushed dep
array is appended to the arg list.
| false
| 168
| 22
| 190
|
--- compiler/packages/babel-plugin-react-compiler/src/Entrypoint/Pipeline.ts
@@ -356,7 +356,7 @@ function* runWithEnvironment(
});
if (env.config.inferEffectDependencies) {
- inferEffectDependencies(hir);
+ inferEffectDependencies(env, hir);
}
if (env.config.inlineJsxTransform) {
--- compiler/packages/babel-plugin-react-compiler/src/HIR/Environment.ts
@@ -242,40 +242,9 @@ const EnvironmentConfigSchema = z.object({
enableOptionalDependencies: z.boolean().default(true),
/**
- * Enables inference and auto-insertion of effect dependencies. Takes in an array of
- * configurable module and import pairs to allow for user-land experimentation. For example,
- * [
- * {
- * module: 'react',
- * imported: 'useEffect',
- * numRequiredArgs: 1,
- * },{
- * module: 'MyExperimentalEffectHooks',
- * imported: 'useExperimentalEffect',
- * numRequiredArgs: 2,
- * },
- * ]
- * would insert dependencies for calls of `useEffect` imported from `react` and calls of
- * useExperimentalEffect` from `MyExperimentalEffectHooks`.
- *
- * `numRequiredArgs` tells the compiler the amount of arguments required to append a dependency
- * array to the end of the call. With the configuration above, we'd insert dependencies for
- * `useEffect` if it is only given a single argument and it would be appended to the argument list.
- *
- * numRequiredArgs must always be greater than 0, otherwise there is no function to analyze for dependencies
- *
- * Still experimental.
+ * Enables inference and auto-insertion of effect dependencies. Still experimental.
*/
- inferEffectDependencies: z
- .nullable(
- z.array(
- z.object({
- function: ExternalFunctionSchema,
- numRequiredArgs: z.number(),
- }),
- ),
- )
- .default(null),
+ inferEffectDependencies: z.boolean().default(false),
/**
* Enables inlining ReactElement object literals in place of JSX
@@ -645,22 +614,6 @@ const testComplexConfigDefaults: PartialEnvironmentConfig = {
source: 'react-compiler-runtime',
importSpecifierName: 'useContext_withSelector',
},
- inferEffectDependencies: [
- {
- function: {
- source: 'react',
- importSpecifierName: 'useEffect',
- },
- numRequiredArgs: 1,
- },
- {
- function: {
- source: 'shared-runtime',
- importSpecifierName: 'useSpecialEffect',
- },
- numRequiredArgs: 2,
- },
- ],
};
/**
--- compiler/packages/babel-plugin-react-compiler/src/Inference/InferEffectDependencies.ts
@@ -8,6 +8,7 @@ import {
HIRFunction,
IdentifierId,
Instruction,
+ isUseEffectHookType,
makeInstructionId,
TInstruction,
InstructionId,
@@ -22,33 +23,20 @@ import {
markInstructionIds,
} from '../HIR/HIRBuilder';
import {eachInstructionOperand, eachTerminalOperand} from '../HIR/visitors';
-import {getOrInsertWith} from '../Utils/utils';
/**
* Infers reactive dependencies captured by useEffect lambdas and adds them as
* a second argument to the useEffect call if no dependency array is provided.
*/
-export function inferEffectDependencies(fn: HIRFunction): void {
+export function inferEffectDependencies(
+ env: Environment,
+ fn: HIRFunction,
+): void {
let hasRewrite = false;
const fnExpressions = new Map<
IdentifierId,
TInstruction<FunctionExpression>
>();
-
- const autodepFnConfigs = new Map<string, Map<string, number>>();
- for (const effectTarget of fn.env.config.inferEffectDependencies!) {
- const moduleTargets = getOrInsertWith(
- autodepFnConfigs,
- effectTarget.function.source,
- () => new Map<string, number>(),
- );
- moduleTargets.set(
- effectTarget.function.importSpecifierName,
- effectTarget.numRequiredArgs,
- );
- }
- const autodepFnLoads = new Map<IdentifierId, number>();
-
const scopeInfos = new Map<
ScopeId,
{pruned: boolean; deps: ReactiveScopeDependencies; hasSingleInstr: boolean}
@@ -86,23 +74,15 @@ export function inferEffectDependencies(fn: HIRFunction): void {
lvalue.identifier.id,
instr as TInstruction<FunctionExpression>,
);
- } else if (
- value.kind === 'LoadGlobal' &&
- value.binding.kind === 'ImportSpecifier'
- ) {
- const moduleTargets = autodepFnConfigs.get(value.binding.module);
- if (moduleTargets != null) {
- const numRequiredArgs = moduleTargets.get(value.binding.imported);
- if (numRequiredArgs != null) {
- autodepFnLoads.set(lvalue.identifier.id, numRequiredArgs);
- }
- }
} else if (
/*
- * TODO: Handle method calls
+ * This check is not final. Right now we only look for useEffects without a dependency array.
+ * This is likely not how we will ship this feature, but it is good enough for us to make progress
+ * on the implementation and test it.
*/
value.kind === 'CallExpression' &&
- autodepFnLoads.get(value.callee.identifier.id) === value.args.length &&
+ isUseEffectHookType(value.callee.identifier) &&
+ value.args.length === 1 &&
value.args[0].kind === 'Identifier'
) {
const fnExpr = fnExpressions.get(value.args[0].identifier.id);
@@ -152,7 +132,7 @@ export function inferEffectDependencies(fn: HIRFunction): void {
loc: GeneratedSource,
};
- const depsPlace = createTemporaryPlace(fn.env, GeneratedSource);
+ const depsPlace = createTemporaryPlace(env, GeneratedSource);
depsPlace.effect = Effect.Read;
newInstructions.push({
@@ -162,8 +142,8 @@ export function inferEffectDependencies(fn: HIRFunction): void {
value: deps,
});
- // Step 2: push the inferred deps array as an argument of the useEffect
- value.args.push({...depsPlace, effect: Effect.Freeze});
+ // Step 2: insert the deps array as an argument of the useEffect
+ value.args[1] = {...depsPlace, effect: Effect.Freeze};
rewriteInstrs.set(instr.id, newInstructions);
}
}
--- compiler/packages/babel-plugin-react-compiler/src/__tests__/fixtures/compiler/infer-deps-custom-config.expect.md
@@ -1,61 +0,0 @@
-
-## Input
-
-```javascript
-// @inferEffectDependencies
-import {print, useSpecialEffect} from 'shared-runtime';
-
-function CustomConfig({propVal}) {
- // Insertion
- useSpecialEffect(() => print(propVal), [propVal]);
- // No insertion
- useSpecialEffect(() => print(propVal), [propVal], [propVal]);
-}
-
-```
-
-## Code
-
-```javascript
-import { c as _c } from "react/compiler-runtime"; // @inferEffectDependencies
-import { print, useSpecialEffect } from "shared-runtime";
-
-function CustomConfig(t0) {
- const $ = _c(7);
- const { propVal } = t0;
- let t1;
- let t2;
- if ($[0] !== propVal) {
- t1 = () => print(propVal);
- t2 = [propVal];
- $[0] = propVal;
- $[1] = t1;
- $[2] = t2;
- } else {
- t1 = $[1];
- t2 = $[2];
- }
- useSpecialEffect(t1, t2, [propVal]);
- let t3;
- let t4;
- let t5;
- if ($[3] !== propVal) {
- t3 = () => print(propVal);
- t4 = [propVal];
- t5 = [propVal];
- $[3] = propVal;
- $[4] = t3;
- $[5] = t4;
- $[6] = t5;
- } else {
- t3 = $[4];
- t4 = $[5];
- t5 = $[6];
- }
- useSpecialEffect(t3, t4, t5);
-}
-
-```
-
-### Eval output
-(kind: exception) Fixture not implemented
\ No newline at end of file
--- compiler/packages/babel-plugin-react-compiler/src/__tests__/fixtures/compiler/infer-deps-custom-config.js
@@ -1,9 +0,0 @@
-// @inferEffectDependencies
-import {print, useSpecialEffect} from 'shared-runtime';
-
-function CustomConfig({propVal}) {
- // Insertion
- useSpecialEffect(() => print(propVal), [propVal]);
- // No insertion
- useSpecialEffect(() => print(propVal), [propVal], [propVal]);
-}
--- compiler/packages/babel-plugin-react-compiler/src/__tests__/fixtures/compiler/infer-effect-dependencies.expect.md
@@ -3,8 +3,6 @@
```javascript
// @inferEffectDependencies
-import {useEffect, useRef} from 'react';
-
const moduleNonReactive = 0;
function Component({foo, bar}) {
@@ -47,8 +45,6 @@ function Component({foo, bar}) {
```javascript
import { c as _c } from "react/compiler-runtime"; // @inferEffectDependencies
-import { useEffect, useRef } from "react";
-
const moduleNonReactive = 0;
function Component(t0) {
--- compiler/packages/babel-plugin-react-compiler/src/__tests__/fixtures/compiler/infer-effect-dependencies.js
@@ -1,6 +1,4 @@
// @inferEffectDependencies
-import {useEffect, useRef} from 'react';
-
const moduleNonReactive = 0;
function Component({foo, bar}) {
--- compiler/packages/snap/src/compiler.ts
@@ -174,6 +174,11 @@ function makePluginOptions(
.filter(s => s.length > 0);
}
+ let inferEffectDependencies = false;
+ if (firstLine.includes('@inferEffectDependencies')) {
+ inferEffectDependencies = true;
+ }
+
let logs: Array<{filename: string | null; event: LoggerEvent}> = [];
let logger: Logger | null = null;
if (firstLine.includes('@logger')) {
@@ -197,6 +202,7 @@ function makePluginOptions(
hookPattern,
validatePreserveExistingMemoizationGuarantees,
validateBlocklistedImports,
+ inferEffectDependencies,
},
compilationMode,
logger,
--- compiler/packages/snap/src/sprout/shared-runtime.ts
@@ -363,14 +363,6 @@ export function useFragment(..._args: Array<any>): object {
};
}
-export function useSpecialEffect(
- fn: () => any,
- _secondArg: any,
- deps: Array<any>,
-) {
- React.useEffect(fn, deps);
-}
-
export function typedArrayPush<T>(array: Array<T>, item: T): void {
array.push(item);
}
@@ -378,5 +370,4 @@ export function typedArrayPush<T>(array: Array<T>, item: T): void {
export function typedLog(...values: Array<any>): void {
console.log(...values);
}
-
export default typedLog;
|
react
|
facebook
|
JavaScript
|
JavaScript
| 232,878
| 47,794
|
The library for web and native user interfaces.
|
facebook_react
|
CODE_IMPROVEMENT
|
changes in dependencies configuration for experimental purposes
|
750fe25084be5476e24feaf2be9d384b33d285db
|
2024-01-07 17:07:50
|
Syuugo
|
Update Japanese doc
| false
| 2
| 1
| 3
|
--- docs/README-ja.md
@@ -60,8 +60,7 @@ BootLoader のロック解除に対する Xiaomi の制限は無限であり、
## ⚙️ 使用方法
1. [公式サイト](https://www.php.net/downloads) からシステムに PHP 8.0+ をダウンロードしてインストールします。
-2. `php.ini` で OpenSSL と Curl 拡張機能を有効にします。
- (スクリプトが機能しない場合は、`extension_dir` を PHP の `ext` ディレクトリに設定してください。)
+2. `php.ini` で OpenSSL と Curl 拡張機能を有効にします。
3. [php-adb](https://github.com/MlgmXyysd/php-adb) の `adb.php` をディレクトリに配置します。
4. [platform-tools](https://developer.android.com/studio/releases/platform-tools?hl=ja#downloads) をダウンロードして`libraries` に展開します。
※注意: Mac OS では、`adb` の名前を `adb-darwin` に変更する必要があります。
|
xiaomi-hyperos-bootloader-bypass
|
mlgmxyysd
|
PHP
|
PHP
| 3,496
| 367
|
A PoC that exploits a vulnerability to bypass the Xiaomi HyperOS community restrictions of BootLoader unlocked account bindings.
|
mlgmxyysd_xiaomi-hyperos-bootloader-bypass
|
DOC_CHANGE
|
Obvious
|
74eb3960996576cd2f8ca8709574fc46e4c4f923
|
2025-02-05 02:07:59
|
Jordan Harband
|
[Dev Deps] update `markdown-link-check`
| false
| 0
| 0
| 0
|
(error extracting diff)
|
nvm
|
nvm-sh
|
Shell
|
Shell
| 82,623
| 8,249
|
Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions
|
nvm-sh_nvm
|
CONFIG_CHANGE
|
version updates are done
|
afa72846b133d830446ec485cdb09a584ae664c2
|
2023-11-21 15:42:49
|
Moritz
|
[QuickAccent]Added Greek (EL) language (ISO 639-1) (#29709)
| false
| 57
| 19
| 76
|
--- src/modules/poweraccent/PowerAccent.Core/Languages.cs
@@ -18,7 +18,6 @@ namespace PowerAccent.Core
GA,
GD,
DE,
- EL,
EST,
FI,
FR,
@@ -59,7 +58,6 @@ namespace PowerAccent.Core
Language.GA => GetDefaultLetterKeyGA(letter), // Gaeilge (Irish)
Language.GD => GetDefaultLetterKeyGD(letter), // Gàidhlig (Scottish Gaelic)
Language.DE => GetDefaultLetterKeyDE(letter), // German
- Language.EL => GetDefaultLetterKeyEL(letter), // Greek
Language.EST => GetDefaultLetterKeyEST(letter), // Estonian
Language.FI => GetDefaultLetterKeyFI(letter), // Finnish
Language.FR => GetDefaultLetterKeyFR(letter), // French
@@ -103,7 +101,6 @@ namespace PowerAccent.Core
.Union(GetDefaultLetterKeyGA(letter))
.Union(GetDefaultLetterKeyGD(letter))
.Union(GetDefaultLetterKeyDE(letter))
- .Union(GetDefaultLetterKeyEL(letter))
.Union(GetDefaultLetterKeyEST(letter))
.Union(GetDefaultLetterKeyFI(letter))
.Union(GetDefaultLetterKeyFR(letter))
@@ -147,32 +144,32 @@ namespace PowerAccent.Core
LetterKey.VK_4 => new[] { "⅘" },
LetterKey.VK_5 => new[] { "⅚", "⅝" },
LetterKey.VK_7 => new[] { "⅞" },
- LetterKey.VK_A => new[] { "ά", "ȧ" },
- LetterKey.VK_B => new[] { "ḃ" },
- LetterKey.VK_C => new[] { "ċ", "°C", "©", "ℂ" },
- LetterKey.VK_D => new[] { "ḍ", "ḋ" },
- LetterKey.VK_E => new[] { "έ", "ή", "∈" },
+ LetterKey.VK_A => new[] { "α", "ά", "ȧ" },
+ LetterKey.VK_B => new[] { "ḃ", "β" },
+ LetterKey.VK_C => new[] { "ċ", "χ", "°C", "©", "ℂ" },
+ LetterKey.VK_D => new[] { "ḍ", "ḋ", "δ" },
+ LetterKey.VK_E => new[] { "ε", "έ", "η", "ή", "∈" },
LetterKey.VK_F => new[] { "ḟ", "°F" },
- LetterKey.VK_G => new[] { "ģ", "ǧ", "ġ", "ĝ", "ǥ" },
+ LetterKey.VK_G => new[] { "ģ", "ǧ", "ġ", "ĝ", "ǥ", "γ" },
LetterKey.VK_H => new[] { "ḣ", "ĥ", "ħ" },
- LetterKey.VK_I => new[] { "ί" },
+ LetterKey.VK_I => new[] { "ι", "ί" },
LetterKey.VK_J => new[] { "ĵ" },
- LetterKey.VK_K => new[] { "ķ", "ǩ" },
- LetterKey.VK_L => new[] { "ļ", "₺" }, // ₺ is in VK_T for other languages, but not VK_L, so we add it here.
- LetterKey.VK_M => new[] { "ṁ" },
- LetterKey.VK_N => new[] { "ņ", "ṅ", "ⁿ", "ℕ" },
- LetterKey.VK_O => new[] { "ȯ", "ώ", "ό" },
- LetterKey.VK_P => new[] { "ṗ", "℗" },
+ LetterKey.VK_K => new[] { "ķ", "ǩ", "κ" },
+ LetterKey.VK_L => new[] { "ļ", "₺", "λ" }, // ₺ is in VK_T for other languages, but not VK_L, so we add it here.
+ LetterKey.VK_M => new[] { "ṁ", "μ" },
+ LetterKey.VK_N => new[] { "ņ", "ṅ", "ⁿ", "ν", "ℕ" },
+ LetterKey.VK_O => new[] { "ȯ", "ω", "ώ", "ο", "ό" },
+ LetterKey.VK_P => new[] { "ṗ", "φ", "ψ", "℗" },
LetterKey.VK_Q => new[] { "ℚ" },
- LetterKey.VK_R => new[] { "ṙ", "®", "ℝ" },
- LetterKey.VK_S => new[] { "ṡ", "\u00A7" },
- LetterKey.VK_T => new[] { "ţ", "ṫ", "ŧ", "™" },
- LetterKey.VK_U => new[] { "ŭ", "ύ" },
+ LetterKey.VK_R => new[] { "ṙ", "ρ", "®", "ℝ" },
+ LetterKey.VK_S => new[] { "ṡ", "σ", "\u00A7" },
+ LetterKey.VK_T => new[] { "ţ", "ṫ", "ŧ", "θ", "τ", "™" },
+ LetterKey.VK_U => new[] { "ŭ", "υ", "ύ" },
LetterKey.VK_V => new[] { "V̇" },
LetterKey.VK_W => new[] { "ẇ" },
- LetterKey.VK_X => new[] { "ẋ", "×" },
+ LetterKey.VK_X => new[] { "ẋ", "ξ", "×" },
LetterKey.VK_Y => new[] { "ẏ", "ꝡ" },
- LetterKey.VK_Z => new[] { "ʒ", "ǯ", "ℤ" },
+ LetterKey.VK_Z => new[] { "ʒ", "ǯ", "ζ", "ℤ" },
LetterKey.VK_COMMA => new[] { "∙", "₋", "⁻", "–" }, // – is in VK_MINUS for other languages, but not VK_COMMA, so we add it here.
LetterKey.VK_PERIOD => new[] { "\u0300", "\u0301", "\u0302", "\u0303", "\u0304", "\u0308", "\u030C" },
LetterKey.VK_MINUS => new[] { "~", "‐", "‑", "‒", "—", "―", "⁓", "−", "⸺", "⸻" },
@@ -526,36 +523,6 @@ namespace PowerAccent.Core
};
}
- // Greek
- private static string[] GetDefaultLetterKeyEL(LetterKey letter)
- {
- return letter switch
- {
- LetterKey.VK_A => new string[] { "α" },
- LetterKey.VK_B => new string[] { "β" },
- LetterKey.VK_C => new string[] { "χ" },
- LetterKey.VK_D => new string[] { "δ" },
- LetterKey.VK_E => new string[] { "ε", "η" },
- LetterKey.VK_F => new string[] { "φ" },
- LetterKey.VK_G => new string[] { "γ" },
- LetterKey.VK_I => new string[] { "ι" },
- LetterKey.VK_K => new string[] { "κ" },
- LetterKey.VK_L => new string[] { "λ" },
- LetterKey.VK_M => new string[] { "μ" },
- LetterKey.VK_N => new string[] { "ν" },
- LetterKey.VK_O => new string[] { "ο", "ω" },
- LetterKey.VK_P => new string[] { "π", "φ", "ψ" },
- LetterKey.VK_R => new string[] { "ρ" },
- LetterKey.VK_S => new string[] { "σ" },
- LetterKey.VK_T => new string[] { "τ", "θ" },
- LetterKey.VK_U => new string[] { "υ" },
- LetterKey.VK_X => new string[] { "ξ" },
- LetterKey.VK_Y => new string[] { "υ" },
- LetterKey.VK_Z => new string[] { "ζ" },
- _ => Array.Empty<string>(),
- };
- }
-
// Hebrew
private static string[] GetDefaultLetterKeyHE(LetterKey letter)
{
--- src/settings-ui/Settings.UI/SettingsXAML/Views/PowerAccentPage.xaml
@@ -53,7 +53,6 @@
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Gaeilge" />
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Gaidhlig" />
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Dutch" />
- <ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Greek" />
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Estonian" />
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_Finnish" />
<ComboBoxItem x:Uid="QuickAccent_SelectedLanguage_French" />
--- src/settings-ui/Settings.UI/Strings/en-us/Resources.resw
@@ -3360,9 +3360,6 @@ Activate by holding the key for the character you want to add an accent to, then
<data name="QuickAccent_SelectedLanguage_German.Content" xml:space="preserve">
<value>German</value>
</data>
- <data name="QuickAccent_SelectedLanguage_Greek.Content" xml:space="preserve">
- <value>Greek</value>
- </data>
<data name="QuickAccent_SelectedLanguage_Hebrew.Content" xml:space="preserve">
<value>Hebrew</value>
</data>
--- src/settings-ui/Settings.UI/ViewModels/PowerAccentViewModel.cs
@@ -32,7 +32,6 @@ namespace Microsoft.PowerToys.Settings.UI.ViewModels
"GA",
"GD",
"NL",
- "EL",
"EST",
"FI",
"FR",
|
powertoys
|
microsoft
|
C#
|
C#
| 115,301
| 6,789
|
Windows system utilities to maximize productivity
|
microsoft_powertoys
|
CODE_IMPROVEMENT
|
Code change: arrow function improvement
|
c092b79c4eb466fcc4b125d7a5ab135c7f9d2ad4
|
2025-02-12 20:18:31
|
Niklas Mischkulnig
|
Turbopack: make tracing easier (#75958) - Add an env var to run Turbopack in the same process: `NEXT_TURBOPACK_USE_WORKER=0` - Put some turbo-task spans behind a feature. This alone makes traces 20% smaller. There is still one span that accounts for the time spent in turbo-tasks: 
| false
| 7
| 1
| 8
|
--- packages/next/src/build/index.ts
@@ -1428,10 +1428,7 @@ export default async function build(
duration: compilerDuration,
shutdownPromise: p,
...rest
- } = await turbopackBuild(
- process.env.NEXT_TURBOPACK_USE_WORKER === undefined ||
- process.env.NEXT_TURBOPACK_USE_WORKER !== '0'
- )
+ } = await turbopackBuild(true)
shutdownPromise = p
traceMemoryUsage('Finished build', nextBuildSpan)
--- turbopack/crates/turbo-tasks-backend/Cargo.toml
@@ -17,7 +17,6 @@ default = []
verify_serialization = []
trace_aggregation_update = []
trace_find_and_schedule = []
-trace_task_completion = []
trace_task_dirty = []
lmdb = ["dep:lmdb-rkv"]
--- turbopack/crates/turbo-tasks-backend/src/backend/mod.rs
@@ -1381,7 +1381,6 @@ impl<B: BackingStorage> TurboTasksBackendInner<B> {
drop(task);
if !queue.is_empty() || !old_edges.is_empty() {
- #[cfg(feature = "trace_task_completion")]
let _span = tracing::trace_span!("remove old edges and prepare new children").entered();
// Remove outdated edges first, before removing in_progress+dirty flag.
// We need to make sure all outdated edges are removed before the task can potentially
@@ -1444,7 +1443,6 @@ impl<B: BackingStorage> TurboTasksBackendInner<B> {
drop(task);
if has_children {
- #[cfg(feature = "trace_task_completion")]
let _span = tracing::trace_span!("connect new children").entered();
queue.execute(&mut ctx);
}
|
next.js
|
vercel
|
JavaScript
|
JavaScript
| 129,891
| 27,821
|
The React Framework
|
vercel_next.js
|
CONFIG_CHANGE
|
Very small changes
|
e704f4d3785b318d7f4d3a034424b54ce50beb88
| null |
Jaime Marquínez Ferrándiz
|
YoutubeIE: If not subtitles language is given default to English for automatic captions (related #901)
| false
| 1
| 1
| 0
|
--- InfoExtractors.py
@@ -420,7 +420,7 @@ class YoutubeIE(InfoExtractor):
def _request_automatic_caption(self, video_id, webpage):
"""We need the webpage for getting the captions url, pass it as an
argument to speed up the process."""
- sub_lang = self._downloader.params.get('subtitleslang')
+ sub_lang = self._downloader.params.get('subtitleslang') or 'en'
sub_format = self._downloader.params.get('subtitlesformat')
self.to_screen(u'%s: Looking for automatic captions' % video_id)
mobj = re.search(r';ytplayer.config = ({.*?});', webpage)
|
ytdl-org_youtube-dl.json
| null | null | null | null | null | null |
ytdl-org_youtube-dl.json
|
BUG_FIX
|
4, There is a #number in the commit msg, which is probably a bug number, and the commit msg is worded as if it fixes a bug
|
823eca3bf6b4ac3b4fe93fe4b4cd95c58f40e428
|
2023-10-25 17:31:53
|
Tien Do Nam
|
chore: remove unexpected submodule
| false
| 0
| 1
| 1
|
--- flutter
@@ -0,0 +1 @@
+Subproject commit 6df6c897ef6907e774d58ec5caaffad48fc73260
|
localsend
|
localsend
|
Dart
|
Dart
| 58,423
| 3,136
|
An open-source cross-platform alternative to AirDrop
|
localsend_localsend
|
CONFIG_CHANGE
|
Very small changes
|
fe4920f138897138cc6eb76306677c4a5bc7980a
| null |
oni-link
|
Remove unnecessary assert() in os_dirname(). The compiler warns that buf is always nonnull: buf is declared nonnull via a function attribute, so it can be dropped from the assert(). A buffer length of zero is also no problem, because it makes uv_cwd() return a failure without writing into buf, so the remaining length check can be removed as well.
| false
| 0
| 2
| -2
|
--- fs.c
@@ -42,8 +42,6 @@ int os_chdir(const char *path)
int os_dirname(char_u *buf, size_t len)
FUNC_ATTR_NONNULL_ALL
{
- assert(buf && len);
-
int error_number;
if ((error_number = uv_cwd((char *)buf, &len)) != kLibuvSuccess) {
STRLCPY(buf, uv_strerror(error_number), len);
|
neovim_neovim.json
| null | null | null | null | null | null |
neovim_neovim.json
|
PERF_IMPROVEMENT
|
4, removed an unnecessary assert; it wasn't breaking anything, just nice to have
|
0d6dc3e881faa25ba543e0fd5b07bd576348a6ed
|
2024-01-01 16:20:12
|
dependabot[bot]
|
chore(deps): bump io.gsonfire:gson-fire from 1.8.4 to 1.9.0 in /libraries/java (#62) Bumps [io.gsonfire:gson-fire](https://github.com/julman99/gson-fire)
from 1.8.4 to 1.9.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/julman99/gson-fire/releases">io.gsonfire:gson-fire's
releases</a>.</em></p>
<blockquote>
<h2>v1.9.0</h2>
<p>See: <a
href="https://github.com/julman99/gson-fire#190">https://github.com/julman99/gson-fire#190</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/julman99/gson-fire/commits/v1.9.0">compare
view</a></li>
</ul>
</details>
<br />
[](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
<details>
<summary>Dependabot commands and options</summary>
<br />
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
</details>
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
IsMerge: false | Additions: 1 | Deletions: 1 | Total Changes: 2
git_diff:
--- libraries/java/build.gradle
@@ -28,7 +28,7 @@ dependencies {
implementation 'com.squareup.okhttp3:okhttp:4.12.0'
implementation 'com.squareup.okhttp3:logging-interceptor:4.12.0'
implementation 'com.google.code.gson:gson:2.10.1'
- implementation 'io.gsonfire:gson-fire:1.9.0'
+ implementation 'io.gsonfire:gson-fire:1.8.4'
implementation group: 'org.apache.commons', name: 'commons-lang3', version: '3.14.0'
implementation 'org.threeten:threetenbp:1.6.8'
implementation 'javax.annotation:javax.annotation-api:1.3.2'
Repository Name: standard-webhooks | Owner: standard-webhooks | Primary Language: Elixir | Language: Elixir | Stars: 1,390 | Forks: 37
Description: The Standard Webhooks specification
Repository: standard-webhooks_standard-webhooks
type: CODE_IMPROVEMENT
Comment: Code change: type annotation added

Hash: 22503cdb942b3546db596a26ac3d68e02620fc56
Date: null
Author: nvbn
commit_message: #279 Fix merge
IsMerge: false | Additions: 0 | Deletions: 1 | Total Changes: -1
git_diff:
--- setup.py
@@ -1,5 +1,4 @@
#!/usr/bin/env python
-import sys
from setuptools import setup, find_packages
import sys
Repository Name: nvbn_thefuck.json
Owner: null | Primary Language: null | Language: null | Stars: null | Forks: null | Description: null
Repository: nvbn_thefuck.json
type: BUG_FIX
Comment: 5, obvious
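The diff above removes an `import sys` line that a bad merge had duplicated at the top of setup.py. As a small illustration (the `duplicate_imports` helper is hypothetical, not part of thefuck), such merge artifacts can be detected mechanically:

```python
def duplicate_imports(source: str) -> list[str]:
    """Return import statements that appear more than once at top level."""
    seen: set[str] = set()
    dups: list[str] = []
    for line in source.splitlines():
        stripped = line.strip()
        if stripped.startswith(("import ", "from ")):
            if stripped in seen and stripped not in dups:
                dups.append(stripped)
            seen.add(stripped)
    return dups

# The merged setup.py header from the record above:
merged_setup_py = """\
#!/usr/bin/env python
import sys
from setuptools import setup, find_packages
import sys
"""
print(duplicate_imports(merged_setup_py))  # ['import sys']
```

A string-level scan like this is enough for exact duplicates; a real linter (e.g. what flake8 reports as a re-imported module) would parse the AST instead to catch aliased or reordered forms.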