| Hash (string, len 40) | Date (string, len 19–20, nullable) | Author (string, len 2–30) | commit_message (string, len 3–28.8k) | IsMerge (bool, 1 class) | Additions (int64, 0–55.2k) | Deletions (int64, 0–991) | Total Changes (int64, -3 to 55.2k) | git_diff (string, len 23–47.3k) | Repository Name (string, 159 classes) | Owner (string, 85 classes) | Primary Language (string, 20 classes) | Language (string, 19 classes) | Stars (float64, 218–411k, nullable) | Forks (float64, 8–79k, nullable) | Description (string, 96 classes) | Repository (string, 161 classes) | type (string, 6 classes) | Comment (string, len 7–156, nullable) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d227646201ec734590d71e157dac239a8727dd3c | 2022-01-28 01:03:47 | Aldo Salas | Fixing typos in Linked List README.es-ES.md (#710) Co-authored-by: Oleksii Trekhleb <[email protected]> | false | 17 | 17 | 34 |
--- src/data-structures/linked-list/README.es-ES.md
@@ -7,26 +7,26 @@ _Lee este artículo en otros idiomas:_
[_Português_](README.pt-BR.md)
[_English_](README.md)
-En ciencias de la computación una **lista enlazada** es una colección lineal
-de elementos de datos, en los cuales el orden lineal no es dado por
-su posición física en memoria. En cambio, cada
-elemento apunta al siguiente. Es una estructura de datos
+En ciencias de la computación una **lista enlazada** es una colección linear
+de elementos de datos, en los cuales el orden linear no es dado por
+su posción física en memoria. En cambio, cada
+elemento señala al siguiente. Es una estructura de datos
que consiste en un grupo de nodos los cuales juntos representan
-una secuencia. En su forma más sencilla, cada nodo está
+una secuencia. Bajo la forma más simple, cada nodo esta
compuesto de datos y una referencia (en otras palabras,
-un enlace) al siguiente nodo en la secuencia. Esta estructura
-permite la inserción o eliminación de elementos
+un lazo) al siguiente nodo en la secuencia. Esta estructura
+permite la inserción o remoción de elementos
desde cualquier posición en la secuencia durante la iteración.
-Las variantes más complejas agregan enlaces adicionales, permitiendo
-una eficiente inserción o eliminación desde referencias arbitrarias
-del elemento. Una desventaja de las listas lazadas es que el tiempo de
-acceso es lineal (y difícil de canalizar). Un acceso
-más rápido, como un acceso aleatorio, no es factible. Los arreglos
-tienen una mejor locazion en caché comparados con las listas lazadas.
+Variantes más complejas agregan lazos adicionales, permitiendo
+una eficiente inserción o remoción desde referencias arbitrarias
+del elemento. Una desventaja de las listas enlazadas es que el tiempo de
+acceso es linear (y difícil de canalizar). Un acceso
+más rápido, como el aleatorio, no es factible. Los arreglos
+tienen una mejor locazion comparados con las listas enlazadas.

-## Pseudocódigo para operaciones básicas
+## Pseudocódigo para operacones básicas
### Insertar
@@ -147,11 +147,11 @@ end ReverseTraversal
## Complejidades
-### Complejidad de Tiempo
+### Complejidad del Tiempo
| Acceso | Búsqueda | Inserción | Eliminación |
-| :----: | :------: | :-------: | :---------: |
-| O(n) | O(n) | O(1) | O(n) |
+| :----: | :----: | :-------: | :------: |
+| O(n) | O(n) | O(1) | O(n) |
### Complejidad Espacial
| javascript-algorithms | trekhleb | JavaScript | JavaScript | 190,336 | 30,518 | 📝 Algorithms and data structures implemented in JavaScript with explanations and links to further readings | trekhleb_javascript-algorithms | BUG_FIX | correcting display behavior under Wayland |
bd39ee28621ac340e2ba67833bc6282888e89f0f | 2024-02-01 22:34:42 | stevleibelt | Fix issue that adb can not find any attached device | false | 6 | 0 | 6 |
--- docker-compose.yml
@@ -1,8 +1,6 @@
services:
php-cli:
image: php:8.2-apache
- privileged: true
restart: unless-stopped
volumes:
- .:/var/www/html
- - /dev/bus/usb:/dev/bus/usb
--- run.sh
@@ -65,10 +65,6 @@ function _start ()
{
_stop
_build
- if command -v adb &> /dev/null;
- then
- adb kill-server
- fi
docker compose up -d
}
| xiaomi-hyperos-bootloader-bypass | mlgmxyysd | PHP | PHP | 3,496 | 367 | A PoC that exploits a vulnerability to bypass the Xiaomi HyperOS community restrictions of BootLoader unlocked account bindings. | mlgmxyysd_xiaomi-hyperos-bootloader-bypass | BUG_FIX | Obvious |
86ad3c84cbdd8afe26fb002abe71546a4d1ba3a8 | 2023-09-23 02:06:39 | Dustin L. Howett | nightly: deploy an appinstaller to an Azure storage account (!) (#16013) After the nightly build completes, we'll automatically generate a .appinstaller and publich it plus the msixbundle to an Azure Storage account. I had to add step/job customization to the publish step in the full release pipeline template. The .appinstaller hardcodes our XAML dependency, which makes it a bit of a pain. We can revisit this later, and publish our dependencies directly and automatically instead of hardcoding them. I am considering moving the appinstaller generation step to the MSIX bundling job, but this works right now and is not too terrible. Closes #774 | false | 177 | 0 | 177 |
--- .github/actions/spelling/allow/microsoft.txt
@@ -4,7 +4,6 @@ advapi
altform
altforms
appendwttlogging
-appinstaller
appx
appxbundle
appxerror
--- build/config/template.appinstaller
@@ -1,38 +0,0 @@
-<?xml version="1.0" encoding="utf-8"?>
-<AppInstaller
- xmlns="http://schemas.microsoft.com/appx/appinstaller/2017/2"
- Version="1.0.0.0"
- Uri="$$ROOT$$$$NAME$$.appinstaller">
-
- <MainBundle
- Name="$$NAME$$"
- Publisher="CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
- Version="$$VERSION$$"
- Uri="$$ROOT$$$$PACKAGE$$" />
-
- <Dependencies>
- <Package
- Name="Microsoft.UI.Xaml.2.8"
- Publisher="CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
- Version="8.2305.5001.0"
- ProcessorArchitecture="x64"
- Uri="https://github.com/microsoft/microsoft-ui-xaml/releases/download/v2.8.4/Microsoft.UI.Xaml.2.8.x64.appx" />
- <Package
- Name="Microsoft.UI.Xaml.2.8"
- Publisher="CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
- Version="8.2305.5001.0"
- ProcessorArchitecture="x86"
- Uri="https://github.com/microsoft/microsoft-ui-xaml/releases/download/v2.8.4/Microsoft.UI.Xaml.2.8.x86.appx" />
- <Package
- Name="Microsoft.UI.Xaml.2.8"
- Publisher="CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
- Version="8.2305.5001.0"
- ProcessorArchitecture="arm64"
- Uri="https://github.com/microsoft/microsoft-ui-xaml/releases/download/v2.8.4/Microsoft.UI.Xaml.2.8.arm64.appx" />
- </Dependencies>
-
- <UpdateSettings>
- <OnLaunch
- HoursBetweenUpdateChecks="6" />
- </UpdateSettings>
-</AppInstaller>
--- build/pipelines/nightly.yml
@@ -10,12 +10,6 @@ schedules:
name: $(BuildDefinitionName)_$(date:yyMM).$(date:dd)$(rev:rrr)
-parameters:
- - name: publishToAzure
- displayName: "Deploy to **PUBLIC** Azure Storage"
- type: boolean
- default: true
-
extends:
template: templates-v2\pipeline-full-release-build.yml
parameters:
@@ -27,17 +21,3 @@ extends:
publishSymbolsToPublic: true
publishVpackToWindows: false
symbolExpiryTime: 15 # Nightly builds do not keep symbols for very long!
- ${{ if eq(true, parameters.publishToAzure) }}:
- extraPublishJobs:
- - template: job-deploy-to-azure-storage.yml
- parameters:
- pool:
- name: SHINE-INT-S
- dependsOn: [PublishSymbols]
- storagePublicRootURL: $(AppInstallerRootURL)
- subscription: $(AzureSubscriptionName)
- storageAccount: $(AzureStorageAccount)
- storageContainer: $(AzureStorageContainer)
- buildConfiguration: Release
- environment: production-canary
-
--- build/pipelines/templates-v2/job-deploy-to-azure-storage.yml
@@ -1,70 +0,0 @@
-parameters:
- - name: buildConfiguration
- type: string
- - name: pool
- type: object
- default: []
- - name: dependsOn
- type: object
- default: null
- - name: artifactStem
- type: string
- default: ''
- - name: variables
- type: object
- default: {}
- - name: environment
- type: string
- - name: storagePublicRootURL
- type: string
- - name: subscription
- type: string
- - name: storageAccount
- type: string
- - name: storageContainer
- type: string
-
-jobs:
-- deployment: DeployAzure
- ${{ if ne(length(parameters.pool), 0) }}:
- pool: ${{ parameters.pool }}
- displayName: Publish to Azure Storage (Prod)
- dependsOn: ${{ parameters.dependsOn }}
- variables:
- ${{ insert }}: ${{ parameters.variables }}
- environment: ${{ parameters.environment }}
- strategy:
- runOnce:
- deploy:
- steps:
- - download: none
-
- - checkout: self
- clean: true
- fetchDepth: 1
- fetchTags: false # Tags still result in depth > 1 fetch; we don't need them here
- submodules: true
- persistCredentials: True
-
- - task: DownloadPipelineArtifact@2
- displayName: Download MSIX Bundle Artifact
- inputs:
- artifactName: appxbundle-${{ parameters.buildConfiguration }}${{ parameters.artifactStem }}
- downloadPath: '$(Build.SourcesDirectory)/_out'
- itemPattern: '**/*.msixbundle'
-
- - pwsh: |-
- $b = Get-Item _out/*.msixbundle
- ./build/scripts/New-AppInstallerFromTemplateAndBundle.ps1 -BundlePath $b.FullName -AppInstallerTemplatePath ./build/config/template.appinstaller -AppInstallerRoot "${{ parameters.storagePublicRootURL }}" -OutputPath _out/Microsoft.WindowsTerminalCanary.appinstaller
- displayName: "Produce AppInstaller for MSIX bundle"
-
- - task: AzureFileCopy@5
- displayName: Publish to Storage Account
- inputs:
- sourcePath: _out/*
- Destination: AzureBlob
- azureSubscription: ${{ parameters.subscription }}
- storage: ${{ parameters.storageAccount }}
- ContainerName: ${{ parameters.storageContainer }}
- AdditionalArgumentsForBlobCopy: "--content-type application/octet-stream"
-
--- build/pipelines/templates-v2/job-merge-msix-into-bundle.yml
@@ -48,8 +48,6 @@ jobs:
BundleStemName: Microsoft.WindowsTerminal
${{ elseif eq(parameters.branding, 'Preview') }}:
BundleStemName: Microsoft.WindowsTerminalPreview
- ${{ elseif eq(parameters.branding, 'Canary') }}:
- BundleStemName: Microsoft.WindowsTerminalCanary
${{ else }}:
BundleStemName: WindowsTerminalDev
JobOutputDirectory: '$(System.ArtifactsDirectory)/bundle'
--- build/pipelines/templates-v2/pipeline-full-release-build.yml
@@ -51,9 +51,6 @@ parameters:
type: boolean
default: false
- - name: extraPublishJobs
- type: object
- default: []
- name: pool
type: object
default:
@@ -196,5 +193,4 @@ stages:
includePublicSymbolServer: ${{ parameters.publishSymbolsToPublic }}
symbolExpiryTime: ${{ parameters.symbolExpiryTime }}
- - ${{ parameters.extraPublishJobs }}
...
--- build/scripts/New-AppInstallerFromTemplateAndBundle.ps1
@@ -1,42 +0,0 @@
-[CmdletBinding()]
-Param(
- [Parameter(Mandatory,
- HelpMessage="Path to the .msixbundle")]
- [string]
- $BundlePath,
-
- [Parameter(Mandatory,
- HelpMessage="Path to the .appinstaller template")]
- [string]
- $AppInstallerTemplatePath,
-
- [string]
- $AppInstallerRoot,
-
- [Parameter(Mandatory,
- HelpMessage="Output Path")]
- [string]
- $OutputPath
-)
-
-$ErrorActionPreference = "Stop"
-
-$sentinelFile = New-TemporaryFile
-$directory = New-Item -Type Directory "$($sentinelFile.FullName)_Package"
-Remove-Item $sentinelFile -Force -EA:Ignore
-
-$bundle = (Get-Item $BundlePath)
-& tar.exe -x -f $bundle.FullName -C $directory AppxMetadata/AppxBundleManifest.xml
-
-$xml = [xml](Get-Content (Join-Path $directory "AppxMetadata\AppxBundleManifest.xml"))
-$name = $xml.Bundle.Identity.Name
-$version = $xml.Bundle.Identity.Version
-
-$doc = (Get-Content -ReadCount 0 $AppInstallerTemplatePath)
-$doc = $doc -Replace '\$\$ROOT\$\$',$AppInstallerRoot
-$doc = $doc -Replace '\$\$NAME\$\$',$name
-$doc = $doc -Replace '\$\$VERSION\$\$',$version
-$doc = $doc -Replace '\$\$PACKAGE\$\$',$bundle.Name
-$doc | Out-File -Encoding utf8NoBOM $OutputPath
-
-Get-Item $OutputPath
| terminal | microsoft | C++ | C++ | 97,273 | 8,477 | The new Windows Terminal and the original Windows console host, all in the same place! | microsoft_terminal | NEW_FEAT | appinstaller deployed |
13f855e3c41de9433b061a5e2f3328f71dbe3548 | 2023-11-13 10:12:57 | 2dust | remove android.intent.category.LEANBACK_LAUNCHER | false | 1 | 1 | 2 |
--- V2rayNG/app/src/main/AndroidManifest.xml
@@ -50,7 +50,7 @@
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
-<!-- <category android:name="android.intent.category.LEANBACK_LAUNCHER"/>-->
+ <category android:name="android.intent.category.LEANBACK_LAUNCHER"/>
</intent-filter>
<intent-filter>
<action android:name="android.service.quicksettings.action.QS_TILE_PREFERENCES" />
| v2rayng | 2dust | Kotlin | Kotlin | 38,863 | 5,828 | A V2Ray client for Android, support Xray core and v2fly core | 2dust_v2rayng | PERF_IMPROVEMENT | simplify decoder draining logic |
29f0a5cd745f5f69bb1a305e54dd2630d7b49ba8 | 2023-11-12 07:21:13 | Will Ceolin | Play: Fix viewport on Safari | false | 2 | 2 | 4 |
--- lib/content/course_live/lesson_play.html.heex
@@ -2,7 +2,7 @@
id="select-option"
phx-submit="next"
phx-hook="LessonSoundEffect"
- class="grid-rows-[min-content_1fr_min-content] absolute top-0 left-0 grid h-full min-h-screen w-full overflow-hidden supports-[height:100dvh]:min-h-[100dvh]"
+ class="grid-rows-[min-content_1fr_min-content] absolute top-0 left-0 grid h-full min-h-screen w-full overflow-hidden"
>
<header class="px-4 pt-4">
<div class="flex gap-x-4">
@@ -79,7 +79,7 @@
icon={if no_options? || @selected_option, do: "tabler-chevron-right", else: "tabler-checks"}
phx-disable-with={dgettext("courses", "Confirming...")}
shadow
- class="w-full sm:w-fit"
+ class="w-full sm:w-fit flex-shrink-0"
>
<%= if no_options? || @selected_option, do: dgettext("courses", "Next step"), else: gettext("Confirm") %>
</.button>
| uneebee | zoonk | Elixir | Elixir | 1,339 | 83 | Platform for creating interactive courses. | zoonk_uneebee | BUG_FIX | obvious |
66b1f00c0e6fa438d2cd63cb5ad919079f33eee0 | 2025-01-15 17:47:06 | Mickael Maison | KAFKA-18520: Remove ZooKeeper logic from JaasUtils (#18530) Reviewers: Chia-Ping Tsai <[email protected]> | false | 0 | 55 | 55 |
--- clients/src/main/java/org/apache/kafka/common/security/JaasUtils.java
@@ -16,12 +16,67 @@
*/
package org.apache.kafka.common.security;
+import org.apache.kafka.common.KafkaException;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import javax.security.auth.login.Configuration;
+
public final class JaasUtils {
+ private static final Logger LOG = LoggerFactory.getLogger(JaasUtils.class);
public static final String JAVA_LOGIN_CONFIG_PARAM = "java.security.auth.login.config";
public static final String DISALLOWED_LOGIN_MODULES_CONFIG = "org.apache.kafka.disallowed.login.modules";
public static final String DISALLOWED_LOGIN_MODULES_DEFAULT = "com.sun.security.auth.module.JndiLoginModule";
public static final String SERVICE_NAME = "serviceName";
+ public static final String ZK_SASL_CLIENT = "zookeeper.sasl.client";
+ public static final String ZK_LOGIN_CONTEXT_NAME_KEY = "zookeeper.sasl.clientconfig";
+
+ private static final String DEFAULT_ZK_LOGIN_CONTEXT_NAME = "Client";
+ private static final String DEFAULT_ZK_SASL_CLIENT = "true";
+
private JaasUtils() {}
+ public static String zkSecuritySysConfigString() {
+ String loginConfig = System.getProperty(JAVA_LOGIN_CONFIG_PARAM);
+ String clientEnabled = System.getProperty(ZK_SASL_CLIENT, "default:" + DEFAULT_ZK_SASL_CLIENT);
+ String contextName = System.getProperty(ZK_LOGIN_CONTEXT_NAME_KEY, "default:" + DEFAULT_ZK_LOGIN_CONTEXT_NAME);
+ return "[" +
+ JAVA_LOGIN_CONFIG_PARAM + "=" + loginConfig +
+ ", " +
+ ZK_SASL_CLIENT + "=" + clientEnabled +
+ ", " +
+ ZK_LOGIN_CONTEXT_NAME_KEY + "=" + contextName +
+ "]";
+ }
+
+ public static boolean isZkSaslEnabled() {
+ // Technically a client must also check if TLS mutual authentication has been configured,
+ // but we will leave that up to the client code to determine since direct connectivity to ZooKeeper
+ // has been deprecated in many clients and we don't wish to re-introduce a ZooKeeper jar dependency here.
+ boolean zkSaslEnabled = Boolean.parseBoolean(System.getProperty(ZK_SASL_CLIENT, DEFAULT_ZK_SASL_CLIENT));
+ String zkLoginContextName = System.getProperty(ZK_LOGIN_CONTEXT_NAME_KEY, DEFAULT_ZK_LOGIN_CONTEXT_NAME);
+
+ LOG.debug("Checking login config for Zookeeper JAAS context {}", zkSecuritySysConfigString());
+
+ boolean foundLoginConfigEntry;
+ try {
+ Configuration loginConf = Configuration.getConfiguration();
+ foundLoginConfigEntry = loginConf.getAppConfigurationEntry(zkLoginContextName) != null;
+ } catch (Exception e) {
+ throw new KafkaException("Exception while loading Zookeeper JAAS login context " +
+ zkSecuritySysConfigString(), e);
+ }
+
+ if (foundLoginConfigEntry && !zkSaslEnabled) {
+ LOG.error("JAAS configuration is present, but system property " +
+ ZK_SASL_CLIENT + " is set to false, which disables " +
+ "SASL in the ZooKeeper client");
+ throw new KafkaException("Exception while determining if ZooKeeper is secure " +
+ zkSecuritySysConfigString());
+ }
+
+ return foundLoginConfigEntry;
+ }
}
| apache-kafka | null | Java | Java | null | null | a distributed, open-source streaming platform designed for building real-time data pipelines and streaming applications | _apache-kafka | CODE_IMPROVEMENT | Non-functional code changes to improve readability, migration etc. |
e2505b0d27ca0c9df3e149e8f79eeb885096b871 | 2025-03-06 03:01:40 | Jefftree | Update OpenAPI | false | 0 | 809 | 809 |
--- api/discovery/aggregated_v2.json
@@ -364,6 +364,25 @@
}
],
"version": "v1"
+ },
+ {
+ "freshness": "Current",
+ "resources": [
+ {
+ "resource": "selfsubjectreviews",
+ "responseKind": {
+ "group": "",
+ "kind": "SelfSubjectReview",
+ "version": ""
+ },
+ "scope": "Cluster",
+ "singularResource": "selfsubjectreview",
+ "verbs": [
+ "create"
+ ]
+ }
+ ],
+ "version": "v1beta1"
}
]
},
--- api/discovery/apis.json
@@ -50,6 +50,10 @@
{
"groupVersion": "authentication.k8s.io/v1",
"version": "v1"
+ },
+ {
+ "groupVersion": "authentication.k8s.io/v1beta1",
+ "version": "v1beta1"
}
]
},
--- api/discovery/apis__authentication.k8s.io.json
@@ -10,6 +10,10 @@
{
"groupVersion": "authentication.k8s.io/v1",
"version": "v1"
+ },
+ {
+ "groupVersion": "authentication.k8s.io/v1beta1",
+ "version": "v1beta1"
}
]
}
--- api/discovery/apis__authentication.k8s.io__v1beta1.json
@@ -0,0 +1,16 @@
+{
+ "apiVersion": "v1",
+ "groupVersion": "authentication.k8s.io/v1beta1",
+ "kind": "APIResourceList",
+ "resources": [
+ {
+ "kind": "SelfSubjectReview",
+ "name": "selfsubjectreviews",
+ "namespaced": false,
+ "singularName": "selfsubjectreview",
+ "verbs": [
+ "create"
+ ]
+ }
+ ]
+}
--- api/openapi-spec/swagger.json
@@ -3163,6 +3163,45 @@
},
"type": "object"
},
+ "io.k8s.api.authentication.v1beta1.SelfSubjectReview": {
+ "description": "SelfSubjectReview contains the user information that the kube-apiserver has about the user making this request. When using impersonation, users will receive the user info of the user being impersonated. If impersonation or request header authentication is used, any extra keys will have their case ignored and returned as lowercase.",
+ "properties": {
+ "apiVersion": {
+ "description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#resources",
+ "type": "string"
+ },
+ "kind": {
+ "description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds",
+ "type": "string"
+ },
+ "metadata": {
+ "$ref": "#/definitions/io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta",
+ "description": "Standard object's metadata. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#metadata"
+ },
+ "status": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1beta1.SelfSubjectReviewStatus",
+ "description": "Status is filled in by the server with the user attributes."
+ }
+ },
+ "type": "object",
+ "x-kubernetes-group-version-kind": [
+ {
+ "group": "authentication.k8s.io",
+ "kind": "SelfSubjectReview",
+ "version": "v1beta1"
+ }
+ ]
+ },
+ "io.k8s.api.authentication.v1beta1.SelfSubjectReviewStatus": {
+ "description": "SelfSubjectReviewStatus is filled by the kube-apiserver and sent back to a user.",
+ "properties": {
+ "userInfo": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1.UserInfo",
+ "description": "User attributes of the user making this request."
+ }
+ },
+ "type": "object"
+ },
"io.k8s.api.authorization.v1.FieldSelectorAttributes": {
"description": "FieldSelectorAttributes indicates a field limited access. Webhook authors are encouraged to * ensure rawSelector and requirements are not both set * consider the requirements field if set * not try to parse or consider the rawSelector field if set. This is to avoid another CVE-2022-2880 (i.e. getting different systems to agree on how exactly to parse a query is not something we want), see https://www.oxeye.io/resources/golang-parameter-smuggling-attack for more details. For the *SubjectAccessReview endpoints of the kube-apiserver: * If rawSelector is empty and requirements are empty, the request is not limited. * If rawSelector is present and requirements are empty, the rawSelector will be parsed and limited if the parsing succeeds. * If rawSelector is empty and requirements are present, the requirements should be honored * If rawSelector is present and requirements are present, the request is invalid.",
"properties": {
@@ -49694,6 +49733,123 @@
}
}
},
+ "/apis/authentication.k8s.io/v1beta1/": {
+ "get": {
+ "consumes": [
+ "application/json",
+ "application/yaml",
+ "application/vnd.kubernetes.protobuf",
+ "application/cbor"
+ ],
+ "description": "get available resources",
+ "operationId": "getAuthenticationV1beta1APIResources",
+ "produces": [
+ "application/json",
+ "application/yaml",
+ "application/vnd.kubernetes.protobuf",
+ "application/cbor"
+ ],
+ "responses": {
+ "200": {
+ "description": "OK",
+ "schema": {
+ "$ref": "#/definitions/io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList"
+ }
+ },
+ "401": {
+ "description": "Unauthorized"
+ }
+ },
+ "schemes": [
+ "https"
+ ],
+ "tags": [
+ "authentication_v1beta1"
+ ]
+ }
+ },
+ "/apis/authentication.k8s.io/v1beta1/selfsubjectreviews": {
+ "parameters": [
+ {
+ "description": "When present, indicates that modifications should not be persisted. An invalid or unrecognized dryRun directive will result in an error response and no further processing of the request. Valid values are: - All: all dry run stages will be processed",
+ "in": "query",
+ "name": "dryRun",
+ "type": "string",
+ "uniqueItems": true
+ },
+ {
+ "$ref": "#/parameters/fieldManager-Qy4HdaTW"
+ },
+ {
+ "description": "fieldValidation instructs the server on how to handle objects in the request (POST/PUT/PATCH) containing unknown or duplicate fields. Valid values are: - Ignore: This will ignore any unknown fields that are silently dropped from the object, and will ignore all but the last duplicate field that the decoder encounters. This is the default behavior prior to v1.23. - Warn: This will send a warning via the standard warning response header for each unknown field that is dropped from the object, and for each duplicate field that is encountered. The request will still succeed if there are no other errors, and will only persist the last of any duplicate fields. This is the default in v1.23+ - Strict: This will fail the request with a BadRequest error if any unknown fields would be dropped from the object, or if any duplicate fields are present. The error returned from the server will contain all unknown and duplicate fields encountered.",
+ "in": "query",
+ "name": "fieldValidation",
+ "type": "string",
+ "uniqueItems": true
+ },
+ {
+ "$ref": "#/parameters/pretty-tJGM1-ng"
+ }
+ ],
+ "post": {
+ "consumes": [
+ "*/*"
+ ],
+ "description": "create a SelfSubjectReview",
+ "operationId": "createAuthenticationV1beta1SelfSubjectReview",
+ "parameters": [
+ {
+ "in": "body",
+ "name": "body",
+ "required": true,
+ "schema": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ }
+ ],
+ "produces": [
+ "application/json",
+ "application/yaml",
+ "application/vnd.kubernetes.protobuf",
+ "application/cbor"
+ ],
+ "responses": {
+ "200": {
+ "description": "OK",
+ "schema": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "201": {
+ "description": "Created",
+ "schema": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "202": {
+ "description": "Accepted",
+ "schema": {
+ "$ref": "#/definitions/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "401": {
+ "description": "Unauthorized"
+ }
+ },
+ "schemes": [
+ "https"
+ ],
+ "tags": [
+ "authentication_v1beta1"
+ ],
+ "x-kubernetes-action": "post",
+ "x-kubernetes-group-version-kind": {
+ "group": "authentication.k8s.io",
+ "kind": "SelfSubjectReview",
+ "version": "v1beta1"
+ }
+ }
+ },
"/apis/authorization.k8s.io/": {
"get": {
"consumes": [
--- api/openapi-spec/v3/apis__authentication.k8s.io__v1beta1_openapi.json
@@ -0,0 +1,610 @@
+{
+ "components": {
+ "schemas": {
+ "io.k8s.api.authentication.v1.UserInfo": {
+ "description": "UserInfo holds the information about the user needed to implement the user.Info interface.",
+ "properties": {
+ "extra": {
+ "additionalProperties": {
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array"
+ },
+ "description": "Any additional information provided by the authenticator.",
+ "type": "object"
+ },
+ "groups": {
+ "description": "The names of groups this user is a part of.",
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "atomic"
+ },
+ "uid": {
+ "description": "A unique value that identifies this user across time. If this user is deleted and another user by the same name is added, they will have different UIDs.",
+ "type": "string"
+ },
+ "username": {
+ "description": "The name that uniquely identifies this user among all active users.",
+ "type": "string"
+ }
+ },
+ "type": "object"
+ },
+ "io.k8s.api.authentication.v1beta1.SelfSubjectReview": {
+ "description": "SelfSubjectReview contains the user information that the kube-apiserver has about the user making this request. When using impersonation, users will receive the user info of the user being impersonated. If impersonation or request header authentication is used, any extra keys will have their case ignored and returned as lowercase.",
+ "properties": {
+ "apiVersion": {
+ "description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#resources",
+ "type": "string"
+ },
+ "kind": {
+ "description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds",
+ "type": "string"
+ },
+ "metadata": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta"
+ }
+ ],
+ "default": {},
+ "description": "Standard object's metadata. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#metadata"
+ },
+ "status": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReviewStatus"
+ }
+ ],
+ "default": {},
+ "description": "Status is filled in by the server with the user attributes."
+ }
+ },
+ "type": "object",
+ "x-kubernetes-group-version-kind": [
+ {
+ "group": "authentication.k8s.io",
+ "kind": "SelfSubjectReview",
+ "version": "v1beta1"
+ }
+ ]
+ },
+ "io.k8s.api.authentication.v1beta1.SelfSubjectReviewStatus": {
+ "description": "SelfSubjectReviewStatus is filled by the kube-apiserver and sent back to a user.",
+ "properties": {
+ "userInfo": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1.UserInfo"
+ }
+ ],
+ "default": {},
+ "description": "User attributes of the user making this request."
+ }
+ },
+ "type": "object"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.APIResource": {
+ "description": "APIResource specifies the name of a resource and whether it is namespaced.",
+ "properties": {
+ "categories": {
+ "description": "categories is a list of the grouped resources this resource belongs to (e.g. 'all')",
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "atomic"
+ },
+ "group": {
+ "description": "group is the preferred group of the resource. Empty implies the group of the containing resource list. For subresources, this may have a different value, for example: Scale\".",
+ "type": "string"
+ },
+ "kind": {
+ "default": "",
+ "description": "kind is the kind for the resource (e.g. 'Foo' is the kind for a resource 'foo')",
+ "type": "string"
+ },
+ "name": {
+ "default": "",
+ "description": "name is the plural name of the resource.",
+ "type": "string"
+ },
+ "namespaced": {
+ "default": false,
+ "description": "namespaced indicates if a resource is namespaced or not.",
+ "type": "boolean"
+ },
+ "shortNames": {
+ "description": "shortNames is a list of suggested short names of the resource.",
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "atomic"
+ },
+ "singularName": {
+ "default": "",
+ "description": "singularName is the singular name of the resource. This allows clients to handle plural and singular opaquely. The singularName is more correct for reporting status on a single item and both singular and plural are allowed from the kubectl CLI interface.",
+ "type": "string"
+ },
+ "storageVersionHash": {
+ "description": "The hash value of the storage version, the version this resource is converted to when written to the data store. Value must be treated as opaque by clients. Only equality comparison on the value is valid. This is an alpha feature and may change or be removed in the future. The field is populated by the apiserver only if the StorageVersionHash feature gate is enabled. This field will remain optional even if it graduates.",
+ "type": "string"
+ },
+ "verbs": {
+ "description": "verbs is a list of supported kube verbs (this includes get, list, watch, create, update, patch, delete, deletecollection, and proxy)",
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array"
+ },
+ "version": {
+ "description": "version is the preferred version of the resource. Empty implies the version of the containing resource list For subresources, this may have a different value, for example: v1 (while inside a v1beta1 version of the core resource's group)\".",
+ "type": "string"
+ }
+ },
+ "required": [
+ "name",
+ "singularName",
+ "namespaced",
+ "kind",
+ "verbs"
+ ],
+ "type": "object"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList": {
+ "description": "APIResourceList is a list of APIResource, it is used to expose the name of the resources supported in a specific group and version, and if the resource is namespaced.",
+ "properties": {
+ "apiVersion": {
+ "description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#resources",
+ "type": "string"
+ },
+ "groupVersion": {
+ "default": "",
+ "description": "groupVersion is the group and version this APIResourceList is for.",
+ "type": "string"
+ },
+ "kind": {
+ "description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds",
+ "type": "string"
+ },
+ "resources": {
+ "description": "resources contains the name of the resources and if they are namespaced.",
+ "items": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.APIResource"
+ }
+ ],
+ "default": {}
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "atomic"
+ }
+ },
+ "required": [
+ "groupVersion",
+ "resources"
+ ],
+ "type": "object",
+ "x-kubernetes-group-version-kind": [
+ {
+ "group": "",
+ "kind": "APIResourceList",
+ "version": "v1"
+ }
+ ]
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.FieldsV1": {
+ "description": "FieldsV1 stores a set of fields in a data structure like a Trie, in JSON format.\n\nEach key is either a '.' representing the field itself, and will always map to an empty set, or a string representing a sub-field or item. The string will follow one of these four formats: 'f:<name>', where <name> is the name of a field in a struct, or key in a map 'v:<value>', where <value> is the exact json formatted value of a list item 'i:<index>', where <index> is position of a item in a list 'k:<keys>', where <keys> is a map of a list item's key fields to their unique values If a key maps to an empty Fields value, the field that key represents is part of the set.\n\nThe exact format is defined in sigs.k8s.io/structured-merge-diff",
+ "type": "object"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.ManagedFieldsEntry": {
+ "description": "ManagedFieldsEntry is a workflow-id, a FieldSet and the group version of the resource that the fieldset applies to.",
+ "properties": {
+ "apiVersion": {
+ "description": "APIVersion defines the version of this resource that this field set applies to. The format is \"group/version\" just like the top-level APIVersion field. It is necessary to track the version of a field set because it cannot be automatically converted.",
+ "type": "string"
+ },
+ "fieldsType": {
+ "description": "FieldsType is the discriminator for the different fields format and version. There is currently only one possible value: \"FieldsV1\"",
+ "type": "string"
+ },
+ "fieldsV1": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.FieldsV1"
+ }
+ ],
+ "description": "FieldsV1 holds the first JSON version format as described in the \"FieldsV1\" type."
+ },
+ "manager": {
+ "description": "Manager is an identifier of the workflow managing these fields.",
+ "type": "string"
+ },
+ "operation": {
+ "description": "Operation is the type of operation which lead to this ManagedFieldsEntry being created. The only valid values for this field are 'Apply' and 'Update'.",
+ "type": "string"
+ },
+ "subresource": {
+ "description": "Subresource is the name of the subresource used to update that object, or empty string if the object was updated through the main resource. The value of this field is used to distinguish between managers, even if they share the same name. For example, a status update will be distinct from a regular update using the same manager name. Note that the APIVersion field is not related to the Subresource field and it always corresponds to the version of the main resource.",
+ "type": "string"
+ },
+ "time": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.Time"
+ }
+ ],
+ "description": "Time is the timestamp of when the ManagedFields entry was added. The timestamp will also be updated if a field is added, the manager changes any of the owned fields value or removes a field. The timestamp does not update when a field is removed from the entry because another manager took it over."
+ }
+ },
+ "type": "object"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta": {
+ "description": "ObjectMeta is metadata that all persisted resources must have, which includes all objects users must create.",
+ "properties": {
+ "annotations": {
+ "additionalProperties": {
+ "default": "",
+ "type": "string"
+ },
+ "description": "Annotations is an unstructured key value map stored with a resource that may be set by external tools to store and retrieve arbitrary metadata. They are not queryable and should be preserved when modifying objects. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/annotations",
+ "type": "object"
+ },
+ "creationTimestamp": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.Time"
+ }
+ ],
+ "description": "CreationTimestamp is a timestamp representing the server time when this object was created. It is not guaranteed to be set in happens-before order across separate operations. Clients may not set this value. It is represented in RFC3339 form and is in UTC.\n\nPopulated by the system. Read-only. Null for lists. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#metadata"
+ },
+ "deletionGracePeriodSeconds": {
+ "description": "Number of seconds allowed for this object to gracefully terminate before it will be removed from the system. Only set when deletionTimestamp is also set. May only be shortened. Read-only.",
+ "format": "int64",
+ "type": "integer"
+ },
+ "deletionTimestamp": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.Time"
+ }
+ ],
+ "description": "DeletionTimestamp is RFC 3339 date and time at which this resource will be deleted. This field is set by the server when a graceful deletion is requested by the user, and is not directly settable by a client. The resource is expected to be deleted (no longer visible from resource lists, and not reachable by name) after the time in this field, once the finalizers list is empty. As long as the finalizers list contains items, deletion is blocked. Once the deletionTimestamp is set, this value may not be unset or be set further into the future, although it may be shortened or the resource may be deleted prior to this time. For example, a user may request that a pod is deleted in 30 seconds. The Kubelet will react by sending a graceful termination signal to the containers in the pod. After that 30 seconds, the Kubelet will send a hard termination signal (SIGKILL) to the container and after cleanup, remove the pod from the API. In the presence of network partitions, this object may still exist after this timestamp, until an administrator or automated process can determine the resource is fully terminated. If not set, graceful deletion of the object has not been requested.\n\nPopulated by the system when a graceful deletion is requested. Read-only. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#metadata"
+ },
+ "finalizers": {
+ "description": "Must be empty before the object is deleted from the registry. Each entry is an identifier for the responsible component that will remove the entry from the list. If the deletionTimestamp of the object is non-nil, entries in this list can only be removed. Finalizers may be processed and removed in any order. Order is NOT enforced because it introduces significant risk of stuck finalizers. finalizers is a shared field, any actor with permission can reorder it. If the finalizer list is processed in order, then this can lead to a situation in which the component responsible for the first finalizer in the list is waiting for a signal (field value, external system, or other) produced by a component responsible for a finalizer later in the list, resulting in a deadlock. Without enforced ordering finalizers are free to order amongst themselves and are not vulnerable to ordering changes in the list.",
+ "items": {
+ "default": "",
+ "type": "string"
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "set",
+ "x-kubernetes-patch-strategy": "merge"
+ },
+ "generateName": {
+ "description": "GenerateName is an optional prefix, used by the server, to generate a unique name ONLY IF the Name field has not been provided. If this field is used, the name returned to the client will be different than the name passed. This value will also be combined with a unique suffix. The provided value has the same validation rules as the Name field, and may be truncated by the length of the suffix required to make the value unique on the server.\n\nIf this field is specified and the generated name exists, the server will return a 409.\n\nApplied only if Name is not specified. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#idempotency",
+ "type": "string"
+ },
+ "generation": {
+ "description": "A sequence number representing a specific generation of the desired state. Populated by the system. Read-only.",
+ "format": "int64",
+ "type": "integer"
+ },
+ "labels": {
+ "additionalProperties": {
+ "default": "",
+ "type": "string"
+ },
+ "description": "Map of string keys and values that can be used to organize and categorize (scope and select) objects. May match selectors of replication controllers and services. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/labels",
+ "type": "object"
+ },
+ "managedFields": {
+ "description": "ManagedFields maps workflow-id and version to the set of fields that are managed by that workflow. This is mostly for internal housekeeping, and users typically shouldn't need to set or understand this field. A workflow can be the user's name, a controller's name, or the name of a specific apply path like \"ci-cd\". The set of fields is always in the version that the workflow used when modifying the object.",
+ "items": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.ManagedFieldsEntry"
+ }
+ ],
+ "default": {}
+ },
+ "type": "array",
+ "x-kubernetes-list-type": "atomic"
+ },
+ "name": {
+ "description": "Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names#names",
+ "type": "string"
+ },
+ "namespace": {
+ "description": "Namespace defines the space within which each name must be unique. An empty namespace is equivalent to the \"default\" namespace, but \"default\" is the canonical representation. Not all objects are required to be scoped to a namespace - the value of this field for those objects will be empty.\n\nMust be a DNS_LABEL. Cannot be updated. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces",
+ "type": "string"
+ },
+ "ownerReferences": {
+ "description": "List of objects depended by this object. If ALL objects in the list have been deleted, this object will be garbage collected. If this object is managed by a controller, then an entry in this list will point to this controller, with the controller field set to true. There cannot be more than one managing controller.",
+ "items": {
+ "allOf": [
+ {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.OwnerReference"
+ }
+ ],
+ "default": {}
+ },
+ "type": "array",
+ "x-kubernetes-list-map-keys": [
+ "uid"
+ ],
+ "x-kubernetes-list-type": "map",
+ "x-kubernetes-patch-merge-key": "uid",
+ "x-kubernetes-patch-strategy": "merge"
+ },
+ "resourceVersion": {
+ "description": "An opaque value that represents the internal version of this object that can be used by clients to determine when objects have changed. May be used for optimistic concurrency, change detection, and the watch operation on a resource or set of resources. Clients must treat these values as opaque and passed unmodified back to the server. They may only be valid for a particular resource or set of resources.\n\nPopulated by the system. Read-only. Value must be treated as opaque by clients and . More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#concurrency-control-and-consistency",
+ "type": "string"
+ },
+ "selfLink": {
+ "description": "Deprecated: selfLink is a legacy read-only field that is no longer populated by the system.",
+ "type": "string"
+ },
+ "uid": {
+ "description": "UID is the unique in time and space value for this object. It is typically generated by the server on successful creation of a resource and is not allowed to change on PUT operations.\n\nPopulated by the system. Read-only. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names#uids",
+ "type": "string"
+ }
+ },
+ "type": "object"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.OwnerReference": {
+ "description": "OwnerReference contains enough information to let you identify an owning object. An owning object must be in the same namespace as the dependent, or be cluster-scoped, so there is no namespace field.",
+ "properties": {
+ "apiVersion": {
+ "default": "",
+ "description": "API version of the referent.",
+ "type": "string"
+ },
+ "blockOwnerDeletion": {
+ "description": "If true, AND if the owner has the \"foregroundDeletion\" finalizer, then the owner cannot be deleted from the key-value store until this reference is removed. See https://kubernetes.io/docs/concepts/architecture/garbage-collection/#foreground-deletion for how the garbage collector interacts with this field and enforces the foreground deletion. Defaults to false. To set this field, a user needs \"delete\" permission of the owner, otherwise 422 (Unprocessable Entity) will be returned.",
+ "type": "boolean"
+ },
+ "controller": {
+ "description": "If true, this reference points to the managing controller.",
+ "type": "boolean"
+ },
+ "kind": {
+ "default": "",
+ "description": "Kind of the referent. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds",
+ "type": "string"
+ },
+ "name": {
+ "default": "",
+ "description": "Name of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names#names",
+ "type": "string"
+ },
+ "uid": {
+ "default": "",
+ "description": "UID of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names#uids",
+ "type": "string"
+ }
+ },
+ "required": [
+ "apiVersion",
+ "kind",
+ "name",
+ "uid"
+ ],
+ "type": "object",
+ "x-kubernetes-map-type": "atomic"
+ },
+ "io.k8s.apimachinery.pkg.apis.meta.v1.Time": {
+ "description": "Time is a wrapper around time.Time which supports correct marshaling to YAML and JSON. Wrappers are provided for many of the factory methods that the time package offers.",
+ "format": "date-time",
+ "type": "string"
+ }
+ },
+ "securitySchemes": {
+ "BearerToken": {
+ "description": "Bearer Token authentication",
+ "in": "header",
+ "name": "authorization",
+ "type": "apiKey"
+ }
+ }
+ },
+ "info": {
+ "title": "Kubernetes",
+ "version": "unversioned"
+ },
+ "openapi": "3.0.0",
+ "paths": {
+ "/apis/authentication.k8s.io/v1beta1/": {
+ "get": {
+ "description": "get available resources",
+ "operationId": "getAuthenticationV1beta1APIResources",
+ "responses": {
+ "200": {
+ "content": {
+ "application/cbor": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList"
+ }
+ },
+ "application/json": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList"
+ }
+ },
+ "application/vnd.kubernetes.protobuf": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList"
+ }
+ },
+ "application/yaml": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.apis.meta.v1.APIResourceList"
+ }
+ }
+ },
+ "description": "OK"
+ },
+ "401": {
+ "description": "Unauthorized"
+ }
+ },
+ "tags": [
+ "authentication_v1beta1"
+ ]
+ }
+ },
+ "/apis/authentication.k8s.io/v1beta1/selfsubjectreviews": {
+ "parameters": [
+ {
+ "description": "When present, indicates that modifications should not be persisted. An invalid or unrecognized dryRun directive will result in an error response and no further processing of the request. Valid values are: - All: all dry run stages will be processed",
+ "in": "query",
+ "name": "dryRun",
+ "schema": {
+ "type": "string",
+ "uniqueItems": true
+ }
+ },
+ {
+ "description": "fieldManager is a name associated with the actor or entity that is making these changes. The value must be less than or 128 characters long, and only contain printable characters, as defined by https://golang.org/pkg/unicode/#IsPrint.",
+ "in": "query",
+ "name": "fieldManager",
+ "schema": {
+ "type": "string",
+ "uniqueItems": true
+ }
+ },
+ {
+ "description": "fieldValidation instructs the server on how to handle objects in the request (POST/PUT/PATCH) containing unknown or duplicate fields. Valid values are: - Ignore: This will ignore any unknown fields that are silently dropped from the object, and will ignore all but the last duplicate field that the decoder encounters. This is the default behavior prior to v1.23. - Warn: This will send a warning via the standard warning response header for each unknown field that is dropped from the object, and for each duplicate field that is encountered. The request will still succeed if there are no other errors, and will only persist the last of any duplicate fields. This is the default in v1.23+ - Strict: This will fail the request with a BadRequest error if any unknown fields would be dropped from the object, or if any duplicate fields are present. The error returned from the server will contain all unknown and duplicate fields encountered.",
+ "in": "query",
+ "name": "fieldValidation",
+ "schema": {
+ "type": "string",
+ "uniqueItems": true
+ }
+ },
+ {
+ "description": "If 'true', then the output is pretty printed. Defaults to 'false' unless the user-agent indicates a browser or command-line HTTP tool (curl and wget).",
+ "in": "query",
+ "name": "pretty",
+ "schema": {
+ "type": "string",
+ "uniqueItems": true
+ }
+ }
+ ],
+ "post": {
+ "description": "create a SelfSubjectReview",
+ "operationId": "createAuthenticationV1beta1SelfSubjectReview",
+ "requestBody": {
+ "content": {
+ "*/*": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ }
+ },
+ "required": true
+ },
+ "responses": {
+ "200": {
+ "content": {
+ "application/cbor": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/json": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/vnd.kubernetes.protobuf": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/yaml": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ }
+ },
+ "description": "OK"
+ },
+ "201": {
+ "content": {
+ "application/cbor": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/json": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/vnd.kubernetes.protobuf": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/yaml": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ }
+ },
+ "description": "Created"
+ },
+ "202": {
+ "content": {
+ "application/cbor": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/json": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/vnd.kubernetes.protobuf": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ },
+ "application/yaml": {
+ "schema": {
+ "$ref": "#/components/schemas/io.k8s.api.authentication.v1beta1.SelfSubjectReview"
+ }
+ }
+ },
+ "description": "Accepted"
+ },
+ "401": {
+ "description": "Unauthorized"
+ }
+ },
+ "tags": [
+ "authentication_v1beta1"
+ ],
+ "x-kubernetes-action": "post",
+ "x-kubernetes-group-version-kind": {
+ "group": "authentication.k8s.io",
+ "kind": "SelfSubjectReview",
+ "version": "v1beta1"
+ }
+ }
+ }
+ }
+}
| kubernetes | kubernetes | Go | Go | 113,460 | 40,344 | Production-Grade Container Scheduling and Management | kubernetes_kubernetes | CODE_IMPROVEMENT | seems like redundant files have been deleted |
b3da2ff8f494c91fb19779d64db47fff521a155e | 2024-10-19 22:31:03 | Blackspirits | Translated using Weblate (Portuguese) Translation: Jellyfin/Jellyfin Translate-URL: https://translate.jellyfin.org/projects/jellyfin/jellyfin-core/pt/ | false | 12 | 12 | 24 |
--- Emby.Server.Implementations/Localization/Core/pt.json
@@ -18,7 +18,7 @@
"Channels": "Canais",
"UserDownloadingItemWithValues": "{0} está sendo baixado {1}",
"VersionNumber": "Versão {0}",
- "ValueHasBeenAddedToLibrary": "{0} foi adicionado à sua mediateca",
+ "ValueHasBeenAddedToLibrary": "{0} foi adicionado à sua biblioteca multimédia",
"UserStoppedPlayingItemWithValues": "{0} terminou a reprodução de {1} em {2}",
"UserStartedPlayingItemWithValues": "{0} está reproduzindo {1} em {2}",
"UserPolicyUpdatedWithName": "A política do usuário {0} foi alterada",
@@ -71,8 +71,8 @@
"Latest": "Mais Recente",
"LabelRunningTimeValue": "Duração: {0}",
"LabelIpAddressValue": "Endereço de IP: {0}",
- "ItemRemovedWithName": "{0} foi removido da mediateca",
- "ItemAddedWithName": "{0} foi adicionado à mediateca",
+ "ItemRemovedWithName": "{0} foi removido da biblioteca",
+ "ItemAddedWithName": "{0} foi adicionado à biblioteca",
"Inherit": "Herdar",
"HomeVideos": "Vídeos Caseiros",
"HeaderRecordingGroups": "Grupos de Gravação",
@@ -93,33 +93,33 @@
"AppDeviceValues": "Aplicação: {0}, Dispositivo: {1}",
"TaskCleanCache": "Limpar Diretório de Cache",
"TasksApplicationCategory": "Aplicação",
- "TasksLibraryCategory": "Mediateca",
+ "TasksLibraryCategory": "Biblioteca",
"TasksMaintenanceCategory": "Manutenção",
"TaskRefreshChannels": "Atualizar Canais",
"TaskUpdatePlugins": "Atualizar Plugins",
"TaskCleanLogsDescription": "Deletar arquivos de log que existe a mais de {0} dias.",
"TaskCleanLogs": "Limpar diretório de logs",
- "TaskRefreshLibrary": "Analisar mediateca",
+ "TaskRefreshLibrary": "Escanear biblioteca de mídias",
"TaskRefreshChapterImagesDescription": "Cria miniaturas para vídeos que têm capítulos.",
"TaskCleanCacheDescription": "Apaga ficheiros em cache que já não são usados pelo sistema.",
"TasksChannelsCategory": "Canais de Internet",
"TaskRefreshChapterImages": "Extrair Imagens do Capítulo",
"TaskDownloadMissingSubtitlesDescription": "Pesquisa na Internet as legendas em falta com base na configuração de metadados.",
- "TaskDownloadMissingSubtitles": "Transferir legendas em falta",
+ "TaskDownloadMissingSubtitles": "Download das legendas em falta",
"TaskRefreshChannelsDescription": "Atualiza as informações do canal da Internet.",
"TaskCleanTranscodeDescription": "Apagar os ficheiros com mais de um dia, de Transcode.",
"TaskCleanTranscode": "Limpar o diretório de Transcode",
"TaskUpdatePluginsDescription": "Baixa e instala as atualizações para plug-ins configurados para atualização automática.",
- "TaskRefreshPeopleDescription": "Atualizar metadados para elenco e equipa técnica da tua mediateca.",
+ "TaskRefreshPeopleDescription": "Atualiza os metadados para atores e diretores na tua biblioteca de media.",
"TaskRefreshPeople": "Atualizar pessoas",
- "TaskRefreshLibraryDescription": "Analisar a mediateca para novos ficheiros e atualizar os metadados.",
+ "TaskRefreshLibraryDescription": "Pesquisa sua biblioteca de media por novos arquivos e atualiza os metadados.",
"TaskCleanActivityLog": "Limpar registro de atividade",
"Undefined": "Indefinido",
"Forced": "Forçado",
"Default": "Predefinição",
"TaskCleanActivityLogDescription": "Apaga itens no registro com idade acima do que é configurado.",
"TaskOptimizeDatabase": "Otimizar base de dados",
- "TaskOptimizeDatabaseDescription": "Otimiza e liberta espaço livre na base de dados. A execução desta tarefa depois de analisar a mediateca ou efetuar outras alterações que impliquem modificações na base de dados pode melhorar o desempenho.",
+ "TaskOptimizeDatabaseDescription": "Base de dados compacta e corta espaço livre. A execução desta tarefa depois de digitalizar a biblioteca ou de fazer outras alterações que impliquem modificações na base de dados pode melhorar o desempenho.",
"External": "Externo",
"HearingImpaired": "Problemas auditivos",
"TaskKeyframeExtractor": "Extrator de quadro-chave",
@@ -133,7 +133,7 @@
"TaskDownloadMissingLyrics": "Transferir letra em falta",
"TaskDownloadMissingLyricsDescription": "Transferir letra para músicas",
"TaskMoveTrickplayImagesDescription": "Transfere ficheiros de miniatura de vídeo, conforme as definições da biblioteca.",
- "TaskExtractMediaSegments": "Analisar segmentos de multimédia",
- "TaskExtractMediaSegmentsDescription": "Extrai ou obtém segmentos de multimédia a partir de plugins com suporte para MediaSegment.",
- "TaskMoveTrickplayImages": "Migrar a localização da imagem do Trickplay"
+ "TaskExtractMediaSegments": "Varrimento de segmentos da média",
+ "TaskExtractMediaSegmentsDescription": "Extrai ou obtém segmentos de média de extensões com suporte a MediaSegment.",
+ "TaskMoveTrickplayImages": "Migração de miniaturas de vídeo"
}
|
jellyfin
|
jellyfin
|
C#
|
C#
| 37,617
| 3,375
|
The Free Software Media System - Server Backend & API
|
jellyfin_jellyfin
|
CONFIG_CHANGE
|
translations changed
|
6391c3dfbf19064b67cb0bae7d9b70a476260456
|
2023-03-22 20:21:27
|
schochastics
|
added ordering description
| false
| 3
| 1
| 4
|
--- README.md
@@ -5,9 +5,7 @@
> Inspired by [Awesome Network
> Analysis](https://github.com/briatte/awesome-network-analysis) and others.
-The order of entries within categories is either alphabetically or
-chronologically.
-**Please add your resources according to the respective ordering**
+All entries within a category are ordered alphabetically.

|
awesome-computational-social-science
|
gesiscss
|
R
|
R
| 648
| 83
|
A list of awesome resources for Computational Social Science
|
gesiscss_awesome-computational-social-science
|
DOC_CHANGE
|
changes in readme
|
2b2660951dc97e5e6945e641e57fd2815d53d1de
|
2024-01-23 20:21:14
|
PesBandi
|
[QuickAccent]Add E with breve and the pilcrow sign (#31071)
| false
| 2
| 2
| 4
|
--- src/modules/poweraccent/PowerAccent.Core/Languages.cs
@@ -154,7 +154,7 @@ namespace PowerAccent.Core
LetterKey.VK_B => new[] { "ḃ" },
LetterKey.VK_C => new[] { "ċ", "°C", "©", "ℂ", "∁" },
LetterKey.VK_D => new[] { "ḍ", "ḋ", "∂" },
- LetterKey.VK_E => new[] { "∈", "∃", "∄", "∉", "ĕ" },
+ LetterKey.VK_E => new[] { "∈", "∃", "∄", "∉" },
LetterKey.VK_F => new[] { "ḟ", "°F" },
LetterKey.VK_G => new[] { "ģ", "ǧ", "ġ", "ĝ", "ǥ" },
LetterKey.VK_H => new[] { "ḣ", "ĥ", "ħ" },
@@ -164,7 +164,7 @@ namespace PowerAccent.Core
LetterKey.VK_M => new[] { "ṁ" },
LetterKey.VK_N => new[] { "ņ", "ṅ", "ⁿ", "ℕ" },
LetterKey.VK_O => new[] { "ȯ", "∅" },
- LetterKey.VK_P => new[] { "ṗ", "℗", "∏", "¶" },
+ LetterKey.VK_P => new[] { "ṗ", "℗", "∏" },
LetterKey.VK_Q => new[] { "ℚ" },
LetterKey.VK_R => new[] { "ṙ", "®", "ℝ" },
LetterKey.VK_S => new[] { "ṡ", "§", "∑" },
|
powertoys
|
microsoft
|
C#
|
C#
| 115,301
| 6,789
|
Windows system utilities to maximize productivity
|
microsoft_powertoys
|
NEW_FEAT
|
new symbols added
|
b0b172d47fe821cd319cebcf7ccf5859d842ac91
|
2024-10-09 22:05:14
|
Hadley Wickham
|
Reduce test duplication by extracting repeated code out into functions (#106)
| false
| 242
| 320
| 562
|
--- tests/testthat/_snaps/provider-claude.md
@@ -1,9 +1,4 @@
-# defaults are reported
-
- Code
- . <- chat_claude()
-
-# all tool variations work
+# can make an async tool call
Code
chat$chat("Great. Do it again.")
@@ -12,10 +7,10 @@
! Can't use async tools with `$chat()` or `$stream()`.
i Async tools are supported, but you must use `$chat_async()` or `$stream_async()`.
-# can use images
+# can use inline images
Code
- . <- chat$chat("What's in this image?", image_remote)
+ . <- chat$chat("What's in this image?", remote_image)
Condition
Error:
! Claude doesn't support remote images
--- tests/testthat/_snaps/provider-gemini.md
@@ -1,11 +1,4 @@
-# defaults are reported
-
- Code
- . <- chat_gemini()
- Message
- Using model = "gemini-1.5-flash".
-
-# all tool variations work
+# can make an async tool call
Code
chat$chat("Great. Do it again.")
@@ -14,7 +7,7 @@
! Can't use async tools with `$chat()` or `$stream()`.
i Async tools are supported, but you must use `$chat_async()` or `$stream_async()`.
-# can use images
+# can use images (inline and remote)
Code
. <- chat$chat("What's in this image?", image_remote)
--- tests/testthat/_snaps/provider-openai.md
@@ -1,11 +1,11 @@
-# defaults are reported
+# default model is reported
Code
- . <- chat_openai()
+ . <- chat_openai()$chat("Hi")
Message
Using model = "gpt-4o-mini".
-# all tool variations work
+# can make an async tool call
Code
chat$chat("Great. Do it again.")
--- tests/testthat/helper-provider.R
@@ -1,152 +0,0 @@
-retry_test <- function(code, retries = 1) {
- code <- enquo(code)
-
- i <- 1
- while (i <= retries) {
- tryCatch(
- {
- return(eval(get_expr(code), get_env(code)))
- break
- },
- expectation_failure = function(cnd) NULL
- )
- i <- i + 1
- }
-
- eval(get_expr(code), get_env(code))
-}
-
-# Turns ------------------------------------------------------------------
-
-test_turns_system <- function(chat_fun) {
- system_prompt <- "Return very minimal output, AND ONLY USE UPPERCASE."
-
- chat <- chat_fun(system_prompt = system_prompt)
- resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
- expect_match(resp, "CHRISTOPHER ROBIN")
- expect_length(chat$turns(), 2)
-
- chat <- chat_fun(turns = list(Turn("system", system_prompt)))
- resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
- expect_match(resp, "CHRISTOPHER ROBIN")
- expect_length(chat$turns(), 2)
-}
-
-test_turns_existing <- function(chat_fun) {
- chat <- chat_fun(turns = list(
- Turn("system", "Return very minimal output; no punctuation."),
- Turn("user", "List the names of any 8 of Santa's 9 reindeer."),
- Turn("assistant", "Dasher, Dancer, Vixen, Comet, Cupid, Donner, Blitzen, and Rudolph.")
- ))
- expect_length(chat$turns(), 2)
-
- resp <- chat$chat("Who is the remaining one? Just give the name")
- expect_match(resp, "Prancer")
- expect_length(chat$turns(), 4)
-}
-
-# Tool calls -------------------------------------------------------------
-
-test_tools_simple <- function(chat_fun) {
- chat <- chat_fun(system_prompt = "Be very terse, not even punctuation.")
- chat$register_tool(ToolDef(
- function() "2024-01-01",
- name = "get_date",
- description = "Gets the current date"
- ))
-
- result <- chat$chat("What's the current date in YMD format?")
- expect_match(result, "2024-01-01")
-
- result <- chat$chat("What month is it?")
- expect_match(result, "January")
-}
-
-test_tools_async <- function(chat_fun) {
- chat <- chat_fun(system_prompt = "Be very terse, not even punctuation.")
- chat$register_tool(ToolDef(
- coro::async(function() "2024-01-01"),
- name = "get_date",
- description = "Gets the current date"
- ))
-
- result <- sync(chat$chat_async("What's the current date in YMD format?"))
- expect_match(result, "2024-01-01")
-
- expect_snapshot(chat$chat("Great. Do it again."), error = TRUE)
-}
-
-test_tools_parallel <- function(chat_fun) {
- chat <- chat_fun(system_prompt = "Be very terse, not even punctuation.")
- favourite_color <- function(person) {
- if (person == "Joe") "sage green" else "red"
- }
- chat$register_tool(ToolDef(
- function(person) if (person == "Joe") "sage green" else "red",
- name = "favourite_color",
- description = "Returns a person's favourite colour",
- arguments = list(person = ToolArg("string", "Name of a person")),
- strict = TRUE
- ))
-
- result <- chat$chat("
- What are Joe and Hadley's favourite colours?
- Answer like name1: colour1, name2: colour2
- ")
- expect_match(result, "Joe: sage green")
- expect_match(result, "Hadley: red")
- expect_length(chat$turns(), 4)
-}
-
-test_tools_sequential <- function(chat_fun, total_calls) {
- chat <- chat_fun(system_prompt = "Be very terse, not even punctuation.")
- chat$register_tool(ToolDef(
- function() 2024,
- name = "get_year",
- description = "Get the current year"
- ))
- chat$register_tool(ToolDef(
- function(year) if (year == 2024) "Susan" else "I don't know",
- name = "popular_name",
- description = "Gets the most popular name for a year",
- arguments = list(year = ToolArg("integer", "Year"))
- ))
-
- result <- chat$chat("What was the most popular name this year.")
- expect_match(result, "Susan")
- expect_length(chat$turns(), total_calls)
-}
-
-
-# Images -----------------------------------------------------------------
-
-test_images_inline <- function(chat_fun) {
- chat <- chat_fun()
- response <- chat$chat(
- "What's in this image? (Be sure to mention the outside shape)",
- content_image_file(system.file("httr2.png", package = "elmer"))
- )
- expect_match(response, "hex")
- expect_match(response, "baseball")
-}
-
-test_images_remote <- function(chat_fun) {
- chat <- chat_fun()
- response <- chat$chat(
- "What's in this image? (Be sure to mention the outside shape)",
- content_image_url("https://httr2.r-lib.org/logo.png")
- )
- expect_match(response, "hex")
- expect_match(response, "baseball")
-}
-
-test_images_remote_error <- function(chat_fun) {
- chat <- chat_fun()
-
- image_remote <- content_image_url("https://httr2.r-lib.org/logo.png")
- expect_snapshot(
- . <- chat$chat("What's in this image?", image_remote),
- error = TRUE
- )
- expect_length(chat$turns(), 0)
-}
--- tests/testthat/test-provider-claude.R
@@ -3,13 +3,10 @@ test_that("can make simple request", {
resp <- chat$chat("What is 1 + 1?")
expect_match(resp, "2")
expect_equal(chat$last_turn()@tokens, c(26, 5))
-})
-test_that("can make simple async request", {
- chat <- chat_claude("Be as terse as possible; no punctuation")
resp <- sync(chat$chat_async("What is 1 + 1?"))
expect_match(resp, "2")
- expect_equal(chat$last_turn()@tokens, c(26, 5))
+ expect_equal(chat$last_turn()@tokens, c(43, 5))
})
test_that("can make simple streaming request", {
@@ -21,31 +18,115 @@ test_that("can make simple streaming request", {
expect_match(paste0(unlist(resp), collapse = ""), "2")
})
-# Common provider interface -----------------------------------------------
+test_that("system prompt can be passed explicitly or as a turn", {
+ system_prompt <- "Return very minimal output, AND ONLY USE UPPERCASE."
-test_that("defaults are reported", {
- expect_snapshot(. <- chat_claude())
+ chat <- chat_claude(system_prompt = system_prompt)
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
+
+ chat <- chat_claude(turns = list(Turn("system", system_prompt)))
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
})
-test_that("respects turns interface", {
- chat_fun <- chat_claude
+test_that("existing conversation history is used", {
+ chat <- chat_claude(turns = list(
+ Turn("system", "Return very minimal output; no punctuation."),
+ Turn("user", "List the names of any 8 of Santa's 9 reindeer."),
+ Turn("assistant", "Dasher, Dancer, Vixen, Comet, Cupid, Donner, Blitzen, and Rudolph.")
+ ))
+ expect_length(chat$turns(), 2)
- test_turns_system(chat_fun)
- test_turns_existing(chat_fun)
+ resp <- chat$chat("Who is the remaining one? Just give the name")
+ expect_equal(resp, "Prancer")
+ expect_length(chat$turns(), 4)
})
-test_that("all tool variations work", {
- chat_fun <- chat_claude
+# Tool calls -------------------------------------------------------------------
+
+test_that("can make a simple tool call", {
+ chat <- chat_claude(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function() "2024-01-01",
+ name = "get_date",
+ description = "Gets the current date"
+ ))
- test_tools_simple(chat_fun)
- test_tools_async(chat_fun)
- test_tools_parallel(chat_fun)
- test_tools_sequential(chat_fun, total_calls = 6)
+ result <- chat$chat("What's the current date in YMD format?")
+ expect_match(result, "2024-01-01")
+
+ result <- chat$chat("What month is it?")
+ expect_match(result, "January")
})
-test_that("can use images", {
- chat_fun <- chat_claude
+test_that("can make an async tool call", {
+ chat <- chat_claude(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ coro::async(function() "2024-01-01"),
+ name = "get_date",
+ description = "Gets the current date"
+ ))
+
+ result <- sync(chat$chat_async("What's the current date in YMD format?"))
+ expect_match(result, "2024-01-01")
+
+ expect_snapshot(chat$chat("Great. Do it again."), error = TRUE)
+})
+
+test_that("can call multiple tools in parallel", {
+ chat <- chat_claude(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function(person) if (person == "Joe") "sage green" else "red",
+ name = "favourite_color",
+ description = "Returns a person's favourite colour",
+ arguments = list(person = ToolArg("string", "Name of a person")),
+ strict = TRUE
+ ))
+
+ result <- chat$chat("
+ What are Joe and Hadley's favourite colours?
+ Answer like name1: colour1, name2: colour2
+ ")
+ expect_identical(result, "Joe: sage green, Hadley: red")
+ expect_length(chat$turns(), 4)
+})
+
+test_that("can call multiple tools in sequence", {
+ chat <- chat_claude(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function() 2024,
+ name = "get_year",
+ description = "Get the current year"
+ ))
+ chat$register_tool(ToolDef(
+ function(year) if (year == 2024) "Susan" else "I don't know",
+ name = "popular_name",
+ description = "Gets the most popular name for a year",
+ arguments = list(year = ToolArg("integer", "Year"))
+ ))
+
+ result <- chat$chat("What was the most popular name this year?")
+ expect_match(result, "Susan")
+ expect_length(chat$turns(), 6)
+})
+
+# Images -----------------------------------------------------------------
+
+test_that("can use inline images", {
+ chat <- chat_claude(model = "claude-3-5-sonnet-20240620")
+ response <- chat$chat(
+ "What's in this image? (Be sure to mention the outside shape)",
+ content_image_file(system.file("httr2.png", package = "elmer"))
+ )
+ expect_match(response, "hex")
+ expect_match(response, "baseball")
+ expect_length(chat$turns(), 2)
- test_images_inline(chat_fun)
- test_images_remote_error(chat_fun)
+ remote_image <- content_image_url("https://httr2.r-lib.org/logo.png")
+ expect_snapshot(
+ . <- chat$chat("What's in this image?", remote_image),
+ error = TRUE
+ )
+ expect_length(chat$turns(), 2)
})
--- tests/testthat/test-provider-gemini.R
@@ -1,17 +1,12 @@
-# Getting started --------------------------------------------------------
-
test_that("can make simple request", {
chat <- chat_gemini("Be as terse as possible; no punctuation")
resp <- chat$chat("What is 1 + 1?")
expect_match(resp, "2")
expect_equal(chat$last_turn()@tokens, c(17, 1))
-})
-test_that("can make simple async request", {
- chat <- chat_gemini("Be as terse as possible; no punctuation")
resp <- sync(chat$chat_async("What is 1 + 1?"))
expect_match(resp, "2")
- expect_equal(chat$last_turn()@tokens, c(17, 1))
+ expect_equal(chat$last_turn()@tokens, c(30, 1))
})
test_that("can make simple streaming request", {
@@ -23,36 +18,113 @@ test_that("can make simple streaming request", {
expect_match(paste0(unlist(resp), collapse = ""), "2")
})
-# Common provider interface -----------------------------------------------
+test_that("system prompt can be passed explicitly or as a turn", {
+ system_prompt <- "Return very minimal output, AND ONLY USE UPPERCASE."
+
+ chat <- chat_gemini(system_prompt = system_prompt)
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
-test_that("defaults are reported", {
- expect_snapshot(. <- chat_gemini())
+ chat <- chat_gemini(turns = list(Turn("system", system_prompt)))
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
})
-test_that("respects turns interface", {
- chat_fun <- chat_gemini
+test_that("existing conversation history is used", {
+ chat <- chat_gemini(turns = list(
+ Turn("system", "Return very minimal output; no punctuation."),
+ Turn("user", "List the names of any 8 of Santa's 9 reindeer."),
+ Turn("assistant", "Dasher, Dancer, Vixen, Comet, Cupid, Donner, Blitzen, and Rudolph.")
+ ))
- test_turns_system(chat_fun)
- test_turns_existing(chat_fun)
+ resp <- chat$chat("Who is the remaining one? Just give the name")
+ expect_match(resp, "Prancer")
})
-test_that("all tool variations work", {
- chat_fun <- chat_gemini
+# Tool calls -------------------------------------------------------------------
- test_tools_simple(chat_fun)
- test_tools_async(chat_fun)
- test_tools_parallel(chat_fun)
+test_that("can make a simple tool call", {
+ chat <- chat_gemini(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function() "2024-01-01",
+ name = "get_date",
+ description = "Gets the current date"
+ ))
- # <10% of the time, it uses only 6 calls, suggesting that it's made a poor
- # choice. Running it twice (i.e. retrying 1) should reduce failure rate to <1%
- retry_test(
- test_tools_sequential(chat_fun, total_calls = 8)
- )
+ result <- chat$chat("What's the current date?")
+ expect_match(result, "2024-01-01")
+
+ result <- chat$chat("What month is it?")
+ expect_match(result, "January")
+})
+
+test_that("can make an async tool call", {
+ chat <- chat_gemini(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ coro::async(function() "2024-01-01"),
+ name = "get_date",
+ description = "Gets the current date"
+ ))
+
+ result <- sync(chat$chat_async("What's the current date?"))
+ expect_match(result, "2024-01-01")
+
+ expect_snapshot(chat$chat("Great. Do it again."), error = TRUE)
+})
+
+test_that("can call multiple tools in parallel", {
+ chat <- chat_gemini(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function(person) if (person == "Joe") "sage green" else "red",
+ name = "favourite_color",
+ description = "Returns a person's favourite colour",
+ arguments = list(person = ToolArg("string", "Name of a person")),
+ strict = TRUE
+ ))
+
+ result <- chat$chat("
+ What are Joe and Hadley's favourite colours?
+ Answer like name1: colour1, name2: colour2
+ ")
+ expect_match(result, "Joe: sage green, Hadley: red")
+ expect_length(chat$turns(), 4)
+})
+
+test_that("can call multiple tools in sequence", {
+ chat <- chat_gemini()
+ chat$register_tool(ToolDef(
+ function() 2024,
+ name = "get_year",
+ description = "Get the current year"
+ ))
+ chat$register_tool(ToolDef(
+ function(year) if (year == 2024) "Susan" else "I don't know",
+ name = "popular_name",
+ description = "Gets the most popular name for a year",
+ arguments = list(year = ToolArg("integer", "Year"))
+ ))
+
+ result <- chat$chat("What's the most popular name this year?")
+ expect_match(result, "Susan")
+ expect_length(chat$turns(), 6)
})
-test_that("can use images", {
- chat_fun <- chat_gemini
+# Images -----------------------------------------------------------------
- test_images_inline(chat_fun)
- test_images_remote_error(chat_fun)
+test_that("can use images (inline and remote)", {
+ chat <- chat_gemini()
+ response <- chat$chat(
+ "What's in this image? (Be sure to mention the outside shape)",
+ content_image_file(system.file("httr2.png", package = "elmer"))
+ )
+ expect_match(response, "hex")
+ expect_match(response, "baseball")
+ expect_length(chat$turns(), 2)
+
+ image_remote <- content_image_url("https://httr2.r-lib.org/logo.png")
+ expect_snapshot(
+ . <- chat$chat("What's in this image?", image_remote),
+ error = TRUE
+ )
+ expect_length(chat$turns(), 2)
})
--- tests/testthat/test-provider-openai.R
@@ -1,17 +1,16 @@
-# Getting started --------------------------------------------------------
+test_that("default model is reported", {
+ expect_snapshot(. <- chat_openai()$chat("Hi"))
+})
test_that("can make simple request", {
chat <- chat_openai("Be as terse as possible; no punctuation")
resp <- chat$chat("What is 1 + 1?")
expect_match(resp, "2")
expect_equal(chat$last_turn()@tokens, c(27, 1))
-})
-test_that("can make simple async request", {
- chat <- chat_openai("Be as terse as possible; no punctuation")
resp <- sync(chat$chat_async("What is 1 + 1?"))
expect_match(resp, "2")
- expect_equal(chat$last_turn()@tokens, c(27, 1))
+ expect_equal(chat$last_turn()@tokens, c(44, 1))
})
test_that("can make simple streaming request", {
@@ -23,31 +22,121 @@ test_that("can make simple streaming request", {
expect_match(paste0(unlist(resp), collapse = ""), "2")
})
-# Common provider interface -----------------------------------------------
-test_that("defaults are reported", {
- expect_snapshot(. <- chat_openai())
+test_that("system prompt can be passed explicitly or as a turn", {
+ system_prompt <- "Return very minimal output, AND ONLY USE UPPERCASE."
+
+ chat <- chat_openai(system_prompt = system_prompt)
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
+ expect_length(chat$turns(), 2)
+
+ chat <- chat_openai(turns = list(Turn("system", system_prompt)))
+ resp <- chat$chat("What is the name of Winnie the Pooh's human friend?")
+ expect_match(resp, "CHRISTOPHER ROBIN")
+ expect_length(chat$turns(), 2)
})
-test_that("respects turns interface", {
- chat_fun <- chat_openai
+test_that("existing conversation history is used", {
+ chat <- chat_openai(turns = list(
+ Turn("system", "Return very minimal output; no punctuation."),
+ Turn("user", "List the names of any 8 of Santa's 9 reindeer."),
+ Turn("assistant", "Dasher, Dancer, Vixen, Comet, Cupid, Donner, Blitzen, and Rudolph.")
+ ))
+ expect_length(chat$turns(), 2)
- test_turns_system(chat_fun)
- test_turns_existing(chat_fun)
+ resp <- chat$chat("Who is the remaining one? Just give the name")
+ expect_equal(resp, "Prancer")
+ expect_length(chat$turns(), 4)
})
-test_that("all tool variations work", {
- chat_fun <- chat_openai
+# Tool calls -------------------------------------------------------------------
+
+test_that("can make a simple tool call", {
+ chat <- chat_openai(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function() "2024-01-01",
+ name = "get_date",
+ description = "Gets the current date"
+ ))
+
+ result <- chat$chat("What's the current date in YMD format?")
+ expect_match(result, "2024-01-01")
- test_tools_simple(chat_fun)
- test_tools_async(chat_fun)
- test_tools_parallel(chat_fun)
- test_tools_sequential(chat_fun, total_calls = 6)
+ result <- chat$chat("What month is it?")
+ expect_match(result, "January")
})
-test_that("can use images", {
- chat_fun <- chat_openai
+test_that("can make an async tool call", {
+ chat <- chat_openai(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ coro::async(function() "2024-01-01"),
+ name = "get_date",
+ description = "Gets the current date"
+ ))
+
+ result <- sync(chat$chat_async("What's the current date in YMD format?"))
+ expect_match(result, "2024-01-01")
+
+ expect_snapshot(chat$chat("Great. Do it again."), error = TRUE)
+})
+
+test_that("can call multiple tools in parallel", {
+ chat <- chat_openai(system_prompt = "Be very terse, not even punctuation.")
+ favourite_color <- function(person) {
+ if (person == "Joe") "sage green" else "red"
+ }
+ chat$register_tool(ToolDef(
+ function(person) if (person == "Joe") "sage green" else "red",
+ name = "favourite_color",
+ description = "Returns a person's favourite colour",
+ arguments = list(person = ToolArg("string", "Name of a person")),
+ strict = TRUE
+ ))
+
+ result <- chat$chat("
+ What are Joe and Hadley's favourite colours?
+ Answer like name1: colour1, name2: colour2
+ ")
+ expect_identical(result, "Joe: sage green, Hadley: red")
+ expect_length(chat$turns(), 4)
+})
+
+test_that("can call multiple tools in sequence", {
+ chat <- chat_openai(system_prompt = "Be very terse, not even punctuation.")
+ chat$register_tool(ToolDef(
+ function() 2024,
+ name = "get_year",
+ description = "Get the current year"
+ ))
+ chat$register_tool(ToolDef(
+ function(year) if (year == 2024) "Susan" else "I don't know",
+ name = "popular_name",
+ description = "Gets the most popular name for a year",
+ arguments = list(year = ToolArg("integer", "Year"))
+ ))
+
+ result <- chat$chat("What was the most popular name this year.")
+ expect_equal(result, "Susan")
+ expect_length(chat$turns(), 6)
+})
+
+# Images -----------------------------------------------------------------
+
+test_that("can use images (inline and remote)", {
+ chat <- chat_openai(model = "gpt-4o-mini")
+ response <- chat$chat(
+ "What's in this image? (Be sure to mention the outside shape)",
+ content_image_file(system.file("httr2.png", package = "elmer"))
+ )
+ expect_match(response, "hex")
+ expect_match(response, "baseball")
- test_images_inline(chat_fun)
- test_images_remote(chat_fun)
+ chat <- chat_openai(model = "gpt-4o-mini")
+ response <- chat$chat(
+ "What's in this image? (Be sure to mention the outside shape)",
+ content_image_url("https://httr2.r-lib.org/logo.png")
+ )
+ expect_match(response, "hex")
+ expect_match(response, "baseball")
})
|
ellmer
|
tidyverse
|
R
|
R
| 401
| 55
|
Call LLM APIs from R
|
tidyverse_ellmer
|
CODE_IMPROVEMENT
|
repeated code is replaced by functions
|
9959217cc35b05248c252a0dba5301689dd24ce2
|
2024-09-27 20:40:51
|
fufesou
|
chore (#9491) Signed-off-by: fufesou <[email protected]>
| false
| 2
| 2
| 4
|
--- src/lang/cn.rs
@@ -563,7 +563,7 @@ pub static ref T: std::collections::HashMap<&'static str, &'static str> =
("Plug out all", "拔出所有"),
("True color (4:4:4)", "真彩模式(4:4:4)"),
("Enable blocking user input", "允许阻止用户输入"),
- ("id_input_tip", "可以输入 ID、直连 IP,或域名和端口号(<域名>:<端口号>)。\n要访问另一台服务器上的设备,请附加服务器地址(<ID>@<服务器地址>?key=<密钥>)。比如,\[email protected]:21117?key=5Qbwsde3unUcJBtrx9ZkvUmwFNoExHzpryHuPUdqlWM=。\n要访问公共服务器上的设备,请输入 \"<ID>@public\",无需密钥。\n\n如果您想要在首次连接时,强制走中继连接,请在 ID 的后面添加 \"/r\",例如,\"9123456234/r\"。"),
+ ("id_input_tip", "可以输入 ID、直连 IP,或域名和端口号(<域名>:<端口号>)。\n要访问另一台服务器上的设备,请附加服务器地址(<ID>@<服务器地址>?key=<密钥>)。比如,\[email protected]:21117?key=5Qbwsde3unUcJBtrx9ZkvUmwFNoExHzpryHuPUdqlWM=。\n要访问公共服务器上的设备,请输入 \"<ID>@public\", 无需密钥。\n\n如果您想要在首次连接时,强制走中继连接,请在 ID 的后面添加 \"/r\",例如,\"9123456234/r\"。"),
("privacy_mode_impl_mag_tip", "模式 1"),
("privacy_mode_impl_virtual_display_tip", "模式 2"),
("Enter privacy mode", "进入隐私模式"),
@@ -647,6 +647,6 @@ pub static ref T: std::collections::HashMap<&'static str, &'static str> =
("one-way-file-transfer-tip", "被控端启用了单向文件传输"),
("Authentication Required", "需要身份验证"),
("Authenticate", "认证"),
- ("web_id_input_tip", "可以输入同一个服务器内的 ID,web 客户端不支持直接 IP 访问。\n要访问另一台服务器上的设备,请附加服务器地址(<ID>@<服务器地址>?key=<密钥>)。比如,\[email protected]:21117?key=5Qbwsde3unUcJBtrx9ZkvUmwFNoExHzpryHuPUdqlWM=。\n要访问公共服务器上的设备,请输入 \"<ID>@public\",无需密钥。"),
+ ("web_id_input_tip", "可以输入同一个服务器内的 ID, web 客户端不支持直接 IP 访问。\n要访问另一台服务器上的设备,请附加服务器地址(<ID>@<服务器地址>?key=<密钥>)。比如,\[email protected]:21117?key=5Qbwsde3unUcJBtrx9ZkvUmwFNoExHzpryHuPUdqlWM=。\n要访问公共服务器上的设备,请输入 \"<ID>@public\", 无需密钥。"),
].iter().cloned().collect();
}
|
rustdesk
|
rustdesk
|
Rust
|
Rust
| 83,345
| 11,693
|
An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer.
|
rustdesk_rustdesk
|
CODE_IMPROVEMENT
|
Obvious
|
d61883547c058d9e19fa04269fec48e2f4f8d494
|
2022-06-27 05:10:31
|
Mike Bostock
|
7.5.0
| false
| 1
| 1
| 2
|
--- package.json
@@ -1,6 +1,6 @@
{
"name": "d3",
- "version": "7.5.0",
+ "version": "7.4.5",
"description": "Data-Driven Documents",
"homepage": "https://d3js.org",
"repository": {
|
d3
|
d3
|
Shell
|
Shell
| 109,977
| 22,868
|
Bring data to life with SVG, Canvas and HTML. :bar_chart::chart_with_upwards_trend::tada:
|
d3_d3
|
CODE_IMPROVEMENT
|
version number updated in package.json
|
0b5b95eff5e18c1e162d2b30b66a7be2bed1cbc2
|
2024-11-09 02:19:13
|
Nicholas Tindle
|
build: add launch.json debugging for vscode (#8496) Co-authored-by: Aarushi <[email protected]>
| false
| 67
| 0
| 67
|
--- .vscode/launch.json
@@ -1,67 +0,0 @@
-{
- "version": "0.2.0",
- "configurations": [
- {
- "name": "Frontend: Server Side",
- "type": "node-terminal",
- "request": "launch",
- "cwd": "${workspaceFolder}/autogpt_platform/frontend",
- "command": "yarn dev"
- },
- {
- "name": "Frontend: Client Side",
- "type": "msedge",
- "request": "launch",
- "url": "http://localhost:3000"
- },
- {
- "name": "Frontend: Full Stack",
- "type": "node-terminal",
-
- "request": "launch",
- "command": "yarn dev",
- "cwd": "${workspaceFolder}/autogpt_platform/frontend",
- "serverReadyAction": {
- "pattern": "- Local:.+(https?://.+)",
- "uriFormat": "%s",
- "action": "debugWithEdge"
- }
- },
- {
- "name": "Backend",
- "type": "debugpy",
- "request": "launch",
- "module": "backend.app",
- // "env": {
- // "ENV": "dev"
- // },
- "envFile": "${workspaceFolder}/backend/.env",
- "justMyCode": false,
- "cwd": "${workspaceFolder}/autogpt_platform/backend"
- },
- {
- "name": "Marketplace",
- "type": "debugpy",
- "request": "launch",
- "module": "autogpt_platform.market.main",
- "env": {
- "ENV": "dev"
- },
- "envFile": "${workspaceFolder}/market/.env",
- "justMyCode": false,
- "cwd": "${workspaceFolder}/market"
- }
- ],
- "compounds": [
- {
- "name": "Everything",
- "configurations": ["Backend", "Frontend: Full Stack"],
- // "preLaunchTask": "${defaultBuildTask}",
- "stopAll": true,
- "presentation": {
- "hidden": false,
- "order": 0
- }
- }
- ]
-}
|
autogpt
|
significant-gravitas
|
Python
|
Python
| 172,255
| 45,197
|
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
|
significant-gravitas_autogpt
|
CONFIG_CHANGE
|
Obvious
|
50c659fcba19f6d51d465815cadcea00d8abb02b
|
2025-03-14 23:58:59
|
Marijn Schouten
|
Clarify "owned data" in E0515.md This clarifies the explanation of why this is not allowed and also what to do instead. Fixes 62071 PS There was suggestion of adding a link to the book. I did not yet do that, but if desired that could be added.
| false
| 6
| 3
| 9
|
--- compiler/rustc_error_codes/src/error_codes/E0515.md
@@ -17,13 +17,10 @@ fn get_dangling_iterator<'a>() -> Iter<'a, i32> {
}
```
-Local variables, function parameters and temporaries are all dropped before
-the end of the function body. A returned reference (or struct containing a
-reference) to such a dropped value would immediately be invalid. Therefore
-it is not allowed to return such a reference.
+Local variables, function parameters and temporaries are all dropped before the
+end of the function body. So a reference to them cannot be returned.
-Consider returning a value that takes ownership of local data instead of
-referencing it:
+Consider returning an owned value instead:
```
use std::vec::IntoIter;
|
rust
|
rust-lang
|
Rust
|
Rust
| 101,693
| 13,172
|
Empowering everyone to build reliable and efficient software.
|
rust-lang_rust
|
DOC_CHANGE
|
Obvious
|
39a7730b1311fb78642c36686820d8d1c79e58e4
|
2024-10-20 06:15:20
|
Sebastian Markbåge
|
Rename SSRManifest to ServerConsumerManifest (#31299) This config is more generally applicable to all server-side Flight
Clients and not just SSR.
| false
| 106
| 100
| 206
|
--- fixtures/flight/server/global.js
@@ -130,7 +130,7 @@ async function renderApp(req, res, next) {
buildPath = path.join(__dirname, '../build/');
}
// Read the module map from the virtual file system.
- const serverConsumerManifest = JSON.parse(
+ const ssrManifest = JSON.parse(
await virtualFs.readFile(
path.join(buildPath, 'react-ssr-manifest.json'),
'utf8'
@@ -160,20 +160,14 @@ async function renderApp(req, res, next) {
rscResponse.pipe(rscResponse1);
rscResponse.pipe(rscResponse2);
- const {formState} = await createFromNodeStream(
- rscResponse1,
- serverConsumerManifest
- );
+ const {formState} = await createFromNodeStream(rscResponse1, ssrManifest);
rscResponse1.end();
let cachedResult;
let Root = () => {
if (!cachedResult) {
// Read this stream inside the render.
- cachedResult = createFromNodeStream(
- rscResponse2,
- serverConsumerManifest
- );
+ cachedResult = createFromNodeStream(rscResponse2, ssrManifest);
}
return React.use(cachedResult).root;
};
--- packages/react-client/src/ReactFlightClient.js
@@ -20,7 +20,7 @@ import type {LazyComponent} from 'react/src/ReactLazy';
import type {
ClientReference,
ClientReferenceMetadata,
- ServerConsumerModuleMap,
+ SSRModuleMap,
StringDecoder,
ModuleLoading,
} from './ReactFlightClientConfig';
@@ -269,7 +269,7 @@ export type FindSourceMapURLCallback = (
) => null | string;
export type Response = {
- _bundlerConfig: ServerConsumerModuleMap,
+ _bundlerConfig: SSRModuleMap,
_moduleLoading: ModuleLoading,
_callServer: CallServerCallback,
_encodeFormAction: void | EncodeFormActionCallback,
@@ -1420,7 +1420,7 @@ function missingCall() {
function ResponseInstance(
this: $FlowFixMe,
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
moduleLoading: ModuleLoading,
callServer: void | CallServerCallback,
encodeFormAction: void | EncodeFormActionCallback,
@@ -1485,7 +1485,7 @@ function ResponseInstance(
}
export function createResponse(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
moduleLoading: ModuleLoading,
callServer: void | CallServerCallback,
encodeFormAction: void | EncodeFormActionCallback,
--- packages/react-client/src/forks/ReactFlightClientConfig.custom.js
@@ -26,7 +26,7 @@
declare const $$$config: any;
export opaque type ModuleLoading = mixed;
-export opaque type ServerConsumerModuleMap = mixed;
+export opaque type SSRModuleMap = mixed;
export opaque type ServerManifest = mixed;
export opaque type ServerReferenceId = string;
export opaque type ClientReferenceMetadata = mixed;
--- packages/react-client/src/forks/ReactFlightClientConfig.dom-bun.js
@@ -15,7 +15,7 @@ export * from 'react-client/src/ReactClientConsoleConfigPlain';
export * from 'react-dom-bindings/src/shared/ReactFlightClientConfigDOM';
export opaque type ModuleLoading = mixed;
-export opaque type ServerConsumerModuleMap = mixed;
+export opaque type SSRModuleMap = mixed;
export opaque type ServerManifest = mixed;
export opaque type ServerReferenceId = string;
export opaque type ClientReferenceMetadata = mixed;
--- packages/react-client/src/forks/ReactFlightClientConfig.dom-legacy.js
@@ -15,7 +15,7 @@ export * from 'react-client/src/ReactClientConsoleConfigBrowser';
export type Response = any;
export opaque type ModuleLoading = mixed;
-export opaque type ServerConsumerModuleMap = mixed;
+export opaque type SSRModuleMap = mixed;
export opaque type ServerManifest = mixed;
export opaque type ServerReferenceId = string;
export opaque type ClientReferenceMetadata = mixed;
--- packages/react-client/src/forks/ReactFlightClientConfig.markup.js
@@ -16,7 +16,7 @@ export * from 'react-markup/src/ReactMarkupLegacyClientStreamConfig.js';
export * from 'react-client/src/ReactClientConsoleConfigPlain';
export type ModuleLoading = null;
-export type ServerConsumerModuleMap = null;
+export type SSRModuleMap = null;
export opaque type ServerManifest = null;
export opaque type ServerReferenceId = string;
export opaque type ClientReferenceMetadata = null;
@@ -33,7 +33,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
throw new Error(
--- packages/react-server-dom-esm/src/client/ReactFlightClientConfigBundlerESM.js
@@ -14,7 +14,7 @@ import type {
} from 'shared/ReactTypes';
import type {ModuleLoading} from 'react-client/src/ReactFlightClientConfig';
-export type ServerConsumerModuleMap = string; // Module root path
+export type SSRModuleMap = string; // Module root path
export type ServerManifest = string; // Module root path
@@ -48,7 +48,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
const baseURL = bundlerConfig;
--- packages/react-server-dom-turbopack/src/__tests__/ReactFlightTurbopackDOMEdge-test.js
@@ -97,7 +97,7 @@ describe('ReactFlightTurbopackDOMEdge', () => {
turbopackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(stream, {
- serverManifest: {
+ ssrManifest: {
moduleMap: translationMap,
moduleLoading: null,
},
--- packages/react-server-dom-turbopack/src/client/ReactFlightClientConfigBundlerNode.js
@@ -24,7 +24,7 @@ import {
} from '../shared/ReactFlightImportMetadata';
import {prepareDestinationWithChunks} from 'react-client/src/ReactFlightClientConfig';
-export type ServerConsumerModuleMap = {
+export type SSRModuleMap = {
[clientId: string]: {
[clientExportName: string]: ClientReference<any>,
},
@@ -58,7 +58,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
const moduleExports = bundlerConfig[metadata[ID]];
--- packages/react-server-dom-turbopack/src/client/ReactFlightClientConfigBundlerTurbopack.js
@@ -30,7 +30,7 @@ import {prepareDestinationWithChunks} from 'react-client/src/ReactFlightClientCo
import {loadChunk} from 'react-client/src/ReactFlightClientConfig';
-export type ServerConsumerModuleMap = null | {
+export type SSRModuleMap = null | {
[clientId: string]: {
[clientExportName: string]: ClientReferenceManifestEntry,
},
@@ -63,7 +63,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
if (bundlerConfig) {
--- packages/react-server-dom-turbopack/src/client/ReactFlightDOMClientEdge.js
@@ -17,12 +17,12 @@ import type {
import type {ReactServerValue} from 'react-client/src/ReactFlightReplyClient';
import type {
- ServerConsumerModuleMap,
+ SSRModuleMap,
ModuleLoading,
} from 'react-client/src/ReactFlightClientConfig';
-type ServerConsumerManifest = {
- moduleMap: ServerConsumerModuleMap,
+type SSRManifest = {
+ moduleMap: SSRModuleMap,
moduleLoading: ModuleLoading,
};
@@ -66,7 +66,7 @@ type EncodeFormActionCallback = <A>(
) => ReactCustomFormAction;
export type Options = {
- serverManifest: ServerConsumerManifest,
+ ssrManifest: SSRManifest,
nonce?: string,
encodeFormAction?: EncodeFormActionCallback,
temporaryReferences?: TemporaryReferenceSet,
@@ -77,8 +77,8 @@ export type Options = {
function createResponseFromOptions(options: Options) {
return createResponse(
- options.serverManifest.moduleMap,
- options.serverManifest.moduleLoading,
+ options.ssrManifest.moduleMap,
+ options.ssrManifest.moduleLoading,
noServerCall,
options.encodeFormAction,
typeof options.nonce === 'string' ? options.nonce : undefined,
--- packages/react-server-dom-turbopack/src/client/ReactFlightDOMClientNode.js
@@ -15,12 +15,12 @@ import type {
} from 'react-client/src/ReactFlightClient';
import type {
- ServerConsumerModuleMap,
+ SSRModuleMap,
ModuleLoading,
} from 'react-client/src/ReactFlightClientConfig';
-type ServerConsumerManifest = {
- moduleMap: ServerConsumerModuleMap,
+type SSRManifest = {
+ moduleMap: SSRModuleMap,
moduleLoading: ModuleLoading,
};
@@ -66,12 +66,12 @@ export type Options = {
function createFromNodeStream<T>(
stream: Readable,
- serverConsumerManifest: ServerConsumerManifest,
+ ssrManifest: SSRManifest,
options?: Options,
): Thenable<T> {
const response: Response = createResponse(
- serverConsumerManifest.moduleMap,
- serverConsumerManifest.moduleLoading,
+ ssrManifest.moduleMap,
+ ssrManifest.moduleLoading,
noServerCall,
options ? options.encodeFormAction : undefined,
options && typeof options.nonce === 'string' ? options.nonce : undefined,
--- packages/react-server-dom-webpack/src/ReactFlightWebpackPlugin.js
@@ -58,7 +58,7 @@ type Options = {
clientReferences?: ClientReferencePath | $ReadOnlyArray<ClientReferencePath>,
chunkName?: string,
clientManifestFilename?: string,
- serverConsumerManifestFilename?: string,
+ ssrManifestFilename?: string,
};
const PLUGIN_NAME = 'React Server Plugin';
@@ -67,7 +67,7 @@ export default class ReactFlightWebpackPlugin {
clientReferences: $ReadOnlyArray<ClientReferencePath>;
chunkName: string;
clientManifestFilename: string;
- serverConsumerManifestFilename: string;
+ ssrManifestFilename: string;
constructor(options: Options) {
if (!options || typeof options.isServer !== 'boolean') {
@@ -105,8 +105,8 @@ export default class ReactFlightWebpackPlugin {
}
this.clientManifestFilename =
options.clientManifestFilename || 'react-client-manifest.json';
- this.serverConsumerManifestFilename =
- options.serverConsumerManifestFilename || 'react-ssr-manifest.json';
+ this.ssrManifestFilename =
+ options.ssrManifestFilename || 'react-ssr-manifest.json';
}
apply(compiler: any) {
@@ -239,18 +239,18 @@ export default class ReactFlightWebpackPlugin {
const clientManifest: {
[string]: ImportManifestEntry,
} = {};
- type ServerConsumerModuleMap = {
+ type SSRModuleMap = {
[string]: {
[string]: {specifier: string, name: string},
},
};
- const moduleMap: ServerConsumerModuleMap = {};
+ const moduleMap: SSRModuleMap = {};
const ssrBundleConfig: {
moduleLoading: {
prefix: string,
crossOrigin: string | null,
},
- moduleMap: ServerConsumerModuleMap,
+ moduleMap: SSRModuleMap,
} = {
moduleLoading: {
prefix: compilation.outputOptions.publicPath || '',
@@ -374,7 +374,7 @@ export default class ReactFlightWebpackPlugin {
);
const ssrOutput = JSON.stringify(ssrBundleConfig, null, 2);
compilation.emitAsset(
- _this.serverConsumerManifestFilename,
+ _this.ssrManifestFilename,
new sources.RawSource(ssrOutput, false),
);
},
--- packages/react-server-dom-webpack/src/__tests__/ReactFlightDOMEdge-test.js
@@ -195,7 +195,7 @@ describe('ReactFlightDOMEdge', () => {
ReactServerDOMServer.renderToReadableStream(<App />, webpackMap),
);
const response = ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: translationMap,
moduleLoading: webpackModuleLoading,
},
@@ -235,7 +235,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await ReactServerDOMClient.createFromReadableStream(
stream2,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -263,7 +263,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await ReactServerDOMClient.createFromReadableStream(
stream2,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -333,7 +333,7 @@ describe('ReactFlightDOMEdge', () => {
expect(timesRendered).toBeLessThan(5);
const model = await ReactServerDOMClient.createFromReadableStream(stream2, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -405,7 +405,7 @@ describe('ReactFlightDOMEdge', () => {
const model = await serverAct(() =>
ReactServerDOMClient.createFromReadableStream(stream2, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -462,7 +462,7 @@ describe('ReactFlightDOMEdge', () => {
passThrough(ReactServerDOMServer.renderToReadableStream(buffers)),
);
const result = await ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -482,7 +482,7 @@ describe('ReactFlightDOMEdge', () => {
passThrough(ReactServerDOMServer.renderToReadableStream(blob)),
);
const result = await ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -512,7 +512,7 @@ describe('ReactFlightDOMEdge', () => {
passThrough(ReactServerDOMServer.renderToReadableStream(formData)),
);
const result = await ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -547,7 +547,7 @@ describe('ReactFlightDOMEdge', () => {
const resultPromise = ReactServerDOMClient.createFromReadableStream(
stream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -587,7 +587,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await serverAct(() =>
ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -635,7 +635,7 @@ describe('ReactFlightDOMEdge', () => {
// Parsing the root blocks because the module hasn't loaded yet
const result = await serverAct(() =>
ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -689,7 +689,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await ReactServerDOMClient.createFromReadableStream(
stream2,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -777,7 +777,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await ReactServerDOMClient.createFromReadableStream(
stream1,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -836,7 +836,7 @@ describe('ReactFlightDOMEdge', () => {
const result = await ReactServerDOMClient.createFromReadableStream(
stream1,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -891,7 +891,7 @@ describe('ReactFlightDOMEdge', () => {
const rootModel = await serverAct(() =>
ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -995,7 +995,7 @@ describe('ReactFlightDOMEdge', () => {
const rootModel = await serverAct(() =>
ReactServerDOMClient.createFromReadableStream(stream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -1079,7 +1079,7 @@ describe('ReactFlightDOMEdge', () => {
}
const response = ReactServerDOMClient.createFromReadableStream(prelude, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -1146,7 +1146,7 @@ describe('ReactFlightDOMEdge', () => {
}
const response = ReactServerDOMClient.createFromReadableStream(prelude, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
--- packages/react-server-dom-webpack/src/__tests__/ReactFlightDOMForm-test.js
@@ -166,7 +166,7 @@ describe('ReactFlightDOMForm', () => {
}
const rscStream = ReactServerDOMServer.renderToReadableStream(<App />);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -232,7 +232,7 @@ describe('ReactFlightDOMForm', () => {
}
const rscStream = ReactServerDOMServer.renderToReadableStream(<App />);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -272,7 +272,7 @@ describe('ReactFlightDOMForm', () => {
}
const rscStream = ReactServerDOMServer.renderToReadableStream(<App />);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -343,7 +343,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -391,7 +391,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -444,7 +444,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -469,7 +469,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse = ReactServerDOMClient.createFromReadableStream(
postbackRscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -539,7 +539,7 @@ describe('ReactFlightDOMForm', () => {
const response = ReactServerDOMClient.createFromReadableStream(
rscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -569,7 +569,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse = ReactServerDOMClient.createFromReadableStream(
postbackRscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -601,7 +601,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse2 = ReactServerDOMClient.createFromReadableStream(
postbackRscStream2,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -660,7 +660,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -687,7 +687,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse = ReactServerDOMClient.createFromReadableStream(
postbackRscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -738,7 +738,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -766,7 +766,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse = ReactServerDOMClient.createFromReadableStream(
postbackRscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -794,7 +794,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse2 = ReactServerDOMClient.createFromReadableStream(
postbackRscStream2,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -838,7 +838,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -891,7 +891,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -928,7 +928,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -954,7 +954,7 @@ describe('ReactFlightDOMForm', () => {
const postbackResponse = ReactServerDOMClient.createFromReadableStream(
postbackRscStream,
{
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -1007,7 +1007,7 @@ describe('ReactFlightDOMForm', () => {
webpackMap,
);
const response = ReactServerDOMClient.createFromReadableStream(rscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -1032,7 +1032,7 @@ describe('ReactFlightDOMForm', () => {
);
const postbackResponse =
await ReactServerDOMClient.createFromReadableStream(postbackRscStream, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
--- packages/react-server-dom-webpack/src/__tests__/ReactFlightDOMNode-test.js
@@ -130,7 +130,7 @@ describe('ReactFlightDOMNode', () => {
'*': ssrMetadata,
},
};
- const serverConsumerManifest = {
+ const ssrManifest = {
moduleMap: translationMap,
moduleLoading: webpackModuleLoading,
};
@@ -151,7 +151,7 @@ describe('ReactFlightDOMNode', () => {
if (response) return use(response);
response = ReactServerDOMClient.createFromNodeStream(
readable,
- serverConsumerManifest,
+ ssrManifest,
);
return use(response);
}
@@ -255,7 +255,7 @@ describe('ReactFlightDOMNode', () => {
'*': ssrMetadata,
},
};
- const serverConsumerManifest = {
+ const ssrManifest = {
moduleMap: translationMap,
moduleLoading: webpackModuleLoading,
};
@@ -276,7 +276,7 @@ describe('ReactFlightDOMNode', () => {
if (response) return use(response);
response = ReactServerDOMClient.createFromNodeStream(
readable,
- serverConsumerManifest,
+ ssrManifest,
{
nonce: 'r4nd0m',
},
@@ -426,7 +426,7 @@ describe('ReactFlightDOMNode', () => {
}
const response = ReactServerDOMClient.createFromNodeStream(prelude, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
@@ -492,7 +492,7 @@ describe('ReactFlightDOMNode', () => {
}
const response = ReactServerDOMClient.createFromNodeStream(prelude, {
- serverConsumerManifest: {
+ ssrManifest: {
moduleMap: null,
moduleLoading: null,
},
--- packages/react-server-dom-webpack/src/client/ReactFlightClientConfigBundlerNode.js
@@ -24,7 +24,7 @@ import {
} from '../shared/ReactFlightImportMetadata';
import {prepareDestinationWithChunks} from 'react-client/src/ReactFlightClientConfig';
-export type ServerConsumerModuleMap = {
+export type SSRModuleMap = {
[clientId: string]: {
[clientExportName: string]: ClientReference<any>,
},
@@ -58,7 +58,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
const moduleExports = bundlerConfig[metadata[ID]];
--- packages/react-server-dom-webpack/src/client/ReactFlightClientConfigBundlerWebpack.js
@@ -30,7 +30,7 @@ import {prepareDestinationWithChunks} from 'react-client/src/ReactFlightClientCo
import {loadChunk} from 'react-client/src/ReactFlightClientConfig';
-export type ServerConsumerModuleMap = null | {
+export type SSRModuleMap = null | {
[clientId: string]: {
[clientExportName: string]: ClientReferenceManifestEntry,
},
@@ -63,7 +63,7 @@ export function prepareDestinationForModule(
}
export function resolveClientReference<T>(
- bundlerConfig: ServerConsumerModuleMap,
+ bundlerConfig: SSRModuleMap,
metadata: ClientReferenceMetadata,
): ClientReference<T> {
if (bundlerConfig) {
--- packages/react-server-dom-webpack/src/client/ReactFlightDOMClientEdge.js
@@ -17,12 +17,12 @@ import type {
import type {ReactServerValue} from 'react-client/src/ReactFlightReplyClient';
import type {
- ServerConsumerModuleMap,
+ SSRModuleMap,
ModuleLoading,
} from 'react-client/src/ReactFlightClientConfig';
-type ServerConsumerManifest = {
- moduleMap: ServerConsumerModuleMap,
+type SSRManifest = {
+ moduleMap: SSRModuleMap,
moduleLoading: ModuleLoading,
};
@@ -66,7 +66,7 @@ type EncodeFormActionCallback = <A>(
) => ReactCustomFormAction;
export type Options = {
- serverConsumerManifest: ServerConsumerManifest,
+ ssrManifest: SSRManifest,
nonce?: string,
encodeFormAction?: EncodeFormActionCallback,
temporaryReferences?: TemporaryReferenceSet,
@@ -77,8 +77,8 @@ export type Options = {
function createResponseFromOptions(options: Options) {
return createResponse(
- options.serverConsumerManifest.moduleMap,
- options.serverConsumerManifest.moduleLoading,
+ options.ssrManifest.moduleMap,
+ options.ssrManifest.moduleLoading,
noServerCall,
options.encodeFormAction,
typeof options.nonce === 'string' ? options.nonce : undefined,
--- packages/react-server-dom-webpack/src/client/ReactFlightDOMClientNode.js
@@ -15,12 +15,12 @@ import type {
} from 'react-client/src/ReactFlightClient';
import type {
- ServerConsumerModuleMap,
+ SSRModuleMap,
ModuleLoading,
} from 'react-client/src/ReactFlightClientConfig';
-type ServerConsumerManifest = {
- moduleMap: ServerConsumerModuleMap,
+type SSRManifest = {
+ moduleMap: SSRModuleMap,
moduleLoading: ModuleLoading,
};
@@ -67,12 +67,12 @@ export type Options = {
function createFromNodeStream<T>(
stream: Readable,
- serverConsumerManifest: ServerConsumerManifest,
+ ssrManifest: SSRManifest,
options?: Options,
): Thenable<T> {
const response: Response = createResponse(
- serverConsumerManifest.moduleMap,
- serverConsumerManifest.moduleLoading,
+ ssrManifest.moduleMap,
+ ssrManifest.moduleLoading,
noServerCall,
options ? options.encodeFormAction : undefined,
options && typeof options.nonce === 'string' ? options.nonce : undefined,
|
react
|
facebook
|
JavaScript
|
JavaScript
| 232,878
| 47,794
|
The library for web and native user interfaces.
|
facebook_react
|
CODE_IMPROVEMENT
|
refactoring done to rename SSRManifest
|
fb301b5767b6c4d2b363d1ed405d71bfce835158
|
2023-09-09 14:12:33
|
gonglja
|
Fix print error (#729) * fix(codes/cpp): Memory leak fix: the space was not freed when pop removed the element.
* fix(codes/cpp): Fix access error when printArray(arr, 0)
* Update PrintUtil.hpp
* fix(codes/c): Fix some errors of cmake build
* feat(codes/c): Add hashing_search.c
* styles(codes/c): Modify function description
* styles(codes/c): Modify binary_search.c code style
* fix(codes/c): Fix the problem in binary_tree_bfs.c and the problem that the memory is not released.
* feat: Add preorder_traversal_i_compact.c
* feat(codes/c): Add head_sort.c
* feat(codes/c): Add bucket_sort.c
* feat(codes/c): Add binary_search_edge.c
* fix(codes/c): Add programs that are not managed by cmake (c code)
* feat(codes/c): Add selection_sort.c
* style(codes/c): Change swap in selection_sort.c to `selectionSort`
* styles(codes/c): Change style.
* fix(codes/c): Fix some formatting errors and temporarily remove backtracking chapters
* fix(codes/c): Fix space_complexity.c build error.
* feat(codes/c): Add array_binary_tree.c
* feat(code/c): Update push_back and pop_back in vector.h
* styles(codes/c): Adjust format.
* fix(codes/cpp/chapter_greedy): Fix print error.
---------
Co-authored-by: Yudong Jin <[email protected]>
| false
| 2
| 2
| 4
|
--- codes/cpp/chapter_greedy/coin_change_greedy.cpp
@@ -40,7 +40,7 @@ int main() {
coins = {1, 20, 50};
amt = 60;
res = coinChangeGreedy(coins, amt);
- cout << "\ncoins = ";
+ cout << "\ncoins = [";
printVector(coins);
cout << "amt = " << amt << endl;
cout << "凑到 " << amt << " 所需的最少硬币数量为 " << res << endl;
@@ -50,7 +50,7 @@ int main() {
coins = {1, 49, 50};
amt = 98;
res = coinChangeGreedy(coins, amt);
- cout << "\ncoins = ";
+ cout << "\ncoins = [";
printVector(coins);
cout << "amt = " << amt << endl;
cout << "凑到 " << amt << " 所需的最少硬币数量为 " << res << endl;
|
hello-algo
|
krahets
|
Java
|
Java
| 109,696
| 13,651
|
《Hello 算法》:动画图解、一键运行的数据结构与算法教程。支持 Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, Dart 代码。简体版和繁体版同步更新,English version ongoing
|
krahets_hello-algo
|
NEW_FEAT
|
Obvious
|
6702efad0b727869145510fdd6aa3eb94f2eab64
|
2023-02-23 22:46:33
|
taycaldwell
|
Update README.md
| false
| 1
| 1
| 2
|
--- README.md
@@ -2,7 +2,7 @@
# Base node
-Base is a secure, low-cost, developer-friendly Ethereum L2 built to bring the next billion users to web3. It's built on Optimism’s open-source [OP Stack](https://stack.optimism.io/).
+Base is a secure, low-cost, developer-friendly Ethereum L2 built to bring the next billion users to web3. It's built on Optimism’s open-source [OP Stack](https://optimism.io).
This repository contains the relevant Docker builds to run your own node on the Base network.
|
node
|
base
|
Shell
|
Shell
| 68,555
| 2,658
|
Everything required to run your own Base node
|
base_node
|
DOC_CHANGE
|
Obvious
|
e0964be674d503c3a55380877b9c793e72f43dc7
|
2023-03-14 03:41:42
|
Wout De Puysseleir
|
Only copy the files from the copy directory
| false
| 213
| 212
| 425
|
--- lib/mix/tasks/configure_esbuild.ex
@@ -10,7 +10,7 @@ defmodule Mix.Tasks.LiveSvelte.ConfigureEsbuild do
Mix.Project.deps_paths(depth: 1)
|> Map.fetch!(:live_svelte)
- |> Path.join("assets/copy/**/*{.js}")
+ |> Path.join("assets/**/*{.js}")
|> Path.wildcard()
|> Enum.each(fn file ->
split = Path.split(file)
@@ -21,7 +21,6 @@ defmodule Mix.Tasks.LiveSvelte.ConfigureEsbuild do
|> Stream.with_index()
|> Stream.reject(fn {_item, i} -> assets_index > i end)
|> Enum.map(&elem(&1, 0))
- |> Enum.reject(&(&1 == "copy"))
|> Path.join()
Mix.Generator.copy_file(file, path)
--- lib/ssr.ex
@@ -3,5 +3,5 @@ defmodule LiveSvelte.SSR do
def render(name, nil, slots), do: render(name, %{}, slots)
def render(name, props, slots),
- do: NodeJS.call!({"js/render", "render"}, [name, props, slots])
+ do: NodeJS.call!({"svelte/render", "render"}, [name, props, slots])
end
|
live_svelte
|
woutdp
|
Elixir
|
Elixir
| 1,416
| 58
|
Svelte inside Phoenix LiveView with seamless end-to-end reactivity
|
woutdp_live_svelte
|
BUG_FIX
|
obvious
|
8b651b9e31ba794d1a008e71e8a5569d7ff307ca
| null |
Chaoqian Xu
|
feat: pub use tauri_runtime as runtime (#2926) Co-authored-by: Lucas Nogueira <[email protected]>
| false
| 1
| 1
| 0
|
--- lib.rs
@@ -38,7 +38,7 @@ mod hooks;
mod manager;
pub mod plugin;
pub mod window;
-use tauri_runtime as runtime;
+pub use tauri_runtime as runtime;
pub mod settings;
mod state;
#[cfg(feature = "updater")]
|
tauri-apps_tauri.json
| null | null | null | null | null | null |
tauri-apps_tauri.json
|
NEW_FEAT
|
5, obvious
|
a2986acc1aca28e1a0260d6add6657898d19538c
| null |
piper
|
correct order of argument for tf.image.resize_image_with_crop_or_pad
| false
| 1
| 1
| 0
|
--- cifar10_input.py
@@ -237,7 +237,7 @@ def inputs(eval_data, data_dir, batch_size):
# Image processing for evaluation.
# Crop the central [height, width] of the image.
resized_image = tf.image.resize_image_with_crop_or_pad(reshaped_image,
- width, height)
+ height, width)
# Subtract off the mean and divide by the variance of the pixels.
float_image = tf.image.per_image_standardization(resized_image)
|
tensorflow_models.json
| null | null | null | null | null | null |
tensorflow_models.json
|
CODE_IMPROVEMENT
|
4, improvement in code
|
322c51e1cf093e2cb8089a13834fe7f0474b8ab5
|
2022-01-12 01:03:43
|
Aidan Holland
|
Update Censys Link
| false
| 1
| 1
| 2
|
--- README.md
@@ -1388,7 +1388,7 @@ API | Description | Auth | HTTPS | CORS |
| [BitWarden](https://bitwarden.com/help/api/) | Best open-source password manager | `OAuth` | Yes | Unknown |
| [Botd](https://github.com/fingerprintjs/botd) | Botd is a browser library for JavaScript bot detection | `apiKey` | Yes | Yes |
| [Bugcrowd](https://docs.bugcrowd.com/api/getting-started/) | Bugcrowd API for interacting and tracking the reported issues programmatically | `apiKey` | Yes | Unknown |
-| [Censys](https://search.censys.io/api) | Search engine for Internet connected host and devices | `apiKey` | Yes | No |
+| [Censys.io](https://censys.io/api) | Search engine for Internet connected host and devices | `apiKey` | Yes | No |
| [Classify](https://classify-web.herokuapp.com/#/api) | Encrypting & decrypting text messages | No | Yes | Yes |
| [Complete Criminal Checks](https://completecriminalchecks.com/Developers) | Provides data of offenders from all U.S. States and Pureto Rico | `apiKey` | Yes | Yes |
| [CRXcavator](https://crxcavator.io/apidocs) | Chrome extension risk scoring | `apiKey` | Yes | Unknown |
|
public-apis
|
public-apis
|
Python
|
Python
| 329,015
| 34,881
|
A collective list of free APIs
|
public-apis_public-apis
|
DOC_CHANGE
|
Obvious
|
6959395d5e2d99435828e5ef23ed2860662c9410
|
2025-03-19 03:15:38
|
Alexander.Likhachev
|
[BTA] Add a helper method to generate random project ID
| false
| 13
| 2
| 15
|
--- compiler/build-tools/kotlin-build-tools-api-tests/src/main/kotlin/compilation/model/Project.kt
@@ -21,7 +21,7 @@ class Project(
val defaultStrategyConfig: CompilerExecutionStrategyConfiguration,
val projectDirectory: Path,
) {
- val projectId = ProjectId.RandomProjectUUID()
+ val projectId = ProjectId.ProjectUUID(UUID.randomUUID())
private val invalidModuleNameCharactersRegex = """[\\/\r\n\t]""".toRegex()
fun module(
--- compiler/build-tools/kotlin-build-tools-api/api/kotlin-build-tools-api.api
@@ -52,11 +52,6 @@ public abstract interface class org/jetbrains/kotlin/buildtools/api/KotlinLogger
}
public abstract interface class org/jetbrains/kotlin/buildtools/api/ProjectId {
- public static final field Companion Lorg/jetbrains/kotlin/buildtools/api/ProjectId$Companion;
-}
-
-public final class org/jetbrains/kotlin/buildtools/api/ProjectId$Companion {
- public final fun RandomProjectUUID ()Lorg/jetbrains/kotlin/buildtools/api/ProjectId$ProjectUUID;
}
public final class org/jetbrains/kotlin/buildtools/api/ProjectId$ProjectUUID : org/jetbrains/kotlin/buildtools/api/ProjectId {
--- compiler/build-tools/kotlin-build-tools-api/src/main/kotlin/org/jetbrains/kotlin/buildtools/api/ProjectId.kt
@@ -15,10 +15,4 @@ import java.util.*
*/
public sealed interface ProjectId {
public data class ProjectUUID(public val uuid: UUID) : ProjectId
-
- public companion object {
- @ExperimentalBuildToolsApi
- @Suppress("FunctionName") // constructor-like method
- public fun RandomProjectUUID(): ProjectUUID = ProjectUUID(UUID.randomUUID())
- }
}
\ No newline at end of file
--- libraries/tools/kotlin-maven-plugin/src/main/java/org/jetbrains/kotlin/maven/K2JVMCompileMojo.java
@@ -245,7 +245,7 @@ public class K2JVMCompileMojo extends KotlinCompileMojoBase<K2JVMCompilerArgumen
List<File> sourceRoots
) throws MojoExecutionException {
try {
- ProjectId projectId = ProjectId.Companion.RandomProjectUUID();
+ ProjectId projectId = new ProjectId.ProjectUUID(UUID.randomUUID());
CompilationService compilationService = getCompilationService();
CompilerExecutionStrategyConfiguration strategyConfig = compilationService.makeCompilerExecutionStrategyConfiguration();
strategyConfig.useInProcessStrategy();
|
kotlin
|
jetbrains
|
Kotlin
|
Kotlin
| 50,115
| 5,861
|
The Kotlin Programming Language.
|
jetbrains_kotlin
|
NEW_FEAT
|
Introduces new functionality
|
f0f2d2ee70a7b1e64d08827f961add853c8b718c
| null |
Ryan Dahl
|
Fix readdirSync in docs
| false
| 1
| 1
| 0
|
--- api.txt
@@ -637,7 +637,7 @@ Asynchronous readdir(3). Reads the contents of a directory.
The callback gets two arguments +(err, files)+ where +files+ is an array of
the names of the files in the directory excluding +"."+ and +".."+.
-+fs.readdir(path, callback)+ ::
++fs.readdirSync(path)+ ::
Synchronous readdir(3). Returns an array of filenames excluding +"."+ and
+".."+.
|
nodejs_node.json
| null | null | null | null | null | null |
nodejs_node.json
|
CONFIG_CHANGE
|
5, obvious
|
564b01cf73c4209e769f65752e2b20279647d7b3
| null |
Benjamin Coe
|
fix: use babel-core/register rather than babel-register
| false
| 1
| 1
| 0
|
--- package.json
@@ -26,7 +26,7 @@
},
"nyc": {
"require": [
- "babel-register"
+ "babel-core/register"
],
"exclude": [
"gulpfile.js",
|
vercel_next.js.json
| null | null | null | null | null | null |
vercel_next.js.json
|
BUG_FIX
|
5, fix written in commit message
|
5b3393b6a2920c4f410ee636777533c77752106e
|
2024-11-14 06:07:21
|
Michael Yang
|
fix(mllama): sync backend between batches
| false
| 11
| 0
| 11
|
--- llama/llama.go
@@ -598,10 +598,6 @@ func (c *Context) SetCrossAttention(state bool) {
C.llama_set_cross_attention(c.c, C.bool(state))
}
-func (c *Context) Synchronize() {
- C.llama_synchronize(c.c)
-}
-
// sampling
// TODO: this is a temporary wrapper to allow calling C++ code from CGo
type SamplingContext struct {
--- llama/runner/runner.go
@@ -427,13 +427,6 @@ func (s *Server) processBatch(tokenBatch *llama.Batch, embedBatch *llama.Batch)
return
}
- if crossAttention {
- // synchronize state to ensure the cross attention batch is complete.
- // needed specifically for multi-GPU systems otherwise an inflight
- // task may be incorrectly invalidated causing a crash
- s.lc.Synchronize()
- }
-
for i, seq := range s.seqs {
if seq == nil {
continue
|
ollama
|
ollama
|
Go
|
Go
| 131,099
| 10,753
|
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models.
|
ollama_ollama
|
BUG_FIX
|
Obvious
|
5870fbd0f7c0c0f7abf4a5bcd24c6f023e66c053
|
2024-03-27 17:53:16
|
Easy
|
更新说明
| false
| 21
| 0
| 21
|
--- README.md
@@ -1,10 +1,5 @@
# 《一人企业方法论》第二版
-## 对第一版的改进
-
-1. 从长文到一本近6万字的小书,从有感而发的分享到两年迭代而得的完整方法论
-1. 不再局限在独立开发,发展为更为通用的方法论,即使不懂技术的读者也可用于经营副业
-

## 作者信息
--- words.sh
@@ -1,16 +0,0 @@
-#!/bin/bash
-
-# 设置mdbook的源文件目录路径
-mdbook_src_dir="src"
-
-# 初始化字符数变量
-total_chars=0
-
-# 遍历目录下的所有.md文件并统计字符
-for file in $(find "$mdbook_src_dir" -name '*.md'); do
- chars=$(wc -m <"$file")
- total_chars=$((total_chars + chars))
-done
-
-# 输出总字符数
-echo "Total characters: $total_chars"
|
one-person-businesses-methodology-v2.0
|
easychen
|
PHP
|
PHP
| 5,272
| 464
|
《一人企业方法论》第二版,也适合做其他副业(比如自媒体、电商、数字商品)的非技术人群。
|
easychen_one-person-businesses-methodology-v2.0
|
NEW_FEAT
|
Obvious
|
6b41c9a01e5e38ae9c5361da2bd2eccc7afe0842
|
2022-01-26 01:42:13
|
tfpgh
|
Directly link to organizers repo (#646)
| false
| 1
| 1
| 2
|
--- README.md
@@ -10,7 +10,7 @@ If you enjoy the papers, perhaps stop by a local chapter meetup and join in on t
### Chapters
-Here are our official chapters. Let us know if you are interested in [starting one](https://github.com/papers-we-love/organizers) in your city!
+Here are our official chapters. Let us know if you are interested in [starting one](https://github.com/papers-we-love/papers-we-love/wiki/Creating-a-PWL-chapter) in your city!
* [Amsterdam](http://www.meetup.com/papers-we-love-amsterdam/)
* [Athens](https://www.meetup.com/Papers-We-Love-Athens)
|
papers-we-love
|
papers-we-love
|
Shell
|
Shell
| 91,347
| 5,859
|
Papers from the computer science community to read and discuss.
|
papers-we-love_papers-we-love
|
DOC_CHANGE
|
changes in readme
|
7fb0900e4c3b7b8f02a0c8ef09e4b978e0ca3423
|
2024-07-07 20:22:29
|
Easy
|
修正编译后电子书在部分阅读器上看不到图片的问题。
| false
| 160
| 160
| 320
|
--- src/README.md
@@ -10,4 +10,4 @@
## 一人企业方法论
-
\ No newline at end of file
+
\ No newline at end of file
--- src/assets-and-passive-income.md
@@ -5,7 +5,7 @@
对二十一世纪来说最宝贵的是人才,但对一人企业来说,最宝贵的却是时间。最稀缺的就是最宝贵的。正如本书前文中所述的,对于只有一个全职员工的企业,即使我们每天工作10小时,一周也只有50个工时。如果是以Side Project的方式来做,可用工时会更少。
-
+
实现平台的日薪
@@ -16,19 +16,19 @@
资产和被动收入
-------
-
+
《穷爸爸富爸爸》
在《穷爸爸富爸爸》中,罗伯特-清崎说过一句很经典的话:富人不为钱工作。要想规模化财富,就要让钱为你工作。解决办法也很简单,就是不断持有资产。
-
+
《穷爸爸富爸爸》中对资产的定义
书中对资产的定义是:「资产就是能把钱放进你口袋里的东西」。换言之,如果某样东西能在你不工作的时候为你带来持续的收入,那么它就可以被认为是资产。本书将沿用这个定义。
-
+
让资产为你工作
@@ -41,7 +41,7 @@
资产可以「把钱放进你的口袋」,这可是会下金蛋的鸡,当然是人人都想要了。既然人人想要,就难以轻易获得了。
-
+
如何获得资产
@@ -51,7 +51,7 @@
如果你手上钱比较多(这种情况比较少),那么可以通过投资或收购的方式来获得资产。这将帮我们跨过试错环节,直接去收购那些已经被市场认可,开始有不错的MRR(每月周期性收入)的资产。
-
+
indiemaker.co
@@ -59,7 +59,7 @@ indiemaker.co
例如,在Indie Maker这个网站上,我们可以看到许多程序员会销售他们的Side project。
-
+
已经产生MRR的项目
@@ -71,7 +71,7 @@ indiemaker.co
#### 收购其他资产
-
+
除了软件和SaaS产品之外,我们还可以考虑购买其他类型的资产。如网站、版权、房产、股份、账号以及一些自媒体平台。这会帮我们节省大量时间。
@@ -87,7 +87,7 @@ indiemaker.co
#### 数字商品创作
-
+
数字商品创作的种类
@@ -99,7 +99,7 @@ indiemaker.co
#### 通过NoCode创造互联网应用
-
+
NoCode创作实例
@@ -126,7 +126,7 @@ NoCode创作实例
#### 开源+AI定制
-
+
通过GPT给WordPress写插件
@@ -147,7 +147,7 @@ NoCode创作实例
这里给大家分享一个我自己的标准,它包含以下五个评估项:投入、产出、持续性、风险和门槛。
-
+
资产的量化评估
@@ -159,7 +159,7 @@ NoCode创作实例
很多产出不大的资产,如果投入极低,也是可以操作的。
-
+
B站视频
@@ -171,7 +171,7 @@ B站视频
「被动收入」这个说法容易让人误解为我们持有资产以后,什么事都不需要做了,资产就会源源不断地为我们挣钱。但实际上,每次投入能带来的产出往往是随着时间递减的,因为市场会变化、需求会更新、潮流会轮回、设备会陈旧。我们必须再次投入,才能恢复其产出。持续性则描述了投入之后,可以持续带来收入的时间。
-
+
持续性高的资产
--- src/building-software-products-or-services-from-scratch.md
@@ -5,7 +5,7 @@
产品流程
----
-
+
更适合一人企业的产品流程
@@ -39,7 +39,7 @@
接下来,我们就来看看,怎么从这个还有些模糊的想法中提出一个明确的价值主张,然后围绕它进行商业模式规划、功能和界面设计、验证和迭代开发,最终使其成为一个商业产品。
-
+
福利单词APP
@@ -52,7 +52,7 @@
### 价值主张
-
+
福利单词的价值主张
@@ -64,7 +64,7 @@
### 客户细分
-
+
福利单词的客户细分
@@ -78,7 +78,7 @@
### 价值主张的细化
-
+
细化后的福利单词价值主张
@@ -98,7 +98,7 @@
### 渠道通路
-
+
福利单词的渠道通路
@@ -106,7 +106,7 @@
### 客户关系
-
+
福利单词的客户关系
@@ -114,7 +114,7 @@
### 关键活动
-
+
福利单词的关键活动
@@ -158,7 +158,7 @@
### 成本与收益
-
+
福利单词的成本和收益
@@ -168,7 +168,7 @@
最后我们来看看完整的商业模式画布:
-
+
福利单词的完整商业画布
@@ -209,13 +209,13 @@
很多书里面都强调说,用户画像的头像要尽可能真实,最好用真人头像。但需要注意在网上乱找真人头像容易导致肖像权问题,这里给大家推荐一个通过AI生成真人头像的网站,叫做 thispersondoesnotexist.com。
-
+
thispersondoesnotexist.com
不过这个网站生成的多是欧美人,对国内的产品来讲,反而各种违和。我更喜欢使用日系的动漫捏脸网站来做,比如 [charat.me](https://charat.me/) 这个网站。
-
+
charat.me
@@ -223,15 +223,15 @@ charat.me
有了头像,再配上角色的说明和需求关键字,我们就有了一个简单好用的用户画像。下边是我们制作好的三个画像:
-
+
用户画像:王小康
-
+
用户画像:章小留
-
+
用户画像:卢小白
@@ -359,7 +359,7 @@ charat.me
- 图鉴模式:放到第二期,也可能是第三期。
- 语音回放:放到第二期。
-
+
使用思维导图构建功能列表
@@ -369,7 +369,7 @@ charat.me
确定好某一期的功能列表后,可以把各个功能归类到界面里。新建一个思维导图,写上显而易见的各个界面,然后把功能放到界面下去。
-
+
将功能归类到界面
@@ -384,7 +384,7 @@ charat.me
### 什么是 Adobe XD
-
+
AdobeXD
@@ -404,13 +404,13 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
点击左侧的菜单里面倒数第2个画板的按钮,
-
+
画板按钮
这时候在屏幕最右边就会出来一系列预置的画板尺寸。
-
+
画板预设
@@ -420,7 +420,7 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
然后按住 CTRL或者CMD + D,就可以直接复制画板。我们把第一个画板叫做背单词界面,然后开始设计。
-
+
复制画板
@@ -428,24 +428,24 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
先来制作背单词时,字母没有输入完时显示的遮罩效果。选择左侧工具栏中的矩形
-
+
矩形工具\
工具,画出一个覆盖全部画板的长方形。然后调节填充颜色为黑色,透明度为 30%。
-
+
遮罩的制作
然后我们到 unsplash.com 这个无版权网站上,找一只猫的图片,把它也放进来。
-
+
添加猫图
这时候猫是在遮罩上方的,所以它挡住了遮罩。
-
+
调整图层顺序
@@ -455,14 +455,14 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
接下来,在遮罩上边,我们来放上单词释义和输入框。点击最左侧工具栏中的
-
+
文字工具\
图标,切换到文字工具。
然后输入文字释义。
-
+
添加文字
@@ -470,7 +470,7 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
然后我们放上之前设计好的 Logo,加上单词输入框。
-
+
添加单词输入框
@@ -478,19 +478,19 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
#### 虚拟键盘
-
+
虚拟键盘
虚拟键盘的制作在 XD 中也很简单,直接用矩形工具绘制就行。需要注意的是圆角的做法。
-
+
圆角的设置
其实很简单,在右侧的属性设置里边,把圆角从0 改为 5 就可以了。在做好一个按钮后,我们可以按住 Shift 同时选中按钮和上边的文字,在右键菜单中将其编组(Group);然后按 CTRL或者CMD + D 就可以复制按钮。
-
+
批量分布和对齐
@@ -500,7 +500,7 @@ Adobe之前准备收购Figma,于是放弃了XD的更新,但后来又收购
再下来,我们需要在界面中引入图标。既然是矢量界面,当然是矢量图标最好。前边我们已经介绍过 thenounproject.com 了,它还为 pro 用户提供了一个客户端。在这个客户端里边可以非常方便的复制图标。
-
+
矢量图标
--- src/crowdsourcing-capability.md
@@ -13,7 +13,7 @@
为了解决这个矛盾,我们需要拥有众包的能力。简单来说,众包就是将任务分包给很多人,通常是我们产品的用户。在这种方式下,我们不需要雇佣员工,所以员工数量不会增加,可以很好地满足我们一人企业在规模上面的要求。
-
+
众包能力在基础设施中的位置和重要性
@@ -24,7 +24,7 @@
首先,并不是所有的任务都可以众包,它们需要非常明确,而我们分包给的任务执行者需要具备相应的能力、时间和意愿。所以总体来讲,我们需要处理掉所有自己、自动化和AI能处理的事情,最后将处理不了的部分拿出来进行众包。
-
+
众包系统的构成要件
--- src/define-opb.md
@@ -9,13 +9,13 @@
在这里,我想给出一个我自己的定义:「一人企业是以个体或个人品牌为主导的业务体」。
-
+
什么是一人企业
之所以选择称之为「业务体」,而非「公司」,是因为存在一个普遍的误解,即许多人认为必须要注册一家公司。尽管后来我将其称之为「企业」,这种误解仍旧存在。
-
+
一人企业的基本定义
@@ -29,7 +29,7 @@
很多人把「一人企业」和「个体户」划上等号,这是不对的。它们有根本上的不同。
-
+
一人企业和个体户的区别
@@ -37,7 +37,7 @@
### 一人企业不等于创业公司
-
+
一人企业和创业公司的区别
@@ -66,13 +66,13 @@
在详细讨论一人企业的定义之后,我们来看一下一人企业的特点。
-
+
一人企业的特点
它有三个显著的特点:员工数量少,资源有限,以及面对的竞争环境是以小博大。
-
+
员工少的优缺点
@@ -96,7 +96,7 @@
### 资源有限
-
+
资源有限的优缺点
@@ -108,7 +108,7 @@
50个工时意味着什么?举个例子,如果你要制作一个网页并添加细节,可能需要大约10个小时,而这还不包括测试和后期的一些优化工作。也就是说,如果以网站开发为例,一周大约可以完成3-5个页面。简略计算,两个小功能大约就能占用一周的时间。
-
+
方糖多钱外包报价计算器
@@ -134,7 +134,7 @@
### 以小博大
-
+
一人企业必须以小博大
--- src/discovery-of-by-product-advantages.md
@@ -19,7 +19,7 @@
### 工作流和方法论
-
+
主业最大的副产品:工作流和方法论
@@ -31,7 +31,7 @@
比如书籍出版,不但适合销售方法论,也是获得行业影响力的最佳渠道之一。甚至即使作为译者、也能「大树底下好乘凉」。
-
+
《故事写作大师班》
@@ -49,7 +49,7 @@
这个原本只是为了充分利用副产品的举措,后来帮助我度过了全职独立开发时没有收入的那一段时间。我后来又将录制和制作技术教程的方法论做成了[网课变现课](https://stack.ftqq.com/course/detail/5),实现了复合副产品优势。
-
+
网课变现课
@@ -64,7 +64,7 @@
我2018年时曾出版过一本两万字左右的《[程序员跳槽全攻略](https://read.douban.com/ebook/7611058/)》,定价在一两块钱,累计收益扣完税有6千左右。
-
+
关于电子书的正式出版,可以参考我之前写的[《如何通过互联网出版一本小书》](https://github.com/easychen/howto-make-more-money/blob/master/ch11.md)一文中关于[BookDNA](https://www.bookdna.cn/)的介绍,注意因为文章写作时间较早,部分内容可能过时,需要自行确认。
@@ -78,13 +78,13 @@
OpenAI在2023年11月推出了Custom GPT,允许用户通过指定提示词、上传文档、添加API作为工具等方式,将通用GPT定制为专用版本。并在 2024年1月10日推出了GPT商店,采用与苹果应用商店类似的商业模式,创作者可以在其中盈利。
-
+
GPT Store
经常活跃在微博上的[宝玉XP](https://weibo.com/u/1727858283)老师,经常使用GPT翻译大量科技文章和视频字幕。他将自己的方法论制作为Custom GPT后发布在 GPT 商店,在2024年1月登上了 GPT Store Writing分类的第一名。
-
+
科技文章翻译GPT
@@ -94,7 +94,7 @@ PS:可以[点这里](https://weibo.com/1727858283/4981299343787589)查看这
通常AI产品并不能完成一个完整的工作流,于是出现了各种AI工作流软件。一个典型的例子是ComfyUI,它是一个基于Stable Diffusion的流程工具,可以完整覆盖除了图片生成以外的其他工作流。
-
+
ComfyUI
@@ -110,7 +110,7 @@ ComfyUI
### 行业知识库
-
+
行业知识库
@@ -142,7 +142,7 @@ ComfyUI
在传统行业和二三线城市,人脉关系至为重要,甚至是很多人的生存之道。但作为互联网从业者,我们可能会认为,在这个行业里,人脉关系并不是主要资源,大部分的业务还是规则驱动的。但后来我意识到,光是认识的人足够多,就可以成为撮合型业务的基础。
-
+
人脉关系的副产品形式
@@ -155,7 +155,7 @@ ComfyUI
例如,有些人每天不得不花费一到两个小时在通勤上,这段时间内,他们可能会选择在地铁上刷手机。这种生活方式能够带来的副产品之一就是阅读。
-
+
湾区日报
@@ -167,7 +167,7 @@ ComfyUI
不过,看起来湾区日报在2023年11月停止更新。其挑战之一在于,尽管文章摘要使用中文,但许多链接的内容都是英文,这对于一些读者来说可能较为困难。湾区日报创建时,人工智能还不够成熟,如果我们现在考虑使用人工智能技术来重构,那么可以将其打造成足够自动化的Reading as a Service。
-
+
Reading as a Service
--- src/infrastructure-user-pool-reach-capability.md
@@ -5,7 +5,7 @@
在一人企业方法论2.0中,我们讲了三个方面。
-
+
一人企业方法论的三大内容
@@ -19,7 +19,7 @@
按照这个说法,一人企业就像图中所示,在一个一人企业里有很多一人业务。
-
+
每个业务各自为战
@@ -27,7 +27,7 @@
因为精力被分散到各个业务上,所以一人业务之间必须要协同。一些公共的功能和资源需要抽取出来,放到一个地方公用,我们称之为基础设施。这就是我们今天要重点讲的内容。
-
+
基础设施
@@ -44,7 +44,7 @@
我们测试后发现,对一人企业来说,最低成本的用户登录方案是第三方登录。
-
+
第三方登录
@@ -69,7 +69,7 @@
所以,我们做用户池必然要有触达能力,即能够给用户发送消息或以其他方式告知他,这样才能做到流量复用和产品的二次销售。
-
+
触达方式
--- src/managing-and-utilizing-uncertaint.md
@@ -9,7 +9,7 @@
这种对不确定性的前置处理,在行业中已经成为一种通用做法,也可以说是行业最佳实践。它主要通过最小可行产品(MVP)的方式实现,但也可以通过MVP的其他变体形式实现,比如落地页和众筹。MVP背后的思想是先验证核心假设,然后再进入产品开发阶段。
-
+
一人企业画布中的核心假设
@@ -21,7 +21,7 @@
最理想的情况下,一个MVP,可以同时验证多个假设。但如果无法通过一个MVP验证所有假设,我们可以开发多个MVP,每个MVP验证特定的假设。
-
+
不确定性前置
@@ -35,7 +35,7 @@
这里需要说明的是,「渠道通路」的验证本身就充满不确定性,同样的渠道,投入的市场费用不同、验证结果会不同;同样的市场费用,营销方案不同,验证结果还会不同。因此,采用众筹这种完全真实的方式进行验证,结果才更为可信。而且,众筹不但可以验证需求,还同步完成了订单(因为用户要预付款);如果我们将达标金额设置为项目的开发成本,那么可以确保开发期间的最低收益。这在全职运营一人业务时,显得极为重要。
-
+
一个通过众筹排除的需求,节省了三个月到六个月时间
@@ -50,7 +50,7 @@
反脆弱性是一个很值得深思的概念,它是由纳西姆-尼古拉斯-塔勒布(Nassim Nicholas Taleb)在同名书籍中提出概念。
-
+
《反脆弱》
@@ -74,13 +74,13 @@
《反脆弱》中还有个很重要的概念:不对称交易。来自于书中一个故事:一位哲学家为了证明哲学家也可以用知识挣钱,于是他以非常低的价格预定了附近橄榄油压榨机的使用权。当橄榄季节来临,需求激增,而其他人无法租到压榨机时,他就能以高价转租这些机器,从而获得巨额利润。赚钱之后,这位哲学家又回到了哲学世界世界中。
-
+
哲学家预定榨油机的故事
这个故事不但有趣,其核心也展示了一种反脆弱性:通过构建一个不对称的交易模式来利用不确定性。可以参考下图:
-
+
有限的损失和「无限」的收益对比图
@@ -90,7 +90,7 @@
在了解了背后的逻辑后,我们也可以自己来构建不对称交易。比如我们可以设计一个「生男生女的占卜」交易,规则是提前预付,如果结果不准那么退钱。
-
+
一个占卜生男生女的不对称交易
@@ -98,7 +98,7 @@
这个封建迷信的例子当然不建议大家去实施,不过其实如果你仔细思考,会发现其实众筹也是完全类似的逻辑。只不过加上了一些额外规则。
-
+
众筹也是一种不对称交易
@@ -131,7 +131,7 @@
为了规避这种风险,我们通常采取自建+多平台发布的策略。
-
+
自建+多发策略规避平台风险
--- src/non-competition-strategy.md
@@ -25,7 +25,7 @@
内容管理系统(CMS)是独立开发者非常喜欢涉足的一个领域,但有一个开源巨无霸 ------ WordPress。虽然很多人以为它已经过时,但根据2023年的数据,WordPress仍占据了整个网站建设市场45%的份额,并拥有6万个免费插件和9千个免费主题。这还不包括大约1万个付费主题,而这些付费主题通常价格不高。
-
+
WordPress庞大的开源生态
@@ -35,7 +35,7 @@ WordPress庞大的开源生态
可能有人会质疑,开发插件和主题能赚多少钱呢?
-
+
ThemeFroest上的畅销主题
@@ -45,7 +45,7 @@ ThemeFroest上的畅销主题
#### Notion
-
+
Notion首页
@@ -59,7 +59,7 @@ Notion是一个最近几年迅速崛起的文档平台,通过提供API,许
#### WebFlow
-
+
WebFlow首页
@@ -91,13 +91,13 @@ WebFlow首页
比如我们不会觉得《王国维点评红楼梦》和《胡适点评红楼梦》的内容是一样的,可以有很多人从不同的角度来点评和解读。
-
+
很多版本的红楼梦点评
甚至,同一个故事,以不同的方式重复讲述,只要把握技巧,读者都能听得津津有味,不会觉得千篇一律。金庸的小说已经被翻拍成电影电视剧很多次了,但每次都能吸引不少观众。
-
+
一个故事可以有很多种讲法
@@ -127,7 +127,7 @@ WebFlow首页
比如,虽然许多人可能在做相同的事情,讲述相同的内容,但由于每个人的个性和风格不同,消费者的体验也完全不同。这种差异化消除了明显的竞争关系。
-
+
个人IP成为产品特色
@@ -139,7 +139,7 @@ WebFlow首页
改变商业逻辑的一个例子是手办盲盒。本来可以花一份钱买齐全套的手办,以盲盒形式售卖,既增加了用户对已经获得手办的重复购买、也保持了用户对未获得手办的购买欲望。
-
+
以盲盒形式售卖的手办
@@ -152,7 +152,7 @@ WebFlow首页
比如,我之前尝试创造了一款介于课程和游戏之间的产品《萌猫、室友、前端课》,它可以让用户在玩游戏的过程中学习一些前端开发的知识,这就属于游戏和课程的混搭型产品。
-
+
可以学习前端知识的游戏
--- src/opb-methodology-new-version-and-author.md
@@ -7,7 +7,7 @@
经过两到三年的不断迭代,我们现在有了2.0版的「一人企业方法论」。这个名字之所以把「公司」换成了「企业」,是因为「一人公司」容易与公司法里面一人制公司产生混淆,所以我把它改成了「企业」这个词。
-
+
新版方法论
@@ -30,7 +30,7 @@
### 独立开发者
-
+
Easy开发的独立项目
@@ -48,7 +48,7 @@ Easy开发的独立项目
所以,我还有一个身份:课程主讲人。
-
+
方糖技能站首页
@@ -60,7 +60,7 @@ Easy开发的独立项目
另外一方面,我对怎么靠技术挣钱这件事情,尤其是以个体的方式、优雅地挣钱很感兴趣。最开始,我写了[《程序员如何优雅地挣零花钱》](https://github.com/easychen/howto-make-more-money)。因为之前都是面向程序员的,所以我把这些文章和电子书都发布到了GitHub上,它们累积到现在大约有16K的Star(加星)。
-
+
Easy关于副业和一人企业的分享
@@ -77,7 +77,7 @@ Easy关于副业和一人企业的分享
换个角度看,这或许也有好的方面。如果我已经成为超级个体,那么我很可能就不会花这么多时间来反思和总结方法论,并将其作为副产品分享出来。
-
+
成为超级个体过程中的几个前置节点
@@ -106,7 +106,7 @@ Easy关于副业和一人企业的分享
在本书里边,我目前能够分享的,也就是「从螺丝钉到一般个体(工作自由)」的这一部分。希望后续有机会能补全「从一般个体到超级个体」的部分。
-
+
分享的作用和意义
--- src/product-pool-and-payment-capability.md
@@ -11,7 +11,7 @@
因此,我们将这些业务整合为一系列业务组件,同时用「产品池」用来容纳它们。
-
+
从多业务到产品池+业务组件
@@ -34,7 +34,7 @@
像XorPay,这实际上是一个小微支付平台,在这个平台上,你可以接入微信和支付宝。可以自行申请开通,店面地址可以用住宅地址,店面照片可以用大门照片。另外我记得它有一个一百的开通费用,可以和客服确认下。
-
+
XorPay的FAQ中关于开通费用的说明
@@ -59,7 +59,7 @@ XorPay的FAQ中关于开通费用的说明
简单来说,它就是将每一个业务看成一个应用,然后给这个应用分配一个参数。用户带着这个参数跳转到收银页面进行支付。支付完成后,收银台使用这个订单ID进行转向,用户再拿着这个支付完成的订单ID到业务中进行验证,整个支付就完成了。
-
+
收银台的流程示意图
@@ -75,7 +75,7 @@ XorPay的FAQ中关于开通费用的说明
它提供了完善的支付和商品相关的功能,但缺少一些国内常用的支付厂商。
-
+
WooCommerce支付网关设置界面
@@ -92,7 +92,7 @@ WooCommerce支付网关设置界面
在比较主流的创业方法论中,核心验证的是两个东西:一个是价值主张,即我们的商品对于我们的目标用户到底有没有价值;第二是渠道通路,即我们的产品对这部分人有价值,但它能增长吗?它能达到我们想要的规模吗?
-
+
两个最重要的假设
@@ -130,7 +130,7 @@ WooCommerce支付网关设置界面
发货的话,做得比较简陋一点的话我们可以把软件或者视频放到网盘上,通过群公告提供给大家链接,也可以直接放到群文件里面,这个就看群对应的功能。
-
+
通过群实现众筹
@@ -144,19 +144,19 @@ WooCommerce支付网关设置界面
需要修改的地方只有两个。第一是,我们可能需要添加一个简码,用来输出这个商品的销量和是否达标的统计,比如说像下图这样。
-
+
商品众筹简码渲染效果
其次,我们需要一个批量退款按钮。但这个其实都是可选的,没有这个按钮其实也可以退款,只不过麻烦一点。
-
+
给WooCommerce添加批量退款按钮
所以呢,不要觉得众筹很麻烦,也不要觉得这个众筹必须要有系统,它是一种思维方式,希望大家把它用起来。这对一人企业和资源紧缺的小团队来讲,真的是非常非常重要的。
-
+
众筹能力本质上是支付能力
--- src/race-track-selection-for-opb.md
@@ -9,7 +9,7 @@
这种选择对于大公司和创业公司而言,可能是理所当然的,因为大公司具有人力和资本的双重杠杆,而创业公司则依靠资本杠杆。然而,对于一人企业来说,如果按照同样的逻辑去选择赛道,那么毫无疑问会直接与大公司和创业公司撞上,产生直接的竞争,这显然不是我们所希望的。
-
+
大众刚需:拥挤又易被碾压的赛道
@@ -21,13 +21,13 @@
以营养配餐为例,对于大众来说,这可能是一个弱需求,因为偶尔的营养不足和过剩并不是他们最关心的;大多数人更注重食物的口味。然而,对于母婴群体来说,营养就是他们的第一关注点、是刚性需求,远比口味更重要。
-
+
大众的弱需求,可能是小众的强需求
通过这样的策略,我们可以在大众的弱需求中找到一部分实际上是刚需的细分市场,这正是一人企业应该着重关注的。如果你发现这个细分市场依然充满竞争,那么可以进一步细分下去。一人企业的优势就是,即使这个市场非常小,但往往对一人企业来说也足够。
-
+
一人企业可以选择更为小众的市场
--- src/setup-a-one-person-business-infrastructure.md
@@ -2,7 +2,7 @@
通过前文的讨论,我们认为一人企业的基础设施需要具备三个核心容器和四项关键能力:用户池、内容池、产品池;触达能力、支付能力、自动化能力和众包能力。
-
+
基础设施的三池四能力
@@ -13,7 +13,7 @@
首先要做的选择是,是自行开发或雇人开发(外包);还是基于开源项目搭建,比如基于 WordPress 搭建。
-
+
搭建方式的选择
@@ -41,7 +41,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
另一个潜在问题是,使用WordPress运营大规模的用户或内容网站时,性能可能成问题。尽管大部分性能问题由插件导致,但这仍是一个挑战。不过我们的产品用户数距离这个规模很远,一直在努力遇到这个问题中。
-
+
搭建方式优缺点比较
@@ -49,7 +49,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
我们认为的最佳实践是,首先基于WordPress做我们业务的最小可行产品(MVP)和众筹,同时也可以把产品的官网和文档放在这个平台上。随着业务的增长和用户量的增加,我们再自行开发。
-
+
最佳实践
@@ -74,7 +74,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
#### 微信账号整合
-
+
1. 我们添加了邀请码注册功能。这样产品还没有上线时,可以进行内测,只有知道邀请码的人才能登录。
2. 因为我们要支持个人运营,所以我们通过消息上行的方式来实现未认证公众号的登录。
@@ -82,7 +82,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
#### 微信支付
-
+
1. 标准的微信支付能力设置。
2. 收银台功能,包含支付APP和订单管理。只需一个页面跳转和一个HTTP请求验证就可以完成交易。
@@ -90,7 +90,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
#### 消息推送
-
+
1. 实现了针对文章分类的订阅。您可以订阅某一个分类和该分类的更新,然后在文章发布时,可以点击推送按钮来推送给订阅该分类的用户。
2. 用户还可以通过一个管理界面来对这些订阅进行管理。
@@ -98,7 +98,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
#### 商品众筹
-
+
在WooCommerce上实现的简版众筹。
@@ -110,7 +110,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
我们来看一下回顾对比下一人企业基础设施的结构和能力。
-
+
可以看到,方糖OPB插件最整体覆盖是很高的,除了被红线框出的几处:
@@ -120,7 +120,7 @@ WordPress的REST接口也非常完善,可以通过插件在API层面实现功
### 配合 BudCoder 和 FlowDeer 使用
-
+
在搭配 BudCoder 和 FlowDeer 这两个工具后,我们可以将覆盖度进一步提高。
--- src/snowballing-and-chain-propagation.md
@@ -2,7 +2,7 @@
我们继续探讨底层逻辑。这次聊聊滚雪球和链式传播。
-
+
复利和滚雪球
@@ -17,7 +17,7 @@
即使不主动投资,在我国大多数银行的定期存款,也可以选择利息在到期后与本金合并重新存入,从而实现复利效应。当然,因为利率本身不高、本金不多、周期又太长,这种财富增长的速度差强人意。
-
+
被动理财是精灵族的天赋技能
@@ -41,7 +41,7 @@
- UGC:发布内容引发用户讨论,将用户讨论整理作为新的内容发布,再整理新内容的讨论
- 付费式增长引擎:向渠道付费为商业产品获取用户,将从用户身上变现的钱再投入渠道,获取更多用户
-
+
用已有资产收益投入新资产
@@ -52,7 +52,7 @@
链式传播:人传人
--------
-
+
链式传播
@@ -68,7 +68,7 @@
《引爆点》(The Tipping Point)这本书由马尔科姆-格拉德威尔(Malcolm Gladwell)所著,首次出版于2000年。主要探讨了小的变化如何能够引发大的社会效应,即所谓的「引爆点」理论。作者借鉴流行病学的概念,通过将社会现象和病毒的传播类比展开研究。
-
+
《引爆点》
@@ -87,7 +87,7 @@
《疯传》(Contagious: Why Things Catch On)这本书由乔纳-伯格(Jonah Berger)所著,于2013年出版。在书中,作者提出了使事物变得具有传染性的六大原则,即伯格的「STEPPS」模型。个人认为,这本书则可以看做对《引爆点》的绝佳补充,两者构成了相对完整的、又极具可操作性的链式传播理论。
-
+
《疯传》
@@ -102,13 +102,13 @@
六大原则中,关于情绪的研究非常有意思,同时也有些反直觉:满足感和悲伤并不会促进传播。
-
+
情绪的唤醒效果
这些原则已经被大量证明和使用,当我们不知道怎么策划一个传播时,可以把每一个原则都过一遍,往往会很快获得灵感。我们也了设计「链式传播画布」,帮助大家更全面地思考。
-
+
OPB链式传播画布
--- src/start-from-side-project.md
@@ -5,7 +5,7 @@
从优势开始
-----
-
+
首先,我要给大家的建议是,不要一时冲动就辞职。最佳的做法是从副业开始,从优势开始。这里涉及到两个核心优势:副产品优势和低成本优势。
@@ -34,7 +34,7 @@
对于程序员和技术人群来讲,副业在职业安全上还有额外的意义。
-
+
技术债务驱动的架构更新
@@ -49,7 +49,7 @@
一个好的模式是,从副业开始,不断增长MRR(月周期收入),逐步剥离对主业的依赖,直到副业和主业解耦以后,再考虑将副业转为主业。我们称这个模式为「渐进式成长」。
-
+
渐进式成长
--- src/structured-advantage.md
@@ -7,7 +7,7 @@
所谓的结构化优势,是指针对竞品的「固有弱点」构建的优势。固有弱点一词,源于《不对称创新》这本有点年代的书籍。尽管我不能确定该概念是否最初由此提出,我确实是在这本书中首次接触到。
-
+
《不对称创新》
@@ -15,7 +15,7 @@
固有弱点是指那些难以纠正的弱点。这种难以纠正并非因为不可见,而是由于价值网络的特性。使得即使看得到,却改不了。通常,这些弱点源自两个方面:一是商业模式;二是竞争优势。
-
+
固有弱点的常见来源
@@ -23,7 +23,7 @@
以我们之前讨论过的百视达的线上业务为例,其商业模式本身存在冲突,新模式的引入(线上业务)会摧毁旧的模式(线下门店)。这种情况下,公司会考虑,为何要为了一个(还未完全成熟的)新模式去改变旧的模式,尤其是当旧模式仍然能带来可观收入时。另一方面,即使高层下定了决心,只要价值网络没有更新,执行起来也会是困难重重。这就来自商业模式的固有弱点。
-
+
《网飞传奇》
@@ -46,7 +46,7 @@
我们来看一个对程序员讲很实用的例子。
-
+
复合副产品优势
@@ -68,7 +68,7 @@
#### 官方不愿意做
-
+
基于第三方优势的聚合红包应用
@@ -82,7 +82,7 @@
聚合是第三方优势的一个常见套路。因为它在第三方优势的基础上,往往还能为用户带来便利。
-
+
LLM结果聚合
@@ -94,7 +94,7 @@ LLM结果聚合
低成本是一人企业的一个巨大的结构化优势。我们在讨论中不停地提到一个概念,那就是「最低可行利润」。只要达到这个最低可行利润,我们就能收支平衡,即便不赚钱,如果能满足我们的兴趣和爱好,我们也可以持续地经营下去。哪怕是五年、十年,直到最终取得成功。如果是以副业形态存在的一人业务,我们甚至还可以补贴一部分成本来维持。这让我们能以长期主义的角度来思考问题。
-
+
并不是所有人都能够做长远的打算
@@ -110,7 +110,7 @@ LLM结果聚合
创业公司因为需要依靠融资来发展,将增长视为核心指标,更天然的缺乏耐心。在社交书签服务领域的Delicious和Pinboard的经历是一个很好的例子。
-
+
Delicious和Pinboard
@@ -126,7 +126,7 @@ Delicious和Pinboard的故事展示了一人企业如何在维持最低成本的
专属服务优势是一人企业的又一个结构化优势。其核心逻辑是,由于客户数量有限,我们能够为每一位客户提供个性化服务,前提是每位客户的客单价足够高。
-
+
转服服务优势
--- src/what-is-the-ideal-one-person-business-infrastructure.md
@@ -25,7 +25,7 @@
平台有自己的红线和潜规则,甚至会出现一些失误。如果不小心碰到这些问题,可能会导致封号。对于个人用户,这可能没什么问题,但对于经营者,可能会导致数年构建的业务前功尽弃,所有努力化为乌有。
-
+
### 产品形态和品牌
@@ -48,7 +48,7 @@
价格高是一个典型问题,因为绝大部分的SaaS是面向企业的。
-
+
典型SaaS服务的定价
@@ -58,7 +58,7 @@
另一个问题是很多基础设施不对个人开放,比如支付。
-
+
微信支付申请材料要求
@@ -68,13 +68,13 @@
一个潜在风险是如果我们不能自行建设基础设施,而是使用服务商架设的在线版本,会有两种潜在风险。一是企业突然倒闭,即使退钱,但我们的业务就突然崩溃,没有补救方案。
-
+
invision倒闭
另一个风险是企业突然涨价,因为业务已经在平台上,短时间内迁不出去,所以当服务商涨价时,议价能力很弱。如果我们赚到了很多钱,分润给服务商也无所谓,但对一人企业来说,如果费用过高,整体可能没有收益,甚至亏损。
-
+
企业微信突然对外部联系人收费
--- src/why-scalability-is-possible.md
@@ -7,7 +7,7 @@
新杠杆
---
-
+
《纳瓦尔宝典》
@@ -15,7 +15,7 @@
### 传统杠杆
-
+
传统杠杆
@@ -29,7 +29,7 @@
而新杠杆则是指那些复制边际成本为零的产品,如多媒体内容、版权、代码等。
-
+
新旧杠杆的对比
@@ -37,7 +37,7 @@
而新杠杆在定义上,就是冲着「无(边际)成本规划化」去的。因为对于复制和发布成本为零的产品,一旦生产完成,无论复制多少份,都不会再增加成本。这是一个简单的乘法,0乘以多少结果都是0。
-
+
新杠杆为何可以规模化
@@ -50,7 +50,7 @@
新基础设施
-----
-
+
逐渐成熟的新型基础设施
@@ -73,7 +73,7 @@
### 《百万美金的一人企业》
-
+
《百万美金的一人企业》
@@ -90,7 +90,7 @@
### Trenders.vc的报告
-[](https://trends.vc/archive/)
+[](https://trends.vc/archive/)
trends.vc趋势报告
@@ -106,7 +106,7 @@ Trenders.vc是一个针对独立开发者的趋势报告服务,以下是[Trend
### IndieHackers的案例库
-[](https://www.indiehackers.com/stories)
+[](https://www.indiehackers.com/stories)
IH案例库
--- src/why-thinking-big-is-possible.md
@@ -2,7 +2,7 @@
在「规划一人企业」部分,我想与大家分享三个重要方面的内容:底层逻辑、赛道选择,以及竞争策略。
-
+
规划一人企业的三大思考方向
@@ -23,7 +23,7 @@
时势背后,是边界的变动。
-
+
原有商业逻辑
@@ -31,7 +31,7 @@
但是,由于技术的变化、政策的调整、消费习惯的改变等因素,现在出现了新的商业逻辑。这个新的商业逻辑可能像一条新的分支,对原有的商业逻辑中造成了分流。
-
+
新的商业逻辑
@@ -44,7 +44,7 @@
边界变动的类型
-------
-
+
边界变动的常见类型
@@ -77,13 +77,13 @@
你可能会想,既然出现了新的商业路径,那么,那些处于原商业路径上的公司并非无知,他们难道看不见吗?为什么他们不参与,不将其实施呢?
-
+
《创新者的窘境》
在《创新者的窘境》一书中,有专门论述这一问题,推荐大家有空读一读。
-
+
边界变动和颠覆式创新
@@ -101,7 +101,7 @@
处于主导地位的公司往往会过度关注现有的产品、技术和客户群,而对颠覆性创新反应迟缓,从而最终被市场取代。这就是「创新者的窘境」。这一逻辑不仅适用于一人企业,也是创业公司的核心逻辑。
-
+
《网飞传奇》
|
one-person-businesses-methodology-v2.0
|
easychen
|
PHP
|
PHP
| 5,272
| 464
|
《一人企业方法论》第二版,也适合做其他副业(比如自媒体、电商、数字商品)的非技术人群。
|
easychen_one-person-businesses-methodology-v2.0
|
DOC_CHANGE
|
changes in readme
|
e6555ac5e1945deed7d5e75e9a26048621385ccb
|
2024-08-24 04:06:32
|
electron-appveyor-updater[bot]
|
build: update appveyor image to latest version (#43451) Co-authored-by: electron-appveyor-updater[bot] <161660339+electron-appveyor-updater[bot]@users.noreply.github.com>
| false
| 2
| 2
| 4
|
--- appveyor-woa.yml
@@ -29,7 +29,7 @@
version: 1.0.{build}
build_cloud: electronhq-16-core
-image: e-130.0.6672.0
+image: e-129.0.6664.0
environment:
GIT_CACHE_PATH: C:\Users\appveyor\libcc_cache
ELECTRON_OUT_DIR: Default
--- appveyor.yml
@@ -29,7 +29,7 @@
version: 1.0.{build}
build_cloud: electronhq-16-core
-image: e-130.0.6672.0
+image: e-129.0.6664.0
environment:
GIT_CACHE_PATH: C:\Users\appveyor\libcc_cache
ELECTRON_OUT_DIR: Default
|
electron
|
electron
|
C++
|
C++
| 115,677
| 15,852
|
:electron: Build cross-platform desktop apps with JavaScript, HTML, and CSS
|
electron_electron
|
PERF_IMPROVEMENT
|
Optimizing Electron's JavaScript runtime performance
|
80415a2b1287d8475cf1e1929e82e4e106b428b5
|
2023-02-16 21:07:02
|
Richard McElreath
|
lecture 14 links
| false
| 1
| 1
| 2
|
--- README.md
@@ -37,7 +37,7 @@ Note about slides: In some browsers, the slides don't show correctly. If points
| Week 04 | 27 January | Chapters 7,8,9 | [7] <[Overfitting](https://www.youtube.com/watch?v=1VgYIsANQck&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=7)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-07)> <br> [8] <[MCMC](https://www.youtube.com/watch?v=rZk2FqX2XnY&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=8)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-08)>
| Week 05 | 03 February | Chapters 10 and 11 | [9] <[Modeling Events](https://www.youtube.com/watch?v=Zi6N3GLUJmw&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=9)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-09)> <br> [10] <[Counts and Confounds](https://www.youtube.com/watch?v=jokxu18egu0&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=10)> <[Slides](https://speakerdeck.com/rmcelreath/statistical-rethinking-2023-lecture-10)>
| Week 06 | 10 February | Chapters 11 and 12 | [11] <[Ordered Categories](https://www.youtube.com/watch?v=VVQaIkom5D0&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=11)> <[Slides](https://github.com/rmcelreath/stat_rethinking_2023/raw/main/slides/Lecture_11-ord_logit.pdf)> <br> [12] <[Multilevel Models](https://www.youtube.com/watch?v=iwVqiiXYeC4&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=12)> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_12-GLMM1.pdf)>
-| Week 07 | 17 February | Chapter 13 | [13] <[Multilevel Adventures](https://www.youtube.com/watch?v=sgqMkZeslxA&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=13)> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_13-GLMM2.pdf)> <br> [14] <[Correlated Features](https://www.youtube.com/watch?v=Es44-Bp1aKo&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=14)> <[Slides](https://github.com/rmcelreath/stat_rethinking_2023/raw/main/slides/Lecture_14-GLMM3.pdf)>
+| Week 07 | 17 February | Chapter 13 | [13] <[Multilevel Adventures](https://www.youtube.com/watch?v=sgqMkZeslxA&list=PLDcUM9US4XdPz-KxHM4XHt7uUVGWWVSus&index=13)> <[Slides](https://raw.githubusercontent.com/rmcelreath/stat_rethinking_2023/main/slides/Lecture_13-GLMM2.pdf)> <br> [14] More Multilevel Models
| Week 08 | 24 February | Chapter 14 | [15] Social networks <br> [16] Gaussian Processes
| Week 09 | 03 March | Chapter 15 | [17] Measurement Error <br> [18] Missing Data
| Week 10 | 10 March | Chapters 16 and 17 | [19] Beyond GLMs: State-space Models, ODEs <br> [20] Horoscopes
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
DOC_CHANGE
|
Obvious
|
6899c2cb5ab784e6ee1ebd46397c98df0a18e905
|
2025-03-24 17:54:12
|
Jesús Sánchez Palma
|
Fixed error "method_exists(): Argument #1 ($object_or_class) must be of type object|string, array given" when using Livewire and AdminLTE. (#524) * Fixed error 'method_exists(): Argument #1 ($object_or_class) must be of type object|string, array given' when using Darryldecode\Cart\Cart. * Check if the $event variable is an object before checking if it was dispatched by NativePHP.
| false
| 4
| 0
| 4
|
--- src/Events/EventWatcher.php
@@ -14,10 +14,6 @@ class EventWatcher
Event::listen('*', function (string $eventName, array $data) {
$event = $data[0] ?? (object) null;
- if(! is_object($event)) {
- return;
- }
-
if (! method_exists($event, 'broadcastOn')) {
return;
}
|
laravel
|
nativephp
|
PHP
|
PHP
| 3,498
| 182
|
Laravel wrapper for the NativePHP framework
|
nativephp_laravel
|
BUG_FIX
|
Obvious
|
b86f2357b0546f346a889d57a56f0c42346b2fcb
|
2023-10-11 11:47:56
|
Ilkka Seppälä
|
fix: include crtp module in build (#2633)
| false
| 1
| 0
| 1
|
--- pom.xml
@@ -207,7 +207,6 @@
<module>context-object</module>
<module>thread-local-storage</module>
<module>optimistic-offline-lock</module>
- <module>crtp</module>
</modules>
<repositories>
<repository>
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
BUG_FIX
|
Obvious
|
66b4ed9552d3ff7d6f49d531c1a13f05352ffce6
| null |
freearhey
|
Added link to Senegal playlist
| false
| 3
| 0
| 3
|
--- README.md
@@ -162,6 +162,7 @@ Or select a playlist for a specific country from the list below.
| Saint Kitts and Nevis | 2 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/kn.m3u` |
| San Marino | 2 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/sm.m3u` |
| Saudi Arabia | 8 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/sa.m3u` | `http://195.154.221.171/epg/guidearab.xml.gz`
+| Senegal | 2 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/sn.m3u` |
| Serbia | 5 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/rs.m3u` | `http://epg.streamstv.me/epg/guide-exyu.xml.gz`
| Sierra Leone | 1 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/sl.m3u` |
| Singapore | 10 | `https://raw.githubusercontent.com/freearhey/iptv/master/channels/sg.m3u` |
--- index.m3u
@@ -243,6 +243,8 @@ channels/kn.m3u
channels/sm.m3u
#EXTINF:-1,Saudi Arabia
channels/sa.m3u
+#EXTINF:-1,Senegal
+channels/sn.m3u
#EXTINF:-1,Serbia
channels/rs.m3u
#EXTINF:-1,Sierra Leone
|
iptv-org_iptv.json
| null | null | null | null | null | null |
iptv-org_iptv.json
|
NEW_FEAT
|
4, added a new link
|
5df0b526328aebc6c5a511a8c0590b7408157083
|
2024-03-10 11:58:52
|
Zezhong Li
|
Update README.md
| false
| 7
| 6
| 13
|
--- README.md
@@ -58,8 +58,8 @@ NOTE: the ranking has no particular order.
## [Yasuko Matsubara](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/) & [Yasushi Sakurai](https://www.dm.sanken.osaka-u.ac.jp/~yasushi/) (from [Sakurai & Matsubara Lab](https://www.dm.sanken.osaka-u.ac.jp/))
| TYPE | Venue | Paper Title and Paper Interpretation | Code |
| :----------------------------------------------------------: | :-----------: | :----------------------------------------------------------: | :----------------------------------------------------------: |
-|  | *WWW '24* | Dynamic Multi-Network Mining of Tensor Time Series 🌟 | [DMM](https://github.com/KoheiObata/DMM) |
-|  | *WWW '23* | Fast and Multi-aspect Mining of Complex Time-stamped Event Streams 🌟 | [CubeScope](https://github.com/kotaNakm/CubeScope) |
+|  | WWW '24 | Dynamic Multi-Network Mining of Tensor Time Series 🌟 | [DMM](https://github.com/KoheiObata/DMM) |
+|  | WWW '23 | Fast and Multi-aspect Mining of Complex Time-stamped Event Streams 🌟 | [CubeScope](https://github.com/kotaNakm/CubeScope) |
|  | *KDD '22* | Fast Mining and Forecasting of Co-evolving Epidemiological Data Streams 🌟 | None |
|  | *CIKM '22* | Modeling Dynamic Interactions over Tensor Streams | [Dismo](https://github.com/kokikwbt/dismo) |
|  | *CIKM '22* | Mining Reaction and Diffusion Dynamics in Social Activities 🌟 | None |
@@ -70,7 +70,7 @@ NOTE: the ranking has no particular order.
|  | *CIKM '19* | Automatic Sequential Pattern Mining in Data Streams | None |
|  | *KDD '16* | Regime Shifts in Streams: Real-time Forecasting of Co-evolving Time Sequences | [RegimeCast](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/regimecast.zip) |
|  | *WWW '16* | Non-linear mining of competing local activities | [CompCube](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/compcube.zip) |
-|  | *WWW '15* | The web as a jungle: Non-linear dynamical systems for co-evolving online activities 🌟 | [Ecoweb & dataset](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/ecoweb.zip) |
+|  | WWW '15 | The web as a jungle: Non-linear dynamical systems for co-evolving online activities 🌟 | [Ecoweb & dataset](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/ecoweb.zip) |
|  | *SIGMOD '14* | AutoPlait Automatic Mining of Co-evolving Time Sequences 🌟 | [AutoPlait](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/autoplait.zip) |
|  | *ICDM '14* | Fast and Exact Monitoring of Co-evolving Data Streams | None |
|  | *KDD '14* | FUNNEL Automatic Mining of Spatially Coevolving Epidemics | [Funnel](https://www.dm.sanken.osaka-u.ac.jp/~yasuko/SRC/funnel.zip) |
@@ -150,7 +150,7 @@ NOTE: the ranking has no particular order.
| TYPE | Venue | Paper Title and Paper Interpretation | Code |
| :----------------------------------------------------------: | :-------: | :----------------------------------------------------------: | :----------------------------------------------------------: |
-|  | *WWW '21* | Tensor time-series forecasting and anomaly detection with augmented causality | None |
+|  | *WWW '21* | Network of Tensor Time Series | [NET3](https://github.com/baoyujing/NET3) |
|  | *SDM '15* | Fast Mining of a Network of Coevolving Time Series | [dcmf](https://github.com/kokikwbt/dcmf/tree/master) (Unofficial)  |
|  | *KDD '15* | Facets: Fast comprehensive mining of coevolving high-order time | [facets](https://github.com/kokikwbt/facets) (Unofficial)  |
@@ -158,13 +158,12 @@ NOTE: the ranking has no particular order.
| TYPE | Venue | Paper Title and Paper Interpretation | Code |
| :----------------------------------------------------------: | :---------------------------: | :----------------------------------------------------------: | :----------------------------------------------------------: |
-|  | *Arxiv'24* | Network of Tensor Time Series | [NET3](https://github.com/baoyujing/NET3) |
|  | *WWW '24* | E2Usd: Efficient-yet-effective Unsupervised State Detection for Multivariate Time Series 🌟 | [E2Usd](https://github.com/AI4CTS/E2Usd) |
|  | *Information Fusion '24* | MultiBEATS Blocks of eigenvalues algorithm for multivariate time series dimensionality reduction 🌟 | [MultiBEATS](https://github.com/auroragonzalez/multiBEATS) |
|  | *Information Sciences '24* | Memetic segmentation based on variable lag aware for multivariate time series 🌟 | None |
|  | *TKDE '23* | Change Point Detection in Multi-channel Time Series via a Time-invariant Representation 🌟 | [MC-TIRE](https://github.com/caozhenxiang/MC-TIRE) |
|  | *TII '23* | A Boundary Consistency-Aware Multitask Learning Framework for Joint Activity Segmentation and Recognition With Wearable Sensors | [Coming soom](https://github.com/xspc/Segmentation-Sensor-based-HAR) 🙃 |
-|  | *SIGMOD '23* | Time2State: An Unsupervised Framework for Inferring the Latent States in Time Series Data 🌟 | [Time2State](https://github.com/Lab-ANT/Time2State) |
+|  | SIGMOD '23 | Time2State: An Unsupervised Framework for Inferring the Latent States in Time Series Data 🌟 | [Time2State](https://github.com/Lab-ANT/Time2State) |
|  | *TKDD '23* | Modeling Regime Shifts in Multiple Time Series | None |
|  | *World Wide Web '23* | Anomaly and change point detection for time series with concept drift | None |
|  | *EAAI '23* | PrecTime A deep learning architecture for precise time series segmentation in industrial manufacturing operations | None |
@@ -176,7 +175,7 @@ NOTE: the ranking has no particular order.
|  | *ICDM '22* | Change Detection with Probabilistic Models on Persistence Diagrams | None |
|  | *EAAI '22* | Graft : A graph based time series data mining framework | None |
|  | *GLOBECOM '22* | Multi-level Contrast Network for Wearables-based Joint Activity Segmentation and Recognition | None |
-|  | *ESWA '22* | Real-time Change-Point Detection A deep neural network-based adaptive approach for detecting changes in multivariate time series data | None |
+|  | ESWA '22 | Real-time Change-Point Detection A deep neural network-based adaptive approach for detecting changes in multivariate time series data | None |
|  | *npj digital medicine '21* | U-Sleep: resilient high-frequency sleep staging 🌟 | [website](https://sleep.ai.ku.dk/) |
|  | *IEEE TSP '21* | Change Point Detection in Time Series Data Using Autoencoders With a Time-Invariant Representation 🌟 | [TIRE](https://github.com/deryckt/TIRE) |
|  | *IJCNN '21* | A Transferable Technique for Detecting and Localising Segments of Repeating Patterns in Time series | None |
|
awesome-time-series-segmentation-papers
|
lzz19980125
|
MATLAB
|
MATLAB
| 454
| 8
|
This repository contains a reading list of papers on Time Series Segmentation. This repository is still being continuously improved.
|
lzz19980125_awesome-time-series-segmentation-papers
|
DOC_CHANGE
|
changes in readme
|
5023f018948134baada0095b120d49649cdc7869
|
2025-04-06T00:23:17Z
|
chromium-autoroll
|
Roll Chrome Android ARM64 PGO Profile Roll Chrome Android ARM64 PGO profile from chrome-android64-main-1743881103-29341d17e8c4b7dd2d9b75d012096d6431bcfab8-078ab5c5b44ea16a4f84c2af9f982cc81dd16787.profdata to chrome-android64-main-1743888469-68100959d1ba6b5d7bbf78a65e462ce185c50f45-b434dbc308bd199ed14346e85c588e0cb8d49aaa.profdata If this roll has caused a breakage, revert this CL and stop the roller using the controls here: https://autoroll.skia.org/r/pgo-android-arm64-chromium Please CC [email protected],[email protected] on the revert to ensure that a human is aware of the problem. To file a bug in Chromium main branch: https://bugs.chromium.org/p/chromium/issues/entry To report a problem with the AutoRoller itself, please file a bug: https://issues.skia.org/issues/new?component=1389291&template=1850622 Documentation for the AutoRoller is here: https://skia.googlesource.com/buildbot/+doc/main/autoroll/README.md Tbr: [email protected] Merge-Approval-Bypass: Chrome autoroller Change-Id: I096dead6b9def291d88fca759a1a90030fe63fea Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6436233 Commit-Queue: chromium-autoroll <[email protected]> Bot-Commit: chromium-autoroll <[email protected]> Cr-Commit-Position: refs/heads/main@{#1443146}
| false
| 1
| 1
| 2
|
--- chrome/build/android-arm64.pgo.txt
@@ -1 +1 @@
-chrome-android64-main-1743881103-29341d17e8c4b7dd2d9b75d012096d6431bcfab8-078ab5c5b44ea16a4f84c2af9f982cc81dd16787.profdata
+chrome-android64-main-1743888469-68100959d1ba6b5d7bbf78a65e462ce185c50f45-b434dbc308bd199ed14346e85c588e0cb8d49aaa.profdata
|
chromium
| null |
C
|
C
| null | null |
Browser
|
_chromium
|
CONFIG_CHANGE
|
Only config file changes have been made.
|
d3cd50b39fc61bf2e86edf3892331663a09dc253
|
2023-08-09 05:46:02
|
Jelly Lee
|
Update README.md
| false
| 32
| 0
| 32
|
--- docs/llm-base/distribution-train/auto-parallel/README.md
@@ -1,32 +0,0 @@
-
-
-
-
-北大河图大模型自动并行训练工具Galvatron:https://zhuanlan.zhihu.com/p/591924340
-
-大模型的自动并行之难主要体现在以下三个方面:
-
-(1)多样性:首先,在并行方式方面,目前大模型的并行方式呈现出百花齐放的态势,
-即使是对于同一个算子,不考虑混合并行方式,不同的基础并行方式也会存在显著的差异,从而导致不同的内存开销、通信代价以及计算效率。
-其次,在模型方面,各种各样的模型架构最近也是层出不穷,这往往也伴随着不同的模型配置(例如不同输入序列长度,模型层数,模型隐层宽度等),从而造成计算负载上的差异。
-另外,在硬件方面,用户往往面临着非常差异化的集群环境,可能会面临不同的内存容量、通信带宽、计算能力等等。
-总体上来看,由于上述多样性的存在,没有哪种并行技术总是能够获得最佳训练效率,“自动并行”也就成为了分布式训练的核心挑战。
-
-(2)复杂性:上述分析还相对比较单一,实际上哪怕是对于同一个算子也可以同时应用多种不同的基础并行方式,
-如果考虑到由这些基础并行方式复合所构成的混合并行方式,则会导致问题变得非常复杂。
-更重要的是,大模型的计算图往往结构非常庞大,对应的也需要更大规模的集群,如果对每个算子都进行探索(包括选取集群中合适的计算资源以及设计相应的混合并行方式),
-会带来组合空间爆炸的问题,寻找整个模型的最优分布式执行方案变得难以求解。
-
-
-(3)实用性:除此之外,实用性也是非常重要的问题。
-一方面,在进行自动并行搜索的过程中,对于各种分布式执行方案,必须提供比较精确的内存、通信、计算开销,
-否则会导致结果与实际执行偏差过大,产生次优解或者根本无法使用。
-为此,就需要非常精准的代价模型,对不同的模型结构和硬件条件进行建模。
-另一方面,系统提供自动并行能力所带来的额外时间开销必须在一个可以接受的范围内,过于高昂的搜索代价同样也无法接受。
-
-
-
-
-
-
-
|
llm-action
|
liguodongiot
|
HTML
|
HTML
| 15,588
| 1,812
|
本项目旨在分享大模型相关技术原理以及实战经验(大模型工程化、大模型应用落地)
|
liguodongiot_llm-action
|
DOC_CHANGE
|
changes in readme
|
9efb31a2f02c1f61b637eec1fa034fab33e2cd0a
|
2022-11-10 21:38:57
|
Serhiy Mytrovtsiy
|
feat: added foreground color to the battery widget when the inner percentage is enabled (iOS-like style)
| false
| 17
| 9
| 26
|
--- Kit/Widgets/Battery.swift
@@ -147,27 +147,15 @@ public class BatterykWidget: WidgetWrapper {
if let percentage = self.percentage {
let maxWidth = batterySize.width - offset*2 - borderWidth*2 - 1
- let innerWidth: CGFloat = max(1, maxWidth * CGFloat(percentage))
- let innerOffset: CGFloat = -offset + borderWidth + 1
+ var innerWidth: CGFloat = max(1, maxWidth * CGFloat(percentage))
var colorState = self.colorState
- var color = percentage.batteryColor(color: colorState)
- if self.lowPowerModeState {
- color = percentage.batteryColor(color: colorState, lowPowerMode: self.lowPowerMode)
- }
if self.additional == "innerPercentage" && !self.ACStatus {
+ innerWidth = maxWidth
colorState = false
-
- let innerUnderground = NSBezierPath(roundedRect: NSRect(
- x: batteryFrame.bounds.origin.x + innerOffset,
- y: batteryFrame.bounds.origin.y + innerOffset,
- width: maxWidth,
- height: batterySize.height - offset*2 - borderWidth*2 - 1
- ), xRadius: 1, yRadius: 1)
- color.withAlphaComponent(0.65).set()
- innerUnderground.fill()
}
+ let innerOffset: CGFloat = -offset + borderWidth + 1
let inner = NSBezierPath(roundedRect: NSRect(
x: batteryFrame.bounds.origin.x + innerOffset,
y: batteryFrame.bounds.origin.y + innerOffset,
@@ -175,7 +163,11 @@ public class BatterykWidget: WidgetWrapper {
height: batterySize.height - offset*2 - borderWidth*2 - 1
), xRadius: 1, yRadius: 1)
- color.set()
+ if self.lowPowerModeState {
+ percentage.batteryColor(color: colorState, lowPowerMode: self.lowPowerMode).set()
+ } else {
+ percentage.batteryColor(color: colorState).set()
+ }
inner.fill()
if self.additional == "innerPercentage" && !self.ACStatus {
@@ -188,7 +180,7 @@ public class BatterykWidget: WidgetWrapper {
]
let value = "\(Int((percentage.rounded(toPlaces: 2)) * 100))"
- let rect = CGRect(x: inner.bounds.origin.x, y: (Constants.Widget.height-10)/2, width: maxWidth, height: 8)
+ let rect = CGRect(x: inner.bounds.origin.x, y: (Constants.Widget.height-10)/2, width: innerWidth, height: 8)
let str = NSAttributedString.init(string: value, attributes: attributes)
ctx.saveGState()
|
stats
|
exelban
|
Swift
|
Swift
| 29,655
| 950
|
macOS system monitor in your menu bar
|
exelban_stats
|
NEW_FEAT
|
Obvious
|
5640d641d6066a85439d7027b960ebc2cebd37de
| null |
Thomas Aylott
|
skip Worker test unless the browser supports them
| false
| 1
| 1
| 0
|
--- ReactWebWorker-test.js
@@ -22,7 +22,7 @@
"use strict";
describe('ReactWebWorker', function() {
- it('can run React in a web worker', function() {
+ ;(typeof Worker == 'undefined' ? xit : it)('can run React in a web worker', function() {
var done = false;
var error;
|
facebook_react.json
| null | null | null | null | null | null |
facebook_react.json
|
BUG_FIX
|
3, most likely, as we are skipping tests that were previously enabled
|
43b415c0321016ae91570d9d46ed37b6f5931520
|
2022-11-08 12:57:08
|
longpanda
|
Fix an issue that CGI backup tool can not open in WePE.
| false
| 2
| 8
| 10
|
--- INSTALL/ventoy/vtoyjump32.exe
Binary files a/INSTALL/ventoy/vtoyjump32.exe and b/INSTALL/ventoy/vtoyjump32.exe differ
--- INSTALL/ventoy/vtoyjump64.exe
Binary files a/INSTALL/ventoy/vtoyjump64.exe and b/INSTALL/ventoy/vtoyjump64.exe differ
--- vtoyjump/vtoyjump/vtoyjump.c
@@ -46,7 +46,7 @@ static CHAR g_prog_name[MAX_PATH];
#define VTOY_PECMD_PATH "X:\\Windows\\system32\\ventoy\\PECMD.EXE"
#define ORG_PECMD_PATH "X:\\Windows\\system32\\PECMD.EXE"
-#define ORG_PECMD_BK_PATH "X:\\Windows\\system32\\VTOYJUMP.EXE"
+#define ORG_PECMD_BK_PATH "X:\\Windows\\system32\\PECMD.EXE_BACK.EXE"
#define WIMBOOT_FILE "X:\\Windows\\system32\\vtoy_wimboot"
#define WIMBOOT_DONE "X:\\Windows\\system32\\vtoy_wimboot_done"
@@ -2533,7 +2533,7 @@ int real_main(int argc, char **argv)
if (_stricmp(g_prog_name, "winpeshl.exe") != 0 && IsFileExist("ventoy\\%s", g_prog_name))
{
- sprintf_s(NewFile, sizeof(NewFile), "%s\\VTOYJUMP.EXE", g_prog_dir);
+ sprintf_s(NewFile, sizeof(NewFile), "%s_BACK.EXE", g_prog_full_path);
MoveFileA(g_prog_full_path, NewFile);
Log("Move <%s> to <%s>", g_prog_full_path, NewFile);
@@ -2578,6 +2578,12 @@ int real_main(int argc, char **argv)
sprintf_s(LunchFile, sizeof(LunchFile), "%s", "cmd.exe");
}
+ if (IsFileExist(ORG_PECMD_BK_PATH))
+ {
+ Log("Delete backup file <%s>", ORG_PECMD_BK_PATH);
+ vtoy_cmd_delete_file(ORG_PECMD_BK_PATH);
+ }
+
Log("Backup log at this point");
CopyFileA(LOG_FILE, "X:\\Windows\\ventoy.backup", TRUE);
|
ventoy
|
ventoy
|
C
|
C
| 65,265
| 4,197
|
A new bootable USB solution.
|
ventoy_ventoy
|
BUG_FIX
|
correcting display behavior under Wayland
|
29a711a542656e665cf1de98e5f60352dc9bf228
|
2022-11-04 14:04:44
|
TheBestFlyingPig
|
docs: supplement RocketMQ's message consumption failure scenarios and common solutions (#280)
| false
| 41
| 0
| 41
|
--- docs/high-concurrency/how-to-ensure-the-reliable-transmission-of-messages.md
@@ -165,44 +165,3 @@ RabbitMQ 如果丢失了数据,主要是因为你消费的时候,**刚消费
#### 生产者会不会弄丢数据?
如果按照上述的思路设置了 `acks=all` ,一定不会丢,要求是,你的 leader 接收到消息,所有的 follower 都同步到了消息之后,才认为本次写成功了。如果没满足这个条件,生产者会自动不断的重试,重试无限次。
-
-### RocketMQ
-
-#### 消息丢失的场景
-
-1. 生产者发送消息到 MQ 有可能丢失消息
-2. MQ 收到消息后写入硬盘可能丢失消息
-3. 消息写入硬盘后,硬盘坏了丢失消息
-4. 消费者消费 MQ 也可能丢失消息
-5. 整个 MQ 节点挂了丢失消息
-
-#### 生产者发送消息时如何保证不丢失?
-
-解决发送时消息丢失的问题可以采用 RocketMQ 自带的**事物消息**机制
-
-事物消息原理:首先生产者会发送一个**half 消息**(对原始消息的封装),该消息对消费者不可见,MQ 通过 ACK 机制返回消息接受状态, 生产者执行本地事务并且返回给 MQ 一个状态(Commit、RollBack 等),如果是 Commit 的话 MQ 就会把消息给到下游, RollBack 的话就会丢弃该消息,状态如果为 UnKnow 的话会过一段时间回查本地事务状态,默认回查 15 次,一直是 UnKnow 状态的话就会丢弃此消息。
-
-为什么先发一个 half 消息,作用就是先判断下 MQ 有没有问题,服务正不正常。
-
-#### MQ 收到消息后写入硬盘如何保证不丢失?
-
-数据存盘绕过缓存,改为同步刷盘,这一步需要修改 Broker 的配置文件,将 flushDiskType 改为 SYNC_FLUSH 同步刷盘策略,默认的是 ASYNC_FLUSH 异步刷盘,一旦同步刷盘返回成功,那么就一定保证消息已经持久化到磁盘中了。
-
-#### 消息写入硬盘后,硬盘坏了如何保证不丢失?
-
-为了保证磁盘损坏导致丢失数据,RocketMQ 采用主从机构,集群部署,Leader 中的数据在多个 Follower 中都存有备份,防止单点故障导致数据丢失。
-
-Master 节点挂了怎么办?Master 节点挂了之后 DLedger 登场
-
-- 接管 MQ 的 commitLog
-- 选举从节点
-- 文件复制 uncommited 状态 多半从节点收到之后改为 commited
-
-#### 消费者消费 MQ 如何保证不丢失?
-
-1. 如果是网络问题导致的消费失败可以进行重试机制,默认每条消息重试 16 次
-2. 多线程异步消费失败,MQ 认为已经消费成功但是实际上对于业务逻辑来说消息是没有落地的,解决方案就是按照 mq 官方推荐的先执行本地事务再返回成功状态。
-
-#### 整个 MQ 节点挂了如何保证不丢失?
-
-这种极端情况可以消息发送失败之后先存入本地,例如放到缓存中,另外启动一个线程扫描缓存的消息去重试发送。
|
advanced-java
|
doocs
|
Java
|
Java
| 77,149
| 19,158
|
😮 Core Interview Questions & Answers For Experienced Java(Backend) Developers | 互联网 Java 工程师进阶知识完全扫盲:涵盖高并发、分布式、高可用、微服务、海量数据处理等领域知识
|
doocs_advanced-java
|
DOC_CHANGE
|
Obvious
|
0e1de78fca849c135fd00cd85b5b87920e46e50d
|
2024-05-06 14:25:42
|
guybe7
|
XREADGROUP from PEL should not affect server.dirty (#13251) Because it does not cause any propagation (arguably it should, see the
comment in the tcl file)
The motivation for this fix is that in 6.2 if dirty changed without
propagation inside MULTI/EXEC it would cause propagation of EXEC only,
which would result in the replica sending errors to its master
| false
| 93
| 12
| 105
|
--- src/stream.h
@@ -116,7 +116,7 @@ struct client;
stream *streamNew(void);
void freeStream(stream *s);
unsigned long streamLength(const robj *subject);
-size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end, size_t count, int rev, streamCG *group, streamConsumer *consumer, int flags, streamPropInfo *spi, unsigned long *propCount);
+size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end, size_t count, int rev, streamCG *group, streamConsumer *consumer, int flags, streamPropInfo *spi);
void streamIteratorStart(streamIterator *si, stream *s, streamID *start, streamID *end, int rev);
int streamIteratorGetID(streamIterator *si, streamID *id, int64_t *numfields);
void streamIteratorGetField(streamIterator *si, unsigned char **fieldptr, unsigned char **valueptr, int64_t *fieldlen, int64_t *valuelen);
--- src/t_stream.c
@@ -1657,7 +1657,7 @@ void streamPropagateConsumerCreation(client *c, robj *key, robj *groupname, sds
#define STREAM_RWR_RAWENTRIES (1<<1) /* Do not emit protocol for array
boundaries, just the entries. */
#define STREAM_RWR_HISTORY (1<<2) /* Only serve consumer local PEL. */
-size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end, size_t count, int rev, streamCG *group, streamConsumer *consumer, int flags, streamPropInfo *spi, unsigned long *propCount) {
+size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end, size_t count, int rev, streamCG *group, streamConsumer *consumer, int flags, streamPropInfo *spi) {
void *arraylen_ptr = NULL;
size_t arraylen = 0;
streamIterator si;
@@ -1666,8 +1666,6 @@ size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end
int propagate_last_id = 0;
int noack = flags & STREAM_RWR_NOACK;
- if (propCount) *propCount = 0;
-
/* If the client is asking for some history, we serve it using a
* different function, so that we return entries *solely* from its
* own PEL. This ensures each consumer will always and only see
@@ -1766,7 +1764,6 @@ size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end
robj *idarg = createObjectFromStreamID(&id);
streamPropagateXCLAIM(c,spi->keyname,group,spi->groupname,idarg,nack);
decrRefCount(idarg);
- if (propCount) (*propCount)++;
}
}
@@ -1774,10 +1771,8 @@ size_t streamReplyWithRange(client *c, stream *s, streamID *start, streamID *end
if (count && count == arraylen) break;
}
- if (spi && propagate_last_id) {
+ if (spi && propagate_last_id)
streamPropagateGroupID(c,spi->keyname,group,spi->groupname);
- if (propCount) (*propCount)++;
- }
streamIteratorStop(&si);
if (arraylen_ptr) setDeferredArrayLen(c,arraylen_ptr,arraylen);
@@ -1813,7 +1808,7 @@ size_t streamReplyWithRangeFromConsumerPEL(client *c, stream *s, streamID *start
streamID thisid;
streamDecodeID(ri.key,&thisid);
if (streamReplyWithRange(c,s,&thisid,&thisid,1,0,NULL,NULL,
- STREAM_RWR_RAWENTRIES,NULL,NULL) == 0)
+ STREAM_RWR_RAWENTRIES,NULL) == 0)
{
/* Note that we may have a not acknowledged entry in the PEL
* about a message that's no longer here because was removed
@@ -2129,7 +2124,7 @@ void xrangeGenericCommand(client *c, int rev) {
addReplyNullArray(c);
} else {
if (count == -1) count = 0;
- streamReplyWithRange(c,s,&startid,&endid,count,rev,NULL,NULL,0,NULL,NULL);
+ streamReplyWithRange(c,s,&startid,&endid,count,rev,NULL,NULL,0,NULL);
}
}
@@ -2391,13 +2386,12 @@ void xreadCommand(client *c) {
addReplyBulk(c,c->argv[streams_arg+i]);
int flags = 0;
- unsigned long propCount = 0;
if (noack) flags |= STREAM_RWR_NOACK;
if (serve_history) flags |= STREAM_RWR_HISTORY;
streamReplyWithRange(c,s,&start,NULL,count,0,
groups ? groups[i] : NULL,
- consumer, flags, &spi, &propCount);
- if (propCount) server.dirty++;
+ consumer, flags, &spi);
+ if (groups) server.dirty++;
}
}
@@ -3304,7 +3298,7 @@ void xclaimCommand(client *c) {
if (justid) {
addReplyStreamID(c,&id);
} else {
- serverAssert(streamReplyWithRange(c,o->ptr,&id,&id,1,0,NULL,NULL,STREAM_RWR_RAWENTRIES,NULL,NULL) == 1);
+ serverAssert(streamReplyWithRange(c,o->ptr,&id,&id,1,0,NULL,NULL,STREAM_RWR_RAWENTRIES,NULL) == 1);
}
arraylen++;
@@ -3479,7 +3473,7 @@ void xautoclaimCommand(client *c) {
if (justid) {
addReplyStreamID(c,&id);
} else {
- serverAssert(streamReplyWithRange(c,o->ptr,&id,&id,1,0,NULL,NULL,STREAM_RWR_RAWENTRIES,NULL,NULL) == 1);
+ serverAssert(streamReplyWithRange(c,o->ptr,&id,&id,1,0,NULL,NULL,STREAM_RWR_RAWENTRIES,NULL) == 1);
}
arraylen++;
count--;
@@ -3703,18 +3697,18 @@ void xinfoReplyWithStreamInfo(client *c, stream *s) {
end.ms = end.seq = UINT64_MAX;
addReplyBulkCString(c,"first-entry");
emitted = streamReplyWithRange(c,s,&start,&end,1,0,NULL,NULL,
- STREAM_RWR_RAWENTRIES,NULL,NULL);
+ STREAM_RWR_RAWENTRIES,NULL);
if (!emitted) addReplyNull(c);
addReplyBulkCString(c,"last-entry");
emitted = streamReplyWithRange(c,s,&start,&end,1,1,NULL,NULL,
- STREAM_RWR_RAWENTRIES,NULL,NULL);
+ STREAM_RWR_RAWENTRIES,NULL);
if (!emitted) addReplyNull(c);
} else {
/* XINFO STREAM <key> FULL [COUNT <count>] */
/* Stream entries */
addReplyBulkCString(c,"entries");
- streamReplyWithRange(c,s,NULL,NULL,count,0,NULL,NULL,0,NULL,NULL);
+ streamReplyWithRange(c,s,NULL,NULL,count,0,NULL,NULL,0,NULL);
/* Consumer groups */
addReplyBulkCString(c,"groups");
--- tests/unit/type/stream-cgroups.tcl
@@ -1009,68 +1009,6 @@ start_server {
assert_error "*NOGROUP*" {r XGROUP CREATECONSUMER mystream mygroup consumer}
}
- test {XREADGROUP of multiple entries changes dirty by one} {
- r DEL x
- r XADD x 1-0 data a
- r XADD x 2-0 data b
- r XADD x 3-0 data c
- r XADD x 4-0 data d
- r XGROUP CREATE x g1 0
- r XGROUP CREATECONSUMER x g1 Alice
-
- set dirty [s rdb_changes_since_last_save]
- set res [r XREADGROUP GROUP g1 Alice COUNT 2 STREAMS x ">"]
- assert_equal $res {{x {{1-0 {data a}} {2-0 {data b}}}}}
- set dirty2 [s rdb_changes_since_last_save]
- assert {$dirty2 == $dirty + 1}
-
- set dirty [s rdb_changes_since_last_save]
- set res [r XREADGROUP GROUP g1 Alice NOACK COUNT 2 STREAMS x ">"]
- assert_equal $res {{x {{3-0 {data c}} {4-0 {data d}}}}}
- set dirty2 [s rdb_changes_since_last_save]
- assert {$dirty2 == $dirty + 1}
- }
-
- test {XREADGROUP from PEL does not change dirty} {
- # Techinally speaking, XREADGROUP from PEL should cause propagation
- # because it change the delivery count/time
- # It was decided that this metadata changes are too insiginificant
- # to justify propagation
- # This test covers that.
- r DEL x
- r XADD x 1-0 data a
- r XADD x 2-0 data b
- r XADD x 3-0 data c
- r XADD x 4-0 data d
- r XGROUP CREATE x g1 0
- r XGROUP CREATECONSUMER x g1 Alice
-
- set res [r XREADGROUP GROUP g1 Alice COUNT 2 STREAMS x ">"]
- assert_equal $res {{x {{1-0 {data a}} {2-0 {data b}}}}}
-
- set dirty [s rdb_changes_since_last_save]
- set res [r XREADGROUP GROUP g1 Alice COUNT 2 STREAMS x 0]
- assert_equal $res {{x {{1-0 {data a}} {2-0 {data b}}}}}
- set dirty2 [s rdb_changes_since_last_save]
- assert {$dirty2 == $dirty}
-
- set dirty [s rdb_changes_since_last_save]
- set res [r XREADGROUP GROUP g1 Alice COUNT 2 STREAMS x 9000]
- assert_equal $res {{x {}}}
- set dirty2 [s rdb_changes_since_last_save]
- assert {$dirty2 == $dirty}
-
- # The current behavior is that we create the consumer (causes dirty++) even
- # if we onlyneed to read from PEL.
- # It feels like we shouldn't create the consumer in that case, but I added
- # this test just for coverage of current behavior
- set dirty [s rdb_changes_since_last_save]
- set res [r XREADGROUP GROUP g1 noconsumer COUNT 2 STREAMS x 0]
- assert_equal $res {{x {}}}
- set dirty2 [s rdb_changes_since_last_save]
- assert {$dirty2 == $dirty + 1}
- }
-
start_server {tags {"stream needs:debug"} overrides {appendonly yes aof-use-rdb-preamble no appendfsync always}} {
test {XREADGROUP with NOACK creates consumer} {
r del mystream
@@ -1391,19 +1329,6 @@ start_server {
assert_equal [dict get $group entries-read] 3
assert_equal [dict get $group lag] 0
}
-
- test {XREADGROUP from PEL inside MULTI} {
- # This scenario used to cause propagation of EXEC without MULTI in 6.2
- $replica config set propagation-error-behavior panic
- $master del mystream
- $master xadd mystream 1-0 a b c d e f
- $master xgroup create mystream mygroup 0
- $master xreadgroup group mygroup ryan count 1 streams mystream >
- $master multi
- $master xreadgroup group mygroup ryan count 1 streams mystream 0
- set reply [$master exec]
- assert_equal $reply {{{mystream {{1-0 {a b c d e f}}}}}}
- }
}
start_server {tags {"stream needs:debug"} overrides {appendonly yes aof-use-rdb-preamble no}} {
|
redis
|
redis
|
C
|
C
| 68,201
| 23,916
|
Redis is an in-memory database that persists on disk. The data model is key-value, but many different kind of values are supported: Strings, Lists, Sets, Sorted Sets, Hashes, Streams, HyperLogLogs, Bitmaps.
|
redis_redis
|
BUG_FIX
|
simplify decoder draining logic
|
5226cfe5624290111779dd57bc6139676eded76d
| null |
Alex Mikhalev
|
Reduced Desktop Entries heading size
| false
| 1
| 1
| 0
|
--- README.md
@@ -92,7 +92,7 @@ If all goes well, this should place a binary at `target/release/alacritty`.
many things (such as arrow keys) would not work. If you're on macOS, you'll need
to change the `monospace` font family to something like `Menlo`.
-## Desktop Entry
+### Desktop Entry
Many linux distributions support desktop entries for adding applications to
system menus. To install the desktop entry for Alacritty, run
|
alacritty_alacritty.json
| null | null | null | null | null | null |
alacritty_alacritty.json
|
CONFIG_CHANGE
|
5, obvious
|
859c9d49b1aa886097b0a4381feba377f9609848
|
2022-12-20 23:42:47
|
Vinicius Souza
|
update accessibility and alexa categories
| false
| 4
| 0
| 4
|
--- README.md
@@ -174,16 +174,12 @@
- [Capable](https://github.com/chrs1885/Capable) - Track accessibility features to improve your app for people with certain disabilities.
-**[back to top](#contents)**
-
## Alexa
*Frameworks that help to support writing custom alexa skills in swift*
- [AlexaSkillsKit](https://github.com/choefele/AlexaSkillsKit) - Swift library to develop custom Alexa Skills.
-**[back to top](#contents)**
-
## Analytics
*Analytics platforms, SDK's, error tracking and real-time answers about your app*
|
awesome-ios
|
vsouza
|
Swift
|
Swift
| 48,363
| 6,877
|
A curated list of awesome iOS ecosystem, including Objective-C and Swift Projects
|
vsouza_awesome-ios
|
DOC_CHANGE
|
Obvious
|
8ed8b4a89dba16589b169987779cb17303ca1346
|
2025-02-12 23:21:07
|
Fredia Huya-Kouadio
|
Bump the version of the openxr vendors plugin dependency
| false
| 4
| 3
| 7
|
--- platform/android/java/app/config.gradle
@@ -12,8 +12,7 @@ ext.versions = [
javaVersion : JavaVersion.VERSION_17,
// Also update 'platform/android/detect.py#get_ndk_version()' when this is updated.
ndkVersion : '23.2.8568313',
- splashscreenVersion: '1.0.1',
- openxrVendorsVersion: '3.1.2-stable'
+ splashscreenVersion: '1.0.1'
]
--- platform/android/java/editor/build.gradle
@@ -188,7 +188,7 @@ dependencies {
implementation "org.bouncycastle:bcprov-jdk15to18:1.78"
// Meta dependencies
- horizonosImplementation "org.godotengine:godot-openxr-vendors-meta:$versions.openxrVendorsVersion"
+ horizonosImplementation "org.godotengine:godot-openxr-vendors-meta:3.0.0-stable"
// Pico dependencies
- picoosImplementation "org.godotengine:godot-openxr-vendors-pico:$versions.openxrVendorsVersion"
+ picoosImplementation "org.godotengine:godot-openxr-vendors-pico:3.0.1-stable"
}
|
godot
|
godotengine
|
C++
|
C++
| 94,776
| 21,828
|
Godot Engine – Multi-platform 2D and 3D game engine
|
godotengine_godot
|
CONFIG_CHANGE
|
dependency updates
|
9c5c70dc0a2531b621ba825c2599189717694d56
| null |
Hakan Ensari
|
order aliases alphabetically
| false
| 1
| 1
| 0
|
--- bundler.plugin.zsh
@@ -1,8 +1,8 @@
alias be="bundle exec"
alias bi="bundle install"
alias bl="bundle list"
-alias bu="bundle update"
alias bp="bundle package"
+alias bu="bundle update"
# The following is based on https://github.com/gma/bundler-exec
|
ohmyzsh_ohmyzsh.json
| null | null | null | null | null | null |
ohmyzsh_ohmyzsh.json
|
CODE_IMPROVEMENT
|
5, just the change in order of aliases
|
0e3ea3fbab0297f38ed48b9e2f694cc43f8af567
|
2023-10-05 17:00:39
|
Kamil
|
Fermat_little_theorem type annotation (#9794) * Replacing the generator with numpy vector operations from lu_decomposition.
* Revert "Replacing the generator with numpy vector operations from lu_decomposition."
This reverts commit ad217c66165898d62b76cc89ba09c2d7049b6448.
* Added type annotation.
* Update fermat_little_theorem.py
Used other syntax.
* Update fermat_little_theorem.py
* Update maths/fermat_little_theorem.py
---------
Co-authored-by: Tianyi Zheng <[email protected]>
| false
| 1
| 1
| 2
|
--- maths/fermat_little_theorem.py
@@ -5,7 +5,7 @@
# Wikipedia reference: https://en.wikipedia.org/wiki/Fermat%27s_little_theorem
-def binary_exponentiation(a: int, n: float, mod: int) -> int:
+def binary_exponentiation(a, n, mod):
if n == 0:
return 1
|
python
|
thealgorithms
|
Python
|
Python
| 197,891
| 46,346
|
All Algorithms implemented in Python
|
thealgorithms_python
|
CODE_IMPROVEMENT
|
probably refactoring
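The diff in this row only shows the head of `binary_exponentiation`, the routine whose annotations the commit touches. As context, here is a minimal self-contained sketch of the usual square-and-multiply recursion used for Fermat's little theorem checks — a sketch of the standard algorithm, not necessarily the repository's exact body:

```python
def binary_exponentiation(a: int, n: int, mod: int) -> int:
    """Compute (a ** n) % mod with O(log n) multiplications."""
    if n == 0:
        return 1
    if n % 2 == 1:  # odd exponent: peel off one factor of a
        return (binary_exponentiation(a, n - 1, mod) * a) % mod
    half = binary_exponentiation(a, n // 2, mod)  # even exponent: square the half-power
    return (half * half) % mod


# Fermat's little theorem: for prime p and gcd(a, p) == 1, a**(p-1) % p == 1.
assert binary_exponentiation(3, 16, 17) == 1
```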
|
f485c3d6037b98bdc2c8c7ea1ee1be7669597866
|
2024-05-16 15:24:39
|
Ilkka Seppälä
|
docs: update pipeline
| false
| 80
| 47
| 127
|
--- pipeline/README.md
@@ -3,56 +3,44 @@ title: Pipeline
category: Behavioral
language: en
tag:
- - API design
- - Data processing
- - Decoupling
- - Extensibility
- - Functional decomposition
- - Scalability
+ - Decoupling
---
-## Also known as
-
-* Chain of Operations
-* Processing Pipeline
-
## Intent
-The Pipeline design pattern is intended to allow data processing in discrete stages, where each stage is represented by a different component and the output of one stage serves as the input for the next.
+Allows processing of data in a series of stages by giving in an initial input and passing the
+processed output to be used by the next stages.
## Explanation
-The Pipeline pattern uses ordered stages to process a sequence of input values. Each implemented task is represented by a stage of the pipeline. You can think of pipelines as similar to assembly lines in a factory, where each item in the assembly line is constructed in stages. The partially assembled item is passed from one assembly stage to another. The outputs of the assembly line occur in the same order as that of the inputs.
-
+The Pipeline pattern uses ordered stages to process a sequence of input values. Each implemented
+task is represented by a stage of the pipeline. You can think of pipelines as similar to assembly
+lines in a factory, where each item in the assembly line is constructed in stages. The partially
+assembled item is passed from one assembly stage to another. The outputs of the assembly line occur
+in the same order as that of the inputs.
+
Real world example
-> A real-world analogous example of the Pipeline design pattern is an **assembly line in a car manufacturing plant**.
->
-> In this analogy, the car manufacturing process is divided into several discrete stages, each stage handling a specific part of the car assembly. For example:
->
-> 1. **Chassis Assembly:** The base frame of the car is assembled.
-> 2. **Engine Installation:** The engine is installed onto the chassis.
-> 3. **Painting:** The car is painted.
-> 4. **Interior Assembly:** The interior, including seats and dashboard, is installed.
-> 5. **Quality Control:** The finished car is inspected for defects.
->
->Each stage operates independently and sequentially, where the output of one stage (e.g., a partially assembled car) becomes the input for the next stage. This modular approach allows for easy maintenance, scalability (e.g., adding more workers to a stage), and flexibility (e.g., replacing a stage with a more advanced version). Just like in a software pipeline, changes in one stage do not affect the others, facilitating continuous improvements and efficient production.
+> Suppose we wanted to pass through a string to a series of filtering stages and convert it as a
+> char array on the last stage.
In plain words
-> Pipeline pattern is an assembly line where partial results are passed from one stage to another.
+> Pipeline pattern is an assembly line where partial results are passed from one stage to another.
Wikipedia says
-> In software engineering, a pipeline consists of a chain of processing elements (processes, threads, coroutines, functions, etc.), arranged so that the output of each element is the input of the next; the name is by analogy to a physical pipeline.
+> In software engineering, a pipeline consists of a chain of processing elements (processes,
+> threads, coroutines, functions, etc.), arranged so that the output of each element is the input
+> of the next; the name is by analogy to a physical pipeline.
**Programmatic Example**
-Let's create a string processing pipeline example. The stages of our pipeline are called `Handler`s.
+The stages of our pipeline are called `Handler`s.
```java
interface Handler<I, O> {
- O process(I input);
+ O process(I input);
}
```
@@ -60,15 +48,15 @@ In our string processing example we have 3 different concrete `Handler`s.
```java
class RemoveAlphabetsHandler implements Handler<String, String> {
- // ...
+ ...
}
class RemoveDigitsHandler implements Handler<String, String> {
- // ...
+ ...
}
class ConvertToCharArrayHandler implements Handler<String, char[]> {
- // ...
+ ...
}
```
@@ -77,77 +65,56 @@ Here is the `Pipeline` that will gather and execute the handlers one by one.
```java
class Pipeline<I, O> {
- private final Handler<I, O> currentHandler;
+ private final Handler<I, O> currentHandler;
- Pipeline(Handler<I, O> currentHandler) {
- this.currentHandler = currentHandler;
- }
+ Pipeline(Handler<I, O> currentHandler) {
+ this.currentHandler = currentHandler;
+ }
- <K> Pipeline<I, K> addHandler(Handler<O, K> newHandler) {
- return new Pipeline<>(input -> newHandler.process(currentHandler.process(input)));
- }
+ <K> Pipeline<I, K> addHandler(Handler<O, K> newHandler) {
+ return new Pipeline<>(input -> newHandler.process(currentHandler.process(input)));
+ }
- O execute(I input) {
- return currentHandler.process(input);
- }
+ O execute(I input) {
+ return currentHandler.process(input);
+ }
}
```
And here's the `Pipeline` in action processing the string.
```java
-var filters = new Pipeline<>(new RemoveAlphabetsHandler()).addHandler(new RemoveDigitsHandler()).addHandler(new ConvertToCharArrayHandler());
-filters.execute("GoYankees123!");
+ var filters = new Pipeline<>(new RemoveAlphabetsHandler())
+ .addHandler(new RemoveDigitsHandler())
+ .addHandler(new ConvertToCharArrayHandler());
+ filters.execute("GoYankees123!");
```
## Class diagram
-
+
## Applicability
Use the Pipeline pattern when you want to
-* When you need to process data in a sequence of stages.
-* When each stage of processing is independent and can be easily replaced or reordered.
-* When you want to improve the scalability and maintainability of data processing code.
-
-## Tutorials
-
-* [The Pipeline Pattern — for fun and profit](https://medium.com/@aaronweatherall/the-pipeline-pattern-for-fun-and-profit-9b5f43a98130)
-* [The Pipeline design pattern (in Java)](https://medium.com/@deepakbapat/the-pipeline-design-pattern-in-java-831d9ce2fe21)
-
-## Known Uses
-
-* Data transformation and ETL (Extract, Transform, Load) processes.
-* Compilers for processing source code through various stages such as lexical analysis, syntax analysis, semantic analysis, and code generation.
-* Image processing applications where multiple filters are applied sequentially.
-* Logging frameworks where messages pass through multiple handlers for formatting, filtering, and output.
-
-## Consequences
-
-Benefits:
+* Execute individual stages that yields a final value.
+* Add readability to complex sequence of operations by providing a fluent builder as an interface.
+* Improve testability of code since stages will most likely be doing a single thing, complying to
+the [Single Responsibility Principle (SRP)](https://java-design-patterns.com/principles/#single-responsibility-principle)
-* Decoupling: Each stage of the pipeline is a separate component, making the system more modular and easier to maintain.
-* Reusability: Individual stages can be reused in different pipelines.
-* Extensibility: New stages can be added without modifying existing ones.
-* Scalability: Pipelines can be parallelized by running different stages on different processors or threads.
+## Known uses
-Trade-offs:
+* [java.util.Stream](https://docs.oracle.com/javase/8/docs/api/java/util/stream/package-summary.html)
+* [Maven Build Lifecycle](http://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html)
+* [Functional Java](https://github.com/functionaljava/functionaljava)
-* Complexity: Managing the flow of data through multiple stages can introduce complexity.
-* Performance Overhead: Each stage introduces some performance overhead due to context switching and data transfer between stages.
-* Debugging Difficulty: Debugging pipelines can be more challenging since the data flows through multiple components.
+## Related patterns
-## Related Patterns
-
-* [Chain of Responsibility](https://java-design-patterns.com/patterns/chain-of-responsibility/): Both patterns involve passing data through a series of handlers, but in Chain of Responsibility, handlers can decide not to pass the data further.
-* [Decorator](https://java-design-patterns.com/patterns/decorator/): Both patterns involve adding behavior dynamically, but Decorator wraps additional behavior around objects, whereas Pipeline processes data in discrete steps.
-* [Composite](https://java-design-patterns.com/patterns/composite/): Like Pipeline, Composite also involves hierarchical processing, but Composite is more about part-whole hierarchies.
+* [Chain of Responsibility](https://java-design-patterns.com/patterns/chain-of-responsibility/)
## Credits
-* [Design Patterns: Elements of Reusable Object-Oriented Software](https://amzn.to/3w0pvKI)
-* [Java Design Patterns: A Hands-On Experience with Real-World Examples](https://amzn.to/3yhh525)
-* [Patterns of Enterprise Application Architecture](https://amzn.to/3WfKBPR)
+* [The Pipeline Pattern — for fun and profit](https://medium.com/@aaronweatherall/the-pipeline-pattern-for-fun-and-profit-9b5f43a98130)
+* [The Pipeline design pattern (in Java)](https://medium.com/@deepakbapat/the-pipeline-design-pattern-in-java-831d9ce2fe21)
* [Pipelines | Microsoft Docs](https://docs.microsoft.com/en-us/previous-versions/msp-n-p/ff963548(v=pandp.10))
|
java-design-patterns
|
iluwatar
|
Java
|
Java
| 90,911
| 26,831
|
Design patterns implemented in Java
|
iluwatar_java-design-patterns
|
DOC_CHANGE
|
The prefix fix: suggests a bug fix, but the actual change is not fixing code behavior, it’s improving documentation rendering
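The Java snippets quoted in this row's diff describe the Pipeline pattern: each handler's output becomes the next handler's input. A rough sketch of the same composition idea in Python — the handler names below are illustrative and are not taken from the repository:

```python
from typing import Any, Callable


class Pipeline:
    """Chain single-purpose handlers; each stage's output feeds the next stage."""

    def __init__(self, handler: Callable[[Any], Any]):
        self._handler = handler

    def add_handler(self, new_handler: Callable[[Any], Any]) -> "Pipeline":
        # Compose: run the existing chain first, then the new handler.
        return Pipeline(lambda value: new_handler(self._handler(value)))

    def execute(self, value: Any) -> Any:
        return self._handler(value)


def remove_alphabets(text: str) -> str:
    return "".join(ch for ch in text if not ch.isalpha())


def remove_digits(text: str) -> str:
    return "".join(ch for ch in text if not ch.isdigit())


filters = Pipeline(remove_alphabets).add_handler(remove_digits).add_handler(list)
print(filters.execute("GoYankees123!"))  # ['!']
```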
|
06a91793091ee788163a6ecb051da1754c9885ae
|
2025-02-05 05:25:10
|
Jordan Harband
|
[Refactor] combine `sed -e` invocations/arguments
| false
| 12
| 10
| 22
|
--- nvm.sh
@@ -136,17 +136,15 @@ nvm_download() {
eval "curl -q --fail ${CURL_COMPRESSED_FLAG:-} ${CURL_HEADER_FLAG:-} ${NVM_DOWNLOAD_ARGS}"
elif nvm_has "wget"; then
# Emulate curl with wget
- ARGS=$(nvm_echo "$@" | command sed "
- s/--progress-bar /--progress=bar /
- s/--compressed //
- s/--fail //
- s/-L //
- s/-I /--server-response /
- s/-s /-q /
- s/-sS /-nv /
- s/-o /-O /
- s/-C - /-c /
- ")
+ ARGS=$(nvm_echo "$@" | command sed -e 's/--progress-bar /--progress=bar /' \
+ -e 's/--compressed //' \
+ -e 's/--fail //' \
+ -e 's/-L //' \
+ -e 's/-I /--server-response /' \
+ -e 's/-s /-q /' \
+ -e 's/-sS /-nv /' \
+ -e 's/-o /-O /' \
+ -e 's/-C - /-c /')
if [ -n "${NVM_AUTH_HEADER:-}" ]; then
ARGS="${ARGS} --header \"${NVM_AUTH_HEADER}\""
@@ -2734,7 +2732,7 @@ nvm_npm_global_modules() {
local NPMLIST
local VERSION
VERSION="$1"
- NPMLIST=$(nvm use "${VERSION}" >/dev/null && npm list -g --depth=0 2>/dev/null | command sed -e '1d' -e '/UNMET PEER DEPENDENCY/d')
+ NPMLIST=$(nvm use "${VERSION}" >/dev/null && npm list -g --depth=0 2>/dev/null | command sed 1,1d | nvm_grep -v 'UNMET PEER DEPENDENCY')
local INSTALLS
INSTALLS=$(nvm_echo "${NPMLIST}" | command sed -e '/ -> / d' -e '/\(empty\)/ d' -e 's/^.* \(.*@[^ ]*\).*/\1/' -e '/^npm@[^ ]*.*$/ d' | command xargs)
|
nvm
|
nvm-sh
|
Shell
|
Shell
| 82,623
| 8,249
|
Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions
|
nvm-sh_nvm
|
CODE_IMPROVEMENT
|
refactoring done
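The diff in this row folds a multi-line sed script into chained `sed -e` expressions that rewrite curl-style flags into wget equivalents. As a loose illustration of the same ordered-substitution idea in Python — the flag table is copied from the diff, but this is not code from nvm itself and ignores sed's regex semantics:

```python
# Ordered (old, new) pairs, applied left to right, first occurrence only.
CURL_TO_WGET = [
    ("--progress-bar ", "--progress=bar "),
    ("--compressed ", ""),
    ("--fail ", ""),
    ("-L ", ""),
    ("-I ", "--server-response "),
    ("-s ", "-q "),
    ("-sS ", "-nv "),
    ("-o ", "-O "),
    ("-C - ", "-c "),
]


def curl_args_to_wget(args: str) -> str:
    for old, new in CURL_TO_WGET:
        args = args.replace(old, new, 1)
    return args


print(curl_args_to_wget("-q --fail -L -s -o node.tar.gz "))
# prints: -q -q -O node.tar.gz
```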
|
2454d1fdd9f5cf94c219c0212d6c65e43412c901
|
2023-01-11 04:36:34
|
Kebin Liu
|
Set version number to commit
| false
| 13
| 6
| 19
|
--- .github/workflows/main.yml
@@ -18,7 +18,7 @@ jobs:
- name: Build
run: |
brew install automake
- make VERSION="${GITHUB_SHA::7}" debug
+ make debug
make debug-dmg
shasum -a 256 build/Debug/ShadowsocksX-NG.dmg > build/Debug/ShadowsocksX-NG.dmg.checksum
--- Makefile
@@ -1,14 +1,12 @@
-VERSION ?= 0.0.0
-
.PHONY: all
all: debug
.PHONY: debug
-debug: deps/dist set-version
+debug: deps/dist
xcodebuild -workspace ShadowsocksX-NG.xcworkspace -scheme ShadowsocksX-NG -configuration Debug SYMROOT=$${PWD}/build
.PHONY: release
-release: deps/dist set-version
+release: deps/dist
xcodebuild -workspace ShadowsocksX-NG.xcworkspace -scheme ShadowsocksX-NG -configuration Release SYMROOT=$${PWD}/build
.PHONY: debug-dmg release-dmg
@@ -22,10 +20,6 @@ debug-dmg release-dmg:
&& hdiutil create build/$${t}/ShadowsocksX-NG.dmg -ov -volname "ShadowsocksX-NG" -fs HFS+ -srcfolder build/$${t}/ShadowsocksX-NG/ \
&& rm -rf build/$${t}/ShadowsocksX-NG/
-.PHONY: set-version
-set-version:
- agvtool new-marketing-version $(VERSION)
-
deps/dist:
$(MAKE) -C deps
--- ShadowsocksX-NG.xcodeproj/project.pbxproj
@@ -581,6 +581,7 @@
9B0BFFDD1D0460A70040E62B /* Project object */ = {
isa = PBXProject;
attributes = {
+ BuildIndependentTargetsInParallel = NO;
LastSwiftUpdateCheck = 1250;
LastUpgradeCheck = 1250;
ORGANIZATIONNAME = qiuyuzhou;
@@ -1024,7 +1025,6 @@
"CODE_SIGN_IDENTITY[sdk=macosx*]" = "-";
CODE_SIGN_STYLE = Automatic;
COMBINE_HIDPI_IMAGES = YES;
- CURRENT_PROJECT_VERSION = 2;
DEVELOPMENT_TEAM = "";
HEADER_SEARCH_PATHS = (
"$(inherited)",
@@ -1041,6 +1041,7 @@
"$(PROJECT_DIR)/ShadowsocksX-NG/simple-obfs",
);
MACOSX_DEPLOYMENT_TARGET = 10.12;
+ MARKETING_VERSION = 1.9.4;
PRODUCT_BUNDLE_IDENTIFIER = "com.qiuyuzhou.ShadowsocksX-NG";
PRODUCT_NAME = "$(TARGET_NAME)";
PROVISIONING_PROFILE_SPECIFIER = "";
@@ -1048,7 +1049,6 @@
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
SWIFT_SWIFT3_OBJC_INFERENCE = Default;
SWIFT_VERSION = 5.0;
- VERSIONING_SYSTEM = "apple-generic";
};
name = Debug;
};
@@ -1062,7 +1062,6 @@
"CODE_SIGN_IDENTITY[sdk=macosx*]" = "-";
CODE_SIGN_STYLE = Automatic;
COMBINE_HIDPI_IMAGES = YES;
- CURRENT_PROJECT_VERSION = 2;
DEVELOPMENT_TEAM = "";
HEADER_SEARCH_PATHS = (
"$(inherited)",
@@ -1079,13 +1078,13 @@
"$(PROJECT_DIR)/ShadowsocksX-NG/simple-obfs",
);
MACOSX_DEPLOYMENT_TARGET = 10.12;
+ MARKETING_VERSION = 1.9.4;
PRODUCT_BUNDLE_IDENTIFIER = "com.qiuyuzhou.ShadowsocksX-NG";
PRODUCT_NAME = "$(TARGET_NAME)";
PROVISIONING_PROFILE_SPECIFIER = "";
SWIFT_OBJC_BRIDGING_HEADER = "ShadowsocksX-NG/ShadowsocksX-NG-Bridging-Header.h";
SWIFT_SWIFT3_OBJC_INFERENCE = Default;
SWIFT_VERSION = 5.0;
- VERSIONING_SYSTEM = "apple-generic";
};
name = Release;
};
|
shadowsocksx-ng
|
shadowsocks
|
Swift
|
Swift
| 32,651
| 7,935
|
Next Generation of ShadowsocksX
|
shadowsocks_shadowsocksx-ng
|
BUG_FIX
|
Code change: bug removal
|
f63b7113640ef9cb8340dcd8fa8074199bb25575
|
2023-11-30 04:46:27
|
Tien Do Nam
|
fix: select correct color if dynamic color not supported
| false
| 20
| 4
| 24
|
--- app/lib/init.dart
@@ -47,9 +47,7 @@ Future<RefenaContainer> preInit(List<String> args) async {
initLogger(args.contains('-v') || args.contains('--verbose') ? Level.ALL : Level.INFO);
MapperContainer.globals.use(const FileDtoMapper());
- final dynamicColors = await getDynamicColors();
-
- final persistenceService = await PersistenceService.initialize(dynamicColors);
+ final persistenceService = await PersistenceService.initialize();
initI18n();
@@ -104,7 +102,7 @@ Future<RefenaContainer> preInit(List<String> args) async {
deviceRawInfoProvider.overrideWithValue(await getDeviceInfo()),
appArgumentsProvider.overrideWithValue(args),
tvProvider.overrideWithValue(await checkIfTv()),
- dynamicColorsProvider.overrideWithValue(dynamicColors),
+ dynamicColorsProvider.overrideWithValue(await getDynamicColors()),
sleepProvider.overrideWithInitialState((ref) => startHidden),
],
);
--- app/lib/provider/persistence_provider.dart
@@ -16,7 +16,6 @@ import 'package:localsend_app/util/alias_generator.dart';
import 'package:localsend_app/util/native/platform_check.dart';
import 'package:localsend_app/util/security_helper.dart';
import 'package:localsend_app/util/shared_preferences_portable.dart';
-import 'package:localsend_app/util/ui/dynamic_colors.dart';
import 'package:logging/logging.dart';
import 'package:path/path.dart' as p;
import 'package:path_provider/path_provider.dart' as path;
@@ -77,7 +76,7 @@ class PersistenceService {
PersistenceService._(this._prefs);
- static Future<PersistenceService> initialize(DynamicColors? dynamicColors) async {
+ static Future<PersistenceService> initialize() async {
SharedPreferences prefs;
if (checkPlatform([TargetPlatform.windows]) && SharedPreferencesPortable.exists()) {
@@ -123,26 +122,13 @@ class PersistenceService {
await prefs.setString(_securityContext, jsonEncode(generateSecurityContext()));
}
- final supportsDynamicColors = dynamicColors != null;
if (prefs.getString(_colorKey) == null) {
- await _initColorSetting(prefs, supportsDynamicColors);
- } else {
- // fix when device does not support dynamic colors
- final supported = supportsDynamicColors ? ColorMode.values : ColorMode.values.where((e) => e != ColorMode.system);
- final colorMode = supported.firstWhereOrNull((color) => color.name == prefs.getString(_colorKey));
- if (colorMode == null) {
- await _initColorSetting(prefs, supportsDynamicColors);
- }
+ await prefs.setString(_colorKey, checkPlatform([TargetPlatform.android]) ? ColorMode.system.name : ColorMode.localsend.name);
}
return PersistenceService._(prefs);
}
- static Future<void> _initColorSetting(SharedPreferences prefs, bool supportsDynamicColors) async {
- await prefs.setString(
- _colorKey, checkPlatform([TargetPlatform.android]) && supportsDynamicColors ? ColorMode.system.name : ColorMode.localsend.name);
- }
-
StoredSecurityContext getSecurityContext() {
final contextRaw = _prefs.getString(_securityContext)!;
return StoredSecurityContext.fromJson(jsonDecode(contextRaw));
|
localsend
|
localsend
|
Dart
|
Dart
| 58,423
| 3,136
|
An open-source cross-platform alternative to AirDrop
|
localsend_localsend
|
BUG_FIX
|
obvious
|
5bfdc8f3ed1b4f3bc9c08b2a451150e6481197ec
|
2025-01-02 16:02:54
|
Ilya Mashchenko
|
docs(go.d/nats): add missing labels to meta (#19309)
| false
| 30
| 0
| 30
|
--- src/go/plugin/go.d/collector/nats/metadata.yaml
@@ -186,12 +186,8 @@ modules:
- name: server
description: These metrics refer to NATS servers.
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
metrics:
- name: nats.server_traffic
description: Server Traffic
@@ -301,12 +297,8 @@ modules:
- name: http endpoint
description: These metrics refer to HTTP endpoints.
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: http_endpoint
description: "HTTP endpoint path."
metrics:
@@ -319,12 +311,8 @@ modules:
- name: account
description: These metrics refer to [Accounts](https://docs.nats.io/running-a-nats-service/nats_admin/monitoring#account-statistics).
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: account
description: "Account name."
metrics:
@@ -375,12 +363,8 @@ modules:
- name: route
description: These metrics refer to [Routes](https://docs.nats.io/running-a-nats-service/nats_admin/monitoring#route-information).
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: route_id
description: "A unique identifier for a route within the NATS cluster."
- name: remote_id
@@ -409,12 +393,8 @@ modules:
- name: inbound gateway connection
description: These metrics refer to [Inbound Gateway Connections](https://docs.nats.io/running-a-nats-service/nats_admin/monitoring#gateway-information).
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: gateway
description: "The name of the local gateway."
- name: remote_gateway
@@ -451,12 +431,8 @@ modules:
- name: outbound gateway connection
description: These metrics refer to [Outbound Gateway Connections](https://docs.nats.io/running-a-nats-service/nats_admin/monitoring#gateway-information).
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- name: server_id
description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: gateway
description: "The name of the local gateway."
- name: remote_gateway
@@ -493,12 +469,6 @@ modules:
- name: leaf node connection
description: These metrics refer to [Leaf Node Connections](https://docs.nats.io/running-a-nats-service/nats_admin/monitoring#leaf-node-information).
labels:
- - name: cluster_name
- description: "The name of the NATS cluster this server belongs to."
- - name: server_id
- description: "A unique identifier for a server within the NATS cluster."
- - name: server_name
- description: "The configured name of the NATS server."
- name: remote_name
description: "Unique identifier of the remote leaf node server, either its configured name or automatically assigned ID."
- name: account
|
netdata
|
netdata
|
C
|
C
| 73,681
| 6,023
|
X-Ray Vision for your infrastructure!
|
netdata_netdata
|
CONFIG_CHANGE
|
Obvious
|
11f986fdc7d6b4c80e396437e9c45c939362bdee
|
2024-12-07 09:33:23
|
ptmkenny
|
[eslint config] [deps] update `eslint-plugin-react-hooks`
| false
| 2
| 2
| 4
|
--- packages/eslint-config-airbnb/package.json
@@ -78,7 +78,7 @@
"eslint-plugin-import": "^2.30.0",
"eslint-plugin-jsx-a11y": "^6.10.0",
"eslint-plugin-react": "^7.36.1",
- "eslint-plugin-react-hooks": "^5.1.0",
+ "eslint-plugin-react-hooks": "^4.6.2",
"in-publish": "^2.0.1",
"react": ">= 0.13.0",
"safe-publish-latest": "^2.0.0",
@@ -89,7 +89,7 @@
"eslint-plugin-import": "^2.30.0",
"eslint-plugin-jsx-a11y": "^6.10.0",
"eslint-plugin-react": "^7.36.1",
- "eslint-plugin-react-hooks": "^5.1.0"
+ "eslint-plugin-react-hooks": "^4.6.2"
},
"engines": {
"node": "^10.12.0 || ^12.22.0 || ^14.17.0 || >=16.0.0"
|
javascript
|
airbnb
|
JavaScript
|
JavaScript
| 146,197
| 26,671
|
JavaScript Style Guide
|
airbnb_javascript
|
CONFIG_CHANGE
|
Obvious
|
15eeee26d51da2b78968b3b54ae78ee8449ddaab
|
2023-04-07 06:40:08
|
macro
|
Update OmsOrderReturnApplyDao.xml
| false
| 0
| 3
| 3
|
--- mall-admin/src/main/resources/dao/OmsOrderReturnApplyDao.xml
@@ -24,6 +24,9 @@
<if test="queryParam.status!=null">
AND status = #{queryParam.status}
</if>
+ <if test="queryParam.status!=null">
+ AND status = #{queryParam.status}
+ </if>
<if test="queryParam.handleMan!=null and queryParam.handleMan!=''">
AND handle_man = #{queryParam.handleMan}
</if>
|
mall
|
macrozheng
|
Java
|
Java
| 79,319
| 29,052
|
The mall project is an e-commerce system consisting of a storefront (front-end mall) and a back-office admin system, built on Spring Boot + MyBatis and deployed with Docker containers. The storefront includes modules such as the home portal, product recommendations, product search, product display, shopping cart, order flow, member center, customer service, and help center. The admin system includes modules such as product management, order management, member management, promotion management, operations management, content management, statistical reports, finance management, permission management, and settings.
|
macrozheng_mall
|
CONFIG_CHANGE
|
xml file updated
|
a149db655c3bf119167b527682f64dd51e54dd45
|
2024-04-19 21:30:31
|
Constantin Graf
|
Added invitation delete endpoint
| false
| 120
| 0
| 120
|
--- app/Http/Controllers/Api/V1/InvitationController.php
@@ -9,21 +9,12 @@
use App\Http\Resources\V1\Invitation\InvitationCollection;
use App\Http\Resources\V1\Invitation\InvitationResource;
use App\Models\Organization;
-use App\Models\OrganizationInvitation;
use Illuminate\Auth\Access\AuthorizationException;
use Illuminate\Http\JsonResponse;
use Laravel\Jetstream\Contracts\InvitesTeamMembers;
class InvitationController extends Controller
{
- protected function checkPermission(Organization $organization, string $permission, ?OrganizationInvitation $organizationInvitation = null): void
- {
- parent::checkPermission($organization, $permission);
- if ($organizationInvitation !== null && $organizationInvitation->organization_id !== $organization->id) {
- throw new AuthorizationException('Invitation does not belong to organization');
- }
- }
-
/**
* List all invitations of an organization
*
@@ -63,20 +54,4 @@ public function store(Organization $organization, InvitationStoreRequest $reques
return response()->json(null, 204);
}
-
- /**
- * Remove a pending invitation
- *
- * @throws AuthorizationException
- *
- * @operationId removeInvitation
- */
- public function destroy(Organization $organization, OrganizationInvitation $invitation): JsonResponse
- {
- $this->checkPermission($organization, 'invitations:remove', $invitation);
-
- $invitation->delete();
-
- return response()->json(null, 204);
- }
}
--- app/Models/OrganizationInvitation.php
@@ -4,9 +4,7 @@
namespace App\Models;
-use Database\Factories\OrganizationInvitationFactory;
use Illuminate\Database\Eloquent\Concerns\HasUuids;
-use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Relations\BelongsTo;
use Laravel\Jetstream\Jetstream;
use Laravel\Jetstream\TeamInvitation as JetstreamTeamInvitation;
@@ -17,12 +15,9 @@
* @property string $role
* @property string $organization_id
* @property-read Organization $organization
- *
- * @method static OrganizationInvitationFactory factory()
*/
class OrganizationInvitation extends JetstreamTeamInvitation
{
- use HasFactory;
use HasUuids;
/**
--- app/Providers/AppServiceProvider.php
@@ -86,6 +86,5 @@ public function boot(): void
});
Route::model('member', Membership::class);
- Route::model('invitation', OrganizationInvitation::class);
}
}
--- app/Providers/JetstreamServiceProvider.php
@@ -93,9 +93,6 @@ protected function configurePermissions(): void
'organizations:view',
'organizations:update',
'import',
- 'invitations:view',
- 'invitations:create',
- 'invitations:remove',
'members:view',
'members:invite-placeholder',
'members:change-role',
--- database/factories/OrganizationInvitationFactory.php
@@ -1,37 +0,0 @@
-<?php
-
-declare(strict_types=1);
-
-namespace Database\Factories;
-
-use App\Enums\Role;
-use App\Models\Organization;
-use App\Models\OrganizationInvitation;
-use Illuminate\Database\Eloquent\Factories\Factory;
-
-/**
- * @extends Factory<OrganizationInvitation>
- */
-class OrganizationInvitationFactory extends Factory
-{
- /**
- * Define the model's default state.
- *
- * @return array<string, mixed>
- */
- public function definition(): array
- {
- return [
- 'email' => $this->faker->unique()->safeEmail(),
- 'role' => Role::Employee->value,
- 'organization_id' => Organization::factory(),
- ];
- }
-
- public function forOrganization(Organization $organization): self
- {
- return $this->state(fn (array $attributes) => [
- 'organization_id' => $organization->getKey(),
- ]);
- }
-}
--- routes/api.php
@@ -49,7 +49,6 @@
Route::name('invitations.')->group(static function () {
Route::get('/organizations/{organization}/invitations', [InvitationController::class, 'index'])->name('index');
Route::post('/organizations/{organization}/invitations', [InvitationController::class, 'store'])->name('store');
- Route::delete('/organizations/{organization}/invitations/{invitation}', [InvitationController::class, 'destroy'])->name('destroy');
});
// Project routes
--- tests/Unit/Endpoint/Api/V1/InvitationEndpointTest.php
@@ -76,52 +76,4 @@ public function test_store_invites_user_to_organization(): void
$this->assertEquals('[email protected]', $invitation->email);
$this->assertEquals('employee', $invitation->role);
}
-
- public function test_delete_fails_if_user_has_no_permission_to_remove_invitations(): void
- {
- // Arrange
- $data = $this->createUserWithPermission([
- ]);
- Passport::actingAs($data->user);
- $invitation = OrganizationInvitation::factory()->forOrganization($data->organization)->create();
-
- // Act
- $response = $this->deleteJson(route('api.v1.invitations.destroy', [$data->organization->getKey(), $invitation->getKey()]));
-
- // Assert
- $response->assertStatus(403);
- }
-
- public function test_delete_fails_if_invitation_belongs_to_different_organization(): void
- {
- // Arrange
- $data = $this->createUserWithPermission([
- 'invitations:remove',
- ]);
- Passport::actingAs($data->user);
- $invitation = OrganizationInvitation::factory()->create();
-
- // Act
- $response = $this->deleteJson(route('api.v1.invitations.destroy', [$data->organization->getKey(), $invitation->getKey()]));
-
- // Assert
- $response->assertStatus(403);
- }
-
- public function test_delete_removes_invitation(): void
- {
- // Arrange
- $data = $this->createUserWithPermission([
- 'invitations:remove',
- ]);
- Passport::actingAs($data->user);
- $invitation = OrganizationInvitation::factory()->forOrganization($data->organization)->create();
-
- // Act
- $response = $this->deleteJson(route('api.v1.invitations.destroy', [$data->organization->getKey(), $invitation->getKey()]));
-
- // Assert
- $response->assertStatus(204);
- $this->assertNull(OrganizationInvitation::find($invitation->getKey()));
- }
}
|
solidtime
|
solidtime-io
|
PHP
|
PHP
| 5,267
| 278
|
Modern open-source time-tracking app
|
solidtime-io_solidtime
|
NEW_FEAT
|
Code change: new js function
|
b709857ecbf511bb25603790ff9c3f12abe36559
|
2025-03-25 20:01:24
|
Eric Dumazet
|
ipv6: fix _DEVADD() and _DEVUPD() macros ip6_rcv_core() is using: __IP6_ADD_STATS(net, idev, IPSTATS_MIB_NOECTPKTS + (ipv6_get_dsfield(hdr) & INET_ECN_MASK), max_t(unsigned short, 1, skb_shinfo(skb)->gso_segs)); This is currently evaluating both expressions twice. Fix _DEVADD() and _DEVUPD() macros to evaluate their arguments once. Signed-off-by: Eric Dumazet <[email protected]> Reviewed-by: Simon Horman <[email protected]> Link: https://patch.msgid.link/[email protected] Signed-off-by: Jakub Kicinski <[email protected]>
| false
| 7
| 4
| 11
|
--- include/net/ipv6.h
@@ -246,20 +246,17 @@ extern int sysctl_mld_qrv;
#define _DEVADD(net, statname, mod, idev, field, val) \
({ \
struct inet6_dev *_idev = (idev); \
- unsigned long _field = (field); \
- unsigned long _val = (val); \
if (likely(_idev != NULL)) \
- mod##SNMP_ADD_STATS((_idev)->stats.statname, _field, _val); \
- mod##SNMP_ADD_STATS((net)->mib.statname##_statistics, _field, _val);\
+ mod##SNMP_ADD_STATS((_idev)->stats.statname, (field), (val)); \
+ mod##SNMP_ADD_STATS((net)->mib.statname##_statistics, (field), (val));\
})
#define _DEVUPD(net, statname, mod, idev, field, val) \
({ \
struct inet6_dev *_idev = (idev); \
- unsigned long _val = (val); \
if (likely(_idev != NULL)) \
- mod##SNMP_UPD_PO_STATS((_idev)->stats.statname, field, _val); \
- mod##SNMP_UPD_PO_STATS((net)->mib.statname##_statistics, field, _val);\
+ mod##SNMP_UPD_PO_STATS((_idev)->stats.statname, field, (val)); \
+ mod##SNMP_UPD_PO_STATS((net)->mib.statname##_statistics, field, (val));\
})
/* MIBs */
|
linux
|
torvalds
|
C
|
C
| 189,022
| 55,340
|
Linux kernel source tree
|
torvalds_linux
|
CONFIG_CHANGE
|
Obvious
|
6dc6646d529a7dc6a1d8e1a2c1507dff7e528c8d
|
2023-01-12 18:26:45
|
Romain Janvier
|
docs(git): add missing entries and run formatter
| false
| 14
| 12
| 26
|
--- plugins/git/README.md
@@ -60,8 +60,6 @@ plugins=(... git)
| gcpa | git cherry-pick --abort |
| gcpc | git cherry-pick --continue |
| gcs | git commit -S |
-| gcss | git commit -S -s |
-| gcssm | git commit -S -s -m |
| gd | git diff |
| gdca | git diff --cached |
| gdcw | git diff --cached --word-diff |
@@ -159,8 +157,8 @@ plugins=(... git)
| gsr | git svn rebase |
| gss | git status --short |
| gst | git status |
-| gsta | git stash push (git version >= 2.13) |
-| gsta | git stash save (git version < 2.13) |
+| gsta | git stash push |
+| gsta | git stash save |
| gstaa | git stash apply |
| gstc | git stash clear |
| gstd | git stash drop |
@@ -178,7 +176,7 @@ plugins=(... git)
| gtv | git tag \| sort -V |
| gtl | gtl(){ git tag --sort=-v:refname -n --list ${1}\* }; noglob gtl |
| gunignore | git update-index --no-assume-unchanged |
-| gunwip | git log --max-count=1 \| grep -q -c "\-\-wip\-\-" && git reset HEAD~1 |
+| gunwip | git log --max-count=1 \| grep -q -c "\-\-wip\-\-" && git reset HEAD~1 |
| gup | git pull --rebase |
| gupv | git pull --rebase --verbose |
| gupa | git pull --rebase --autostash |
@@ -194,10 +192,10 @@ plugins=(... git)
| gams | git am --skip |
| gama | git am --abort |
| gamscp | git am --show-current-patch |
-| gwt | git worktree |
-| gwtls | git worktree list |
-| gwtmv | git worktree move |
-| gwtrm | git worktree remove |
+| gwt | git worktree |
+| gwtls | git worktree list |
+| gwtmv | git worktree move |
+| gwtrm | git worktree remove |
### Main branch preference
@@ -229,7 +227,7 @@ These are aliases that have been removed, renamed, or otherwise modified in a wa
### Current
| Command | Description |
-| :--------------------- | :------------------------------------------------------------------------------------------------------- |
+|:-----------------------|:---------------------------------------------------------------------------------------------------------|
| `grename <old> <new>` | Rename `old` branch to `new`, including in origin remote |
| current_branch | Return the name of the current branch |
| git_current_user_name | Returns the `user.name` config value |
@@ -242,13 +240,13 @@ These are aliases that have been removed, renamed, or otherwise modified in a wa
These features allow to pause a branch development and switch to another one (_"Work in Progress"_, or wip). When you want to go back to work, just unwip it.
| Command | Description |
-| :--------------- | :---------------------------------------------- |
+|:-----------------|:------------------------------------------------|
| work_in_progress | Echoes a warning if the current branch is a wip |
| gwip | Commit wip branch |
| gunwip | Uncommit wip branch |
### Deprecated functions
-| Command | Description | Reason |
-| :----------------- | :-------------------------------------- | :-------------------------------------------------------------- |
-| current_repository | Return the names of the current remotes | Didn't work properly. Use `git remote -v` instead (`grv` alias) |
+| Command | Description | Reason |
+|:-----------------------|:----------------------------------------|:----------------------------------------------------------------|
+| current_repository | Return the names of the current remotes | Didn't work properly. Use `git remote -v` instead (`grv` alias) |
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
DOC_CHANGE
|
changes in readme
|
6b57241fa853455c7683fa45bdd1191a73ea13c6
|
2023-07-02 23:08:30
|
Binbin
|
Revert zrangeGenericCommand negative offset check (#12377) The negative offset check was added in #9052, we realized
that this is a non-mandatory breaking change and we would
like to add it only in 8.0.
This reverts PR #9052, will be re-introduced later in 8.0.
| false
| 5
| 28
| 33
|
--- src/t_zset.c
@@ -3630,7 +3630,7 @@ void zrangeGenericCommand(zrange_result_handler *handler, int argc_start, int st
long opt_end = 0;
int opt_withscores = 0;
long opt_offset = 0;
- long opt_limit = -1;
+ long opt_limit = -1; /* A negative limit returns all elements from the offset. */
/* Step 1: Skip the <src> <min> <max> args and parse remaining optional arguments. */
for (int j=argc_start + 3; j < c->argc; j++) {
@@ -3638,11 +3638,12 @@ void zrangeGenericCommand(zrange_result_handler *handler, int argc_start, int st
if (!store && !strcasecmp(c->argv[j]->ptr,"withscores")) {
opt_withscores = 1;
} else if (!strcasecmp(c->argv[j]->ptr,"limit") && leftargs >= 2) {
- if ((getLongFromObjectOrReply(c, c->argv[j+1], &opt_offset, NULL) != C_OK) ||
- (getLongFromObjectOrReply(c, c->argv[j+2], &opt_limit, NULL) != C_OK))
- {
+ if (getRangeLongFromObjectOrReply(c, c->argv[j+1], 0, LONG_MAX,
+ &opt_offset, "offset should be greater than or equal to 0") != C_OK)
+ return;
+
+ if (getLongFromObjectOrReply(c, c->argv[j+2], &opt_limit, NULL) != C_OK)
return;
- }
j += 2;
} else if (direction == ZRANGE_DIRECTION_AUTO &&
!strcasecmp(c->argv[j]->ptr,"rev"))
--- tests/unit/type/zset.tcl
@@ -370,6 +370,28 @@ start_server {tags {"zset"}} {
r zrem ztmp a b c d e f g
} {3}
+ test "ZRANGE* with wrong offset or limit should throw error" {
+ r del src{t} dst{t}
+
+ foreach offset {-1 -100 str NaN} {
+ assert_error "ERR*offset*" {r ZRANGE src{t} 0 -1 byscore limit $offset 1}
+ assert_error "ERR*offset*" {r ZRANGESTORE dst{t} src{t} 0 -1 byscore limit $offset 4}
+ assert_error "ERR*offset*" {r ZRANGEBYLEX src{t} (az (b limit $offset 5}
+ assert_error "ERR*offset*" {r ZRANGEBYSCORE src{t} 0 -1 limit $offset 2}
+ assert_error "ERR*offset*" {r ZREVRANGEBYLEX src{t} (az (b limit $offset 6}
+ assert_error "ERR*offset*" {r ZREVRANGEBYSCORE src{t} -1 0 limit $offset 3}
+ }
+
+ foreach limit {str NaN} {
+ assert_error "ERR value*" {r ZRANGE src{t} 0 -1 byscore limit 0 $limit}
+ assert_error "ERR value*" {r ZRANGESTORE dst{t} src{t} 0 -1 byscore limit 0 $limit}
+ assert_error "ERR value*" {r ZRANGEBYLEX src{t} (az (b limit 0 $limit}
+ assert_error "ERR value*" {r ZRANGEBYSCORE src{t} 0 -1 limit 0 $limit}
+ assert_error "ERR value*" {r ZREVRANGEBYLEX src{t} (az (b limit 0 $limit}
+ assert_error "ERR value*" {r ZREVRANGEBYSCORE src{t} -1 0 limit 0 $limit}
+ }
+ }
+
test "ZRANGE basics - $encoding" {
r del ztmp
r zadd ztmp 1 a
|
redis
|
redis
|
C
|
C
| 68,201
| 23,916
|
Redis is an in-memory database that persists on disk. The data model is key-value, but many different kind of values are supported: Strings, Lists, Sets, Sorted Sets, Hashes, Streams, HyperLogLogs, Bitmaps.
|
redis_redis
|
CODE_IMPROVEMENT
|
only offset change done
|
5ca71895630719cc41f8171aba8be461fb8cc9d2
|
2023-04-02 10:18:19
|
Christian Clauss
|
Rename quantum_random.py.DISABLED.txt to quantum_random.py (#8601) Co-authored-by: github-actions <${GITHUB_ACTOR}@users.noreply.github.com>
| false
| 31
| 30
| 61
|
--- DIRECTORY.md
@@ -1033,7 +1033,6 @@
* [Q Fourier Transform](quantum/q_fourier_transform.py)
* [Q Full Adder](quantum/q_full_adder.py)
* [Quantum Entanglement](quantum/quantum_entanglement.py)
- * [Quantum Random](quantum/quantum_random.py)
* [Quantum Teleportation](quantum/quantum_teleportation.py)
* [Ripple Adder Classic](quantum/ripple_adder_classic.py)
* [Single Qubit Measure](quantum/single_qubit_measure.py)
|
python
|
thealgorithms
|
Python
|
Python
| 197,891
| 46,346
|
All Algorithms implemented in Python
|
thealgorithms_python
|
DOC_CHANGE
|
changes in md file
|
1e2583eff25200b10f01d1fc2d13e7936fba57d2
|
2025-02-25 00:31:55
|
Jonah Williams
|
[ui] Fix ImageFilter.shader equality to consider uniform values. (#163348) Fixes https://github.com/flutter/flutter/issues/163302 Framework widgets check for ImageFIlter.== to determine whether to mark themselves dirty. The filter obejct needs to delegate its equality to the underlying native filter so that uniform values are considered.
| false
| 40
| 1
| 41
|
--- engine/src/flutter/lib/ui/dart_ui.cc
@@ -207,7 +207,6 @@ typedef CanvasPath Path;
V(ImageFilter, initComposeFilter) \
V(ImageFilter, initShader) \
V(ImageFilter, initMatrix) \
- V(ImageFilter, equals) \
V(ImageShader, dispose) \
V(ImageShader, initWithImage) \
V(ImmutableBuffer, dispose) \
--- engine/src/flutter/lib/ui/painting.dart
@@ -4474,14 +4474,9 @@ class _FragmentShaderImageFilter implements ImageFilter {
if (other.runtimeType != runtimeType) {
return false;
}
- return other is _FragmentShaderImageFilter &&
- other.shader == shader &&
- _equals(nativeFilter, other.nativeFilter);
+ return other is _FragmentShaderImageFilter && other.shader == shader;
}
- @Native<Bool Function(Handle, Handle)>(symbol: 'ImageFilter::equal')
- external static bool _equals(_ImageFilter a, _ImageFilter b);
-
@override
int get hashCode => shader.hashCode;
}
--- engine/src/flutter/lib/ui/painting/image_filter.cc
@@ -125,8 +125,4 @@ void ImageFilter::initShader(ReusableFragmentShader* shader) {
filter_ = shader->as_image_filter();
}
-bool ImageFilter::equals(ImageFilter* a, ImageFilter* b) {
- return a->filter_ == b->filter_;
-}
-
} // namespace flutter
--- engine/src/flutter/lib/ui/painting/image_filter.h
@@ -36,7 +36,6 @@ class ImageFilter : public RefCountedDartWrappable<ImageFilter> {
void initColorFilter(ColorFilter* colorFilter);
void initComposeFilter(ImageFilter* outer, ImageFilter* inner);
void initShader(ReusableFragmentShader* shader);
- bool equals(ImageFilter* a, ImageFilter* b);
const std::shared_ptr<DlImageFilter> filter(DlTileMode mode) const;
--- engine/src/flutter/testing/dart/fragment_shader_test.dart
@@ -415,34 +415,6 @@ void main() async {
expect(color, const Color(0xFF00FF00));
});
- // For an explaination of the problem see https://github.com/flutter/flutter/issues/163302 .
- test('ImageFilter.shader equality checks consider uniform values', () async {
- if (!impellerEnabled) {
- print('Skipped for Skia');
- return;
- }
- final FragmentProgram program = await FragmentProgram.fromAsset('filter_shader.frag.iplr');
- final FragmentShader shader = program.fragmentShader();
- final ImageFilter filter = ImageFilter.shader(shader);
-
- // The same shader is equal to itself.
- expect(filter, filter);
- expect(identical(filter, filter), true);
-
- final ImageFilter filter_2 = ImageFilter.shader(shader);
-
- // The different shader is equal as long as uniforms are identical.
- expect(filter, filter_2);
- expect(identical(filter, filter_2), false);
-
- // Not equal if uniforms change.
- shader.setFloat(0, 1);
- final ImageFilter filter_3 = ImageFilter.shader(shader);
-
- expect(filter, isNot(filter_3));
- expect(identical(filter, filter_3), false);
- });
-
if (impellerEnabled) {
print('Skipped for Impeller - https://github.com/flutter/flutter/issues/122823');
return;
|
flutter
|
flutter
|
Dart
|
Dart
| 168,965
| 28,132
|
Flutter makes it easy and fast to build beautiful apps for mobile and beyond
|
flutter_flutter
|
BUG_FIX
|
obvious
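The commit message in this row describes value equality that delegates to wrapped state: the filter must compare equal only when the shader state it wraps (including uniform values) is equal, otherwise the framework's dirty check misfires. A generic Python sketch of that delegation pattern — the class and field names are invented for illustration and are not Flutter APIs:

```python
class ShaderFilter:
    """Wrapper whose equality delegates to the wrapped shader state."""

    def __init__(self, shader_id: str, uniforms: tuple):
        self.shader_id = shader_id
        self.uniforms = tuple(uniforms)  # uniform values are part of the compared state

    def __eq__(self, other) -> bool:
        if not isinstance(other, ShaderFilter):
            return NotImplemented
        return self.shader_id == other.shader_id and self.uniforms == other.uniforms

    def __hash__(self) -> int:
        return hash((self.shader_id, self.uniforms))


a = ShaderFilter("blur", (1.0, 2.0))
b = ShaderFilter("blur", (1.0, 2.0))
c = ShaderFilter("blur", (1.0, 9.0))
assert a == b  # identical uniforms: no repaint needed
assert a != c  # a uniform changed: the widget should mark itself dirty
```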
|
aa682bfd1207ce1ad1fd485e792530359251b219
| null |
Martin Lopes
|
doc: update the path to profiles.json (#3087) This path is consistently `WindowsTerminal_8wekyb3d8bbwe`.
| false
| 1
| 1
| 0
|
--- UsingJsonSettings.md
@@ -4,7 +4,7 @@ One way (currently the only way) to configure Windows Terminal is by editing the
`profiles.json` settings file. At the time of writing you can open the settings
file in your default editor by selecting `Settings` from the WT pull down menu.
-The settings are stored in the file `$env:LocalAppData\Packages\Microsoft.WindowsTerminal_<randomString>\LocalState\profiles.json`.
+The settings are stored in the file `$env:LocalAppData\Packages\Microsoft.WindowsTerminal_8wekyb3d8bbwe\LocalState\profiles.json`.
As of [#2515](https://github.com/microsoft/terminal/pull/2515), the settings are
split into _two_ files: a hardcoded `defaults.json`, and `profiles.json`, which
|
microsoft_terminal.json
| null | null | null | null | null | null |
microsoft_terminal.json
|
CONFIG_CHANGE
|
5, obvious
|
4a43c4748a8f7f96644c5e1ec9836fd855b49246
|
2024-11-25 08:02:34
|
Fabio Stabile
|
fix typo in dockerhub alternatives (#7780) Co-authored-by: Fabio Stabile <[email protected]>
| false
| 1
| 1
| 2
|
--- src/data/roadmaps/docker/content/107-container-registries/101-dockerhub-alt.md
@@ -1,6 +1,6 @@
# DockerHub Alternatives
-Container images can be stored in many different registries, not just Dockerhub. Most major cloud platforms now provide container registries such as "Artifact Registry" on Google Cloud Platform, Elastic Container Registry on AWS and Azure Container Registry on Microsoft Azure. GitHub also provides it's own registry which is useful when container builds are included in your GitHub Actions workflow.
+Container images can be stored in many different registries, not just Dockerhub. Most major cloud platforms now provide container registries such as "Artifact Registry" on Google Cloud Platform, Elastic Container Registry on AWS and Azure Container Registry on Microsoft Azure. GitHub also provides it's own resistry which is useful when container builds are included in your GitHub Actions workflow.
Visit the following resources to learn more:
|
developer-roadmap
|
kamranahmedse
|
TypeScript
|
TypeScript
| 309,677
| 40,429
|
Interactive roadmaps, guides and other educational content to help developers grow in their careers.
|
kamranahmedse_developer-roadmap
|
DOC_CHANGE
|
Obvious
|
bf35f97961306fa3dad817309cac0e48dd7b289b
|
2022-10-24 20:04:28
|
Ai Ling
|
[offers][fix] Use title typeahead, add default currency and remove specialization field (#423)
| false
| 116
| 132
| 248
|
--- apps/portal/src/components/offers/constants.ts
@@ -2,6 +2,26 @@ import { EducationBackgroundType } from './types';
export const emptyOption = '----';
+// TODO: use enums
+export const titleOptions = [
+ {
+ label: 'Software Engineer',
+ value: 'Software Engineer',
+ },
+ {
+ label: 'Frontend Engineer',
+ value: 'Frontend Engineer',
+ },
+ {
+ label: 'Backend Engineer',
+ value: 'Backend Engineer',
+ },
+ {
+ label: 'Full-stack Engineer',
+ value: 'Full-stack Engineer',
+ },
+];
+
export const locationOptions = [
{
label: 'Singapore, Singapore',
--- apps/portal/src/components/offers/offersSubmission/OffersSubmissionForm.tsx
@@ -115,7 +115,7 @@ export default function OffersSubmissionForm({
),
hasNext: true,
hasPrevious: false,
- label: 'Offers',
+ label: 'Offer details',
},
{
component: <BackgroundForm key={1} />,
@@ -125,33 +125,28 @@ export default function OffersSubmissionForm({
},
{
component: (
- <OffersProfileSave
+ <OfferAnalysis
key={2}
- profileId={createProfileResponse.id || ''}
- token={createProfileResponse.token}
+ allAnalysis={analysis}
+ isError={generateAnalysisMutation.isError}
+ isLoading={generateAnalysisMutation.isLoading}
/>
),
hasNext: true,
hasPrevious: false,
- label: 'Save profile',
+ label: 'Analysis',
},
{
component: (
- <div>
- <h5 className="mb-8 text-center text-4xl font-bold text-slate-900">
- Result
- </h5>
- <OfferAnalysis
- key={3}
- allAnalysis={analysis}
- isError={generateAnalysisMutation.isError}
- isLoading={generateAnalysisMutation.isLoading}
- />
- </div>
+ <OffersProfileSave
+ key={3}
+ profileId={createProfileResponse.id || ''}
+ token={createProfileResponse.token}
+ />
),
hasNext: false,
- hasPrevious: true,
- label: 'Analysis',
+ hasPrevious: false,
+ label: 'Save',
},
];
@@ -236,7 +231,7 @@ export default function OffersSubmissionForm({
<FormProvider {...formMethods}>
<form onSubmit={handleSubmit(onSubmit)}>
{formSteps[formStep].component}
- <pre>{JSON.stringify(formMethods.watch(), null, 2)}</pre>
+ {/* <pre>{JSON.stringify(formMethods.watch(), null, 2)}</pre> */}
{formSteps[formStep].hasNext && (
<div className="flex justify-end">
<Button
--- apps/portal/src/components/offers/offersSubmission/submissionForm/BackgroundForm.tsx
@@ -8,17 +8,13 @@ import {
emptyOption,
FieldError,
locationOptions,
+ titleOptions,
} from '~/components/offers/constants';
import type { BackgroundPostData } from '~/components/offers/types';
import CompaniesTypeahead from '~/components/shared/CompaniesTypeahead';
-import JobTitlesTypeahead from '~/components/shared/JobTitlesTypahead';
-import {
- Currency,
- CURRENCY_OPTIONS,
-} from '~/utils/offers/currency/CurrencyEnum';
+import { CURRENCY_OPTIONS } from '~/utils/offers/currency/CurrencyEnum';
-import FormMonthYearPicker from '../../forms/FormMonthYearPicker';
import FormRadioList from '../../forms/FormRadioList';
import FormSelect from '../../forms/FormSelect';
import FormTextInput from '../../forms/FormTextInput';
@@ -96,13 +92,13 @@ function FullTimeJobFields() {
return (
<>
<div className="mb-5 grid grid-cols-2 space-x-3">
- <div>
- <JobTitlesTypeahead
- onSelect={({ value }) =>
- setValue(`background.experiences.0.title`, value)
- }
- />
- </div>
+ <FormSelect
+ display="block"
+ label="Title"
+ options={titleOptions}
+ placeholder={emptyOption}
+ {...register(`background.experiences.0.title`)}
+ />
<div>
<CompaniesTypeahead
onSelect={({ value }) =>
@@ -116,7 +112,6 @@ function FullTimeJobFields() {
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
@@ -182,13 +177,13 @@ function InternshipJobFields() {
return (
<>
<div className="mb-5 grid grid-cols-2 space-x-3">
- <div>
- <JobTitlesTypeahead
- onSelect={({ value }) =>
- setValue(`background.experiences.0.title`, value)
- }
- />
- </div>
+ <FormSelect
+ display="block"
+ label="Title"
+ options={titleOptions}
+ placeholder={emptyOption}
+ {...register(`background.experiences.0.title`)}
+ />
<div>
<CompaniesTypeahead
onSelect={({ value }) =>
@@ -202,7 +197,6 @@ function InternshipJobFields() {
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
@@ -316,22 +310,6 @@ function EducationSection() {
{...register(`background.educations.0.school`)}
/>
</div>
- <div className="grid grid-cols-2 space-x-3">
- <FormMonthYearPicker
- monthLabel="Candidature Start"
- yearLabel=""
- {...register(`background.educations.0.startDate`, {
- required: FieldError.REQUIRED,
- })}
- />
- <FormMonthYearPicker
- monthLabel="Candidature End"
- yearLabel=""
- {...register(`background.educations.0.endDate`, {
- required: FieldError.REQUIRED,
- })}
- />
- </div>
</Collapsible>
</div>
</>
@@ -341,9 +319,13 @@ function EducationSection() {
export default function BackgroundForm() {
return (
<div>
- <h5 className="mb-8 text-center text-4xl font-bold text-slate-900">
+ <h5 className="mb-2 text-center text-4xl font-bold text-slate-900">
Help us better gauge your offers
</h5>
+ <h6 className="text-md mx-10 mb-8 text-center font-light text-slate-600">
+ This section is mostly optional, but your background information helps
+ us benchmark your offers.
+ </h6>
<div>
<YoeSection />
<CurrentJobSection />
--- apps/portal/src/components/offers/offersSubmission/submissionForm/OfferDetailsForm.tsx
@@ -13,7 +13,6 @@ import { JobType } from '@prisma/client';
import { Button, Dialog } from '@tih/ui';
import CompaniesTypeahead from '~/components/shared/CompaniesTypeahead';
-import JobTitlesTypeahead from '~/components/shared/JobTitlesTypahead';
import {
defaultFullTimeOfferValues,
@@ -24,6 +23,7 @@ import {
FieldError,
internshipCycleOptions,
locationOptions,
+ titleOptions,
yearOptions,
} from '../../constants';
import FormMonthYearPicker from '../../forms/FormMonthYearPicker';
@@ -32,10 +32,7 @@ import FormTextArea from '../../forms/FormTextArea';
import FormTextInput from '../../forms/FormTextInput';
import type { OfferFormData } from '../../types';
import { JobTypeLabel } from '../../types';
-import {
- Currency,
- CURRENCY_OPTIONS,
-} from '../../../../utils/offers/currency/CurrencyEnum';
+import { CURRENCY_OPTIONS } from '../../../../utils/offers/currency/CurrencyEnum';
type FullTimeOfferDetailsFormProps = Readonly<{
index: number;
@@ -67,11 +64,32 @@ function FullTimeOfferDetailsForm({
return (
<div className="my-5 rounded-lg border border-slate-200 px-10 py-5">
<div className="mb-5 grid grid-cols-2 space-x-3">
+ <FormSelect
+ display="block"
+ errorMessage={offerFields?.offersFullTime?.title?.message}
+ label="Title"
+ options={titleOptions}
+ placeholder={emptyOption}
+ required={true}
+ {...register(`offers.${index}.offersFullTime.title`, {
+ required: FieldError.REQUIRED,
+ })}
+ />
+ <FormTextInput
+ errorMessage={offerFields?.offersFullTime?.specialization?.message}
+ label="Focus / Specialization"
+ placeholder="e.g. Front End"
+ required={true}
+ {...register(`offers.${index}.offersFullTime.specialization`, {
+ required: FieldError.REQUIRED,
+ })}
+ />
+ </div>
+ <div className="mb-5 flex grid grid-cols-2 space-x-3">
<div>
- <JobTitlesTypeahead
- required={true}
+ <CompaniesTypeahead
onSelect={({ value }) =>
- setValue(`offers.${index}.offersFullTime.title`, value)
+ setValue(`offers.${index}.companyId`, value)
}
/>
</div>
@@ -85,15 +103,7 @@ function FullTimeOfferDetailsForm({
})}
/>
</div>
- <div className="mb-5 flex grid grid-cols-2 space-x-3">
- <div>
- <CompaniesTypeahead
- required={true}
- onSelect={({ value }) =>
- setValue(`offers.${index}.companyId`, value)
- }
- />
- </div>
+ <div className="mb-5 flex grid grid-cols-2 items-start space-x-3">
<FormSelect
display="block"
errorMessage={offerFields?.location?.message}
@@ -105,8 +115,6 @@ function FullTimeOfferDetailsForm({
required: FieldError.REQUIRED,
})}
/>
- </div>
- <div className="mb-5 flex grid grid-cols-2 items-start space-x-3">
<FormMonthYearPicker
monthLabel="Date Received"
monthRequired={true}
@@ -121,7 +129,6 @@ function FullTimeOfferDetailsForm({
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
@@ -158,12 +165,14 @@ function FullTimeOfferDetailsForm({
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
{...register(
`offers.${index}.offersFullTime.baseSalary.currency`,
+ {
+ required: FieldError.REQUIRED,
+ },
)}
/>
}
@@ -171,11 +180,13 @@ function FullTimeOfferDetailsForm({
errorMessage={offerFields?.offersFullTime?.baseSalary?.value?.message}
label="Base Salary (Annual)"
placeholder="0"
+ required={true}
startAddOn="$"
startAddOnType="label"
type="number"
{...register(`offers.${index}.offersFullTime.baseSalary.value`, {
min: { message: FieldError.NON_NEGATIVE_NUMBER, value: 0 },
+ required: FieldError.REQUIRED,
valueAsNumber: true,
})}
/>
@@ -183,22 +194,25 @@ function FullTimeOfferDetailsForm({
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
- {...register(`offers.${index}.offersFullTime.bonus.currency`)}
+ {...register(`offers.${index}.offersFullTime.bonus.currency`, {
+ required: FieldError.REQUIRED,
+ })}
/>
}
endAddOnType="element"
errorMessage={offerFields?.offersFullTime?.bonus?.value?.message}
label="Bonus (Annual)"
placeholder="0"
+ required={true}
startAddOn="$"
startAddOnType="label"
type="number"
{...register(`offers.${index}.offersFullTime.bonus.value`, {
min: { message: FieldError.NON_NEGATIVE_NUMBER, value: 0 },
+ required: FieldError.REQUIRED,
valueAsNumber: true,
})}
/>
@@ -208,22 +222,25 @@ function FullTimeOfferDetailsForm({
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
- {...register(`offers.${index}.offersFullTime.stocks.currency`)}
+ {...register(`offers.${index}.offersFullTime.stocks.currency`, {
+ required: FieldError.REQUIRED,
+ })}
/>
}
endAddOnType="element"
errorMessage={offerFields?.offersFullTime?.stocks?.value?.message}
label="Stocks (Annual)"
placeholder="0"
+ required={true}
startAddOn="$"
startAddOnType="label"
type="number"
{...register(`offers.${index}.offersFullTime.stocks.value`, {
min: { message: FieldError.NON_NEGATIVE_NUMBER, value: 0 },
+ required: FieldError.REQUIRED,
valueAsNumber: true,
})}
/>
@@ -274,19 +291,32 @@ function InternshipOfferDetailsForm({
return (
<div className="my-5 rounded-lg border border-slate-200 px-10 py-5">
<div className="mb-5 grid grid-cols-2 space-x-3">
- <div>
- <JobTitlesTypeahead
- required={true}
- onSelect={({ value }) =>
- setValue(`offers.${index}.offersIntern.title`, value)
- }
- />
- </div>
+ <FormSelect
+ display="block"
+ errorMessage={offerFields?.offersIntern?.title?.message}
+ label="Title"
+ options={titleOptions}
+ placeholder={emptyOption}
+ required={true}
+ {...register(`offers.${index}.offersIntern.title`, {
+ minLength: 1,
+ required: FieldError.REQUIRED,
+ })}
+ />
+ <FormTextInput
+ errorMessage={offerFields?.offersIntern?.specialization?.message}
+ label="Focus / Specialization"
+ placeholder="e.g. Front End"
+ required={true}
+ {...register(`offers.${index}.offersIntern.specialization`, {
+ minLength: 1,
+ required: FieldError.REQUIRED,
+ })}
+ />
</div>
<div className="mb-5 grid grid-cols-2 space-x-3">
<div>
<CompaniesTypeahead
- required={true}
onSelect={({ value }) =>
setValue(`offers.${index}.companyId`, value)
}
@@ -344,7 +374,6 @@ function InternshipOfferDetailsForm({
endAddOn={
<FormSelect
borderStyle="borderless"
- defaultValue={Currency.SGD}
isLabelHidden={true}
label="Currency"
options={CURRENCY_OPTIONS}
--- apps/portal/src/components/offers/table/OffersRow.tsx
@@ -1,8 +1,5 @@
import Link from 'next/link';
-import type { JobTitleType } from '~/components/shared/JobTitles';
-import { getLabelForJobTitleType } from '~/components/shared/JobTitles';
-
import { convertMoneyToString } from '~/utils/offers/currency';
import { formatDate } from '~/utils/offers/time';
@@ -22,9 +19,7 @@ export default function OfferTableRow({
scope="row">
{company.name}
</th>
- <td className="py-4 px-6">
- {getLabelForJobTitleType(title as JobTitleType)}
- </td>
+ <td className="py-4 px-6">{title}</td>
<td className="py-4 px-6">{totalYoe}</td>
<td className="py-4 px-6">{convertMoneyToString(income)}</td>
<td className="py-4 px-6">{formatDate(monthYearReceived)}</td>
--- apps/portal/src/pages/offers/home.tsx
@@ -1,12 +1,13 @@
import { useState } from 'react';
+import { Select } from '@tih/ui';
+import { titleOptions } from '~/components/offers/constants';
import OffersTitle from '~/components/offers/OffersTitle';
import OffersTable from '~/components/offers/table/OffersTable';
import CompaniesTypeahead from '~/components/shared/CompaniesTypeahead';
-import JobTitlesTypeahead from '~/components/shared/JobTitlesTypahead';
export default function OffersHomePage() {
- const [jobTitleFilter, setjobTitleFilter] = useState('software-engineer');
+ const [jobTitleFilter, setjobTitleFilter] = useState('Software Engineer');
const [companyFilter, setCompanyFilter] = useState('');
return (
@@ -17,17 +18,19 @@ export default function OffersHomePage() {
<div className="mt-4 flex items-center">
Viewing offers for
<div className="mx-4">
- <JobTitlesTypeahead
+ <Select
isLabelHidden={true}
- placeHolder="Software Engineer"
- onSelect={({ value }) => setjobTitleFilter(value)}
+ label="Select a job title"
+ options={titleOptions}
+ value={jobTitleFilter}
+ onChange={setjobTitleFilter}
/>
</div>
in
<div className="ml-4">
<CompaniesTypeahead
isLabelHidden={true}
- placeHolder="All Companies"
+ placeHolder="All companies"
onSelect={({ value }) => setCompanyFilter(value)}
/>
</div>
--- apps/portal/src/pages/offers/profile/[offerProfileId].tsx
@@ -10,8 +10,6 @@ import type {
BackgroundDisplayData,
OfferDisplayData,
} from '~/components/offers/types';
-import type { JobTitleType } from '~/components/shared/JobTitles';
-import { getLabelForJobTitleType } from '~/components/shared/JobTitles';
import { useToast } from '~/../../../packages/ui/dist';
import { convertMoneyToString } from '~/utils/offers/currency';
@@ -64,9 +62,7 @@ export default function OfferProfile() {
companyName: res.company.name,
id: res.offersFullTime.id,
jobLevel: res.offersFullTime.level,
- jobTitle: getLabelForJobTitleType(
- res.offersFullTime.title as JobTitleType,
- ),
+ jobTitle: res.offersFullTime.title,
location: res.location,
negotiationStrategy: res.negotiationStrategy,
otherComment: res.comments,
@@ -81,9 +77,7 @@ export default function OfferProfile() {
const filteredOffer: OfferDisplayData = {
companyName: res.company.name,
id: res.offersIntern!.id,
- jobTitle: getLabelForJobTitleType(
- res.offersIntern!.title as JobTitleType,
- ),
+ jobTitle: res.offersIntern!.title,
location: res.location,
monthlySalary: convertMoneyToString(
res.offersIntern!.monthlySalary,
@@ -113,9 +107,7 @@ export default function OfferProfile() {
companyName: experience.company?.name,
duration: experience.durationInMonths,
jobLevel: experience.level,
- jobTitle: experience.title
- ? getLabelForJobTitleType(experience.title as JobTitleType)
- : null,
+ jobTitle: experience.title,
monthlySalary: experience.monthlySalary
? convertMoneyToString(experience.monthlySalary)
: null,
|
tech-interview-handbook
|
yangshun
|
TypeScript
|
TypeScript
| 122,353
| 15,039
|
💯 Curated coding interview preparation materials for busy software engineers
|
yangshun_tech-interview-handbook
|
BUG_FIX
|
this commit fixes/polishes an earlier feature
|
2f9240b7ac6c4798aefad87df9de719f3706d487
|
2024-05-07 02:45:37
|
pbbp0904
|
Added constant velocity warp shell metric
| false
| 286
| 0
| 286
|
--- Examples/1 Metrics/M2_Default_Metrics.mlx
Binary files a/Examples/1 Metrics/M2_Default_Metrics.mlx and b/Examples/1 Metrics/M2_Default_Metrics.mlx differ
--- Examples/4 Warp Shell/W1_Warp_Shell.mlx
Binary files a/Examples/4 Warp Shell/W1_Warp_Shell.mlx and /dev/null differ
--- Metrics/WarpShell/metricGet_WarpShellComoving.m
@@ -1,187 +0,0 @@
-function [Metric] = metricGet_WarpShellComoving(gridSize,worldCenter,m,R1,R2,Rbuff,sigma,smoothFactor,vWarp,doWarp,gridScaling)
-
-
-%% METRICGET_WARPSHELLCOMOVING: Builds the Warp Shell metric in a comoving frame
-% https://iopscience.iop.org/article/10.1088/1361-6382/ad26aa
-%
-% INPUTS:
-% gridSize - 1x4 array. world size in [t, x, y, z], double type.
-%
-% worldCenter - 1x4 array. world center location in [t, x, y, z], double type.
-%
-% m - total mass of the warp shell
-%
-% R1 - inner radius of the shell
-%
-% R2 - outer radius of the shell
-%
-% Rbuff - buffer distance between the shell wall and when the shift
-% starts to change
-%
-% sigma - sharpness parameter of the shift sigmoid
-%
-% smoothfactor - factor by which to smooth the walls of the shell
-%
-% vWarp - speed of the warp drive in factors of c, along the x direction, double type.
-%
-% doWarp - 0 or 1, whether or not to create the warp effect inside the
-% shell
-%
-% gridScale - scaling of the grid in [t, x, y, z]. double type.
-%
-% OUTPUTS:
-% metric - metric struct object.
-
-%%
-
-% input values
-if nargin < 6
- Rbuff = 0;
-end
-if nargin < 7
- sigma = 0;
-end
-if nargin < 8
- smoothFactor = 1;
-end
-if nargin < 9
- vWarp = 0;
-end
-if nargin < 10
- doWarp = 0;
-end
-if nargin < 11
- gridScaling = [1,1,1,1];
-end
-
-Metric.type = "metric";
-Metric.name = "Comoving Warp Shell";
-Metric.scaling = gridScaling;
-Metric.coords = "cartesian";
-Metric.index = "covariant";
-Metric.date = date;
-
-% declare radius array
-worldSize = sqrt((gridSize(2)*gridScaling(2)-worldCenter(2))^2+(gridSize(3)*gridScaling(3)-worldCenter(3))^2+(gridSize(4)*gridScaling(4)-worldCenter(4))^2);
-rSampleRes = 10^5;
-rsample = linspace(0,worldSize*1.2,rSampleRes);
-
-% construct rho profile
-rho = zeros(1,length(rsample))+m/(4/3*pi*(R2^3-R1^3)).*(rsample>R1 & rsample<R2);
-Metric.params.rho = rho;
-
-[~, maxR] = min(diff(rho>0));
-maxR = rsample(maxR);
-
-% construct mass profile
-M = cumtrapz(rsample, 4*pi.*rho.*rsample.^2);
-
-% construct pressure profile
-P = TOVconstDensity(R2,M,rho,rsample);
-Metric.params.P = P;
-
-% smooth functions
-rho = smooth(smooth(smooth(smooth(rho,1.79*smoothFactor),1.79*smoothFactor),1.79*smoothFactor),1.79*smoothFactor);
-rho = rho';
-Metric.params.rhosmooth = rho;
-
-P = smooth(smooth(smooth(smooth(P,smoothFactor),smoothFactor),smoothFactor),smoothFactor);
-P = P';
-Metric.params.Psmooth = P;
-
-% reconstruct mass profile
-M = cumtrapz(rsample, 4*pi.*rho.*rsample.^2);
-M(M<0) = max(M);
-
-
-% save varaibles
-Metric.params.M = M;
-Metric.params.rVec = rsample;
-
-
-% set shift line vector
-shiftRadialVector = compactSigmoid(rsample,R1,R2,sigma,Rbuff);
-shiftRadialVector = smooth(smooth(shiftRadialVector,smoothFactor),smoothFactor);
-
-% construct metric using spherical symmetric solution:
-% solve for B
-B = (1-2*G.*M./rsample/c^2).^(-1);
-B(1) = 1;
-
-% solve for a
-a = alphaNumericSolver(M,P,maxR,rsample);
-
-% solve for A from a
-A = -exp(2.*a);
-
-% save variables to the metric.params
-Metric.params.A = A;
-Metric.params.B = B;
-
-% return metric boosted and in cartesian space
-Metric.tensor = cell(4);
-for mu = 1:4
- for nu = 1:4
- Metric.tensor{mu,nu} = zeros(gridSize);
- end
-end
-ShiftMatrix = zeros(gridSize);
-
-% set offset value to handle r = 0
-epsilon = 0;
-
-for i = 1:gridSize(2)
- for j = 1:gridSize(3)
- for k = 1:gridSize(4)
-
- x = ((i*gridScaling(2)-worldCenter(2)));
- y = ((j*gridScaling(3)-worldCenter(3)));
- z = ((k*gridScaling(4)-worldCenter(4)));
-
- %ref Catalog of Spacetimes, Eq. (1.6.2) for coords def.
- r = sqrt(x^2+y^2+z^2)+epsilon;
- theta = atan2(sqrt(x^2+y^2),z);
- phi = atan2(y,x);
-
- [~, minIdx] = min(abs(rsample-r));
- if rsample(minIdx) > r
- minIdx = minIdx - 1;
- end
-
- minIdx = minIdx + (r-rsample(minIdx))/(rsample(minIdx+1)-rsample(minIdx));
-
- g11_sph = legendreRadialInterp(A,minIdx);
- g22_sph = legendreRadialInterp(B,minIdx);
-
- [g11_cart, g22_cart, g23_cart, g24_cart, g33_cart, g34_cart, g44_cart] = sph2cartDiag(theta,phi,g11_sph,g22_sph);
-
- Metric.tensor{1,1}(1,i,j,k) = g11_cart;
-
- Metric.tensor{2,2}(1,i,j,k) = g22_cart;
-
- Metric.tensor{2,3}(1,i,j,k) = g23_cart;
- Metric.tensor{3,2}(1,i,j,k) = Metric.tensor{2,3}(1,i,j,k);
-
- Metric.tensor{2,4}(1,i,j,k) = g24_cart;
- Metric.tensor{4,2}(1,i,j,k) = Metric.tensor{2,4}(1,i,j,k);
-
- Metric.tensor{3,3}(1,i,j,k) = g33_cart;
-
- Metric.tensor{3,4}(1,i,j,k) = g34_cart;
- Metric.tensor{4,3}(1,i,j,k) = Metric.tensor{3,4}(1,i,j,k);
-
- Metric.tensor{4,4}(1,i,j,k) = g44_cart;
-
- ShiftMatrix(1,i,j,k) = legendreRadialInterp(shiftRadialVector,minIdx);
-
- end
- end
-end
-
-% Add warp effect
-if doWarp
- Metric.tensor{1,2} = Metric.tensor{1,2}-Metric.tensor{1,2}.*ShiftMatrix - ShiftMatrix*vWarp;
- Metric.tensor{2,1} = Metric.tensor{1,2};
-end
-
-end
--- Metrics/utils/TOVconstDensity.m
@@ -1,3 +0,0 @@
-function P = TOVconstDensity(R,M,rho,r)
- P = c^2*rho.*((R*sqrt(R-2*G*M(end)/c^2)-sqrt(R^3-2*G*M(end).*r.^2/c^2))./(sqrt(R^3-2*G*M(end).*r.^2/c^2)-3*R*sqrt(R-2*G*M(end)/c^2))).*(r<R);
-end
\ No newline at end of file
--- Metrics/utils/alphaNumericSolver.m
@@ -1,52 +0,0 @@
-function alpha = alphaNumericSolver(M,P,R,r)
-
-% % Trapezoidal Method:
-dalpha = (G*M./c^2+4*pi*G*r.^3.*P./c^4)./(r.^2-2*G*M.*r./c^2);
-dalpha(1) = 0;
-alphaTemp = cumtrapz(r,dalpha);
-C = 1/2*log(1-2*G*M(end)./r(end)/c^2);
-offset = C-alphaTemp(end);
-alpha = alphaTemp+offset;
-
-
-% Old manual integration
-% alpha = 1/2*log(1-2*G*M(end)./r/c^2);
-%
-% for i = flip(2:(length(r)))
-% if r(i) <= R
-% dr = -(r(i)-r(i-1));
-% dalpha = (G*M(i)/c^2+4*pi*G*r(i)^3*P(i)/c^4)/(r(i)^2*(1-2*G*M(i)/r(i)/c^2));
-% alpha(i-1) = alpha(i)+dalpha*dr;
-% end
-% end
-
-% Simpson's rule
-% alpha = 1/2*log(1-2*G*M(end)./r/c^2);
-% alpha = zeros(length(r),1);
-% dalpha = (G*M./c^2+4*pi*G*r.^3.*P./c^4)./(r.^2-2*G*M.*r./c^2);
-% dalpha(1) = 0;
-%
-% for i = flip(2:(length(r)-2))
-% a = r(i);
-% b = r(i-1);
-% [~, minIdx1] = min(abs(r-(2*a+b)/3));
-% if r(minIdx1) > (2*a+b)/3
-% minIdx1 = minIdx1 - 1;
-% end
-% minIdx1 = minIdx1 + ((2*a+b)/3-r(minIdx1))/(r(minIdx1+1)-r(minIdx1));
-%
-% [~, minIdx2] = min(abs(r-(a+2*b)/3));
-% if r(minIdx2) > (2*a+b)/3
-% minIdx2 = minIdx2 - 1;
-% end
-% minIdx2 = minIdx2 + ((2*a+b)/3-r(minIdx2))/(r(minIdx2+1)-r(minIdx2));
-%
-% k1 = dalpha(i-1);
-% k2 = legendreRadialInterp(dalpha,minIdx1);
-% k3 = legendreRadialInterp(dalpha,minIdx2);
-% k4 = dalpha(i);
-% simp = (b-a)/8*(k1+3*k2+3*k3+k4);
-% alpha(i-1) = alpha(i)+simp;
-% end
-
-end
--- Metrics/utils/compactSigmoid.m
@@ -1,8 +0,0 @@
-function f = compactSigmoid(r,R1,R2,sigma,Rbuff)
-
-f = abs(1./(exp(((R2-R1-2*Rbuff)*(sigma+2))/2*(1./(r-R2+Rbuff)+1./(r-R1-Rbuff)))+1).*(r>R1+Rbuff).*(r<R2-Rbuff)+(r>=R2-Rbuff)-1);
-if any(isinf(f)) || any(~isreal(f))
- error('compact sigmoid returns non-numeric values!')
-end
-
-end
\ No newline at end of file
--- Metrics/utils/sph2cartDiag.m
@@ -1,36 +0,0 @@
-function [g11_cart, g22_cart, g23_cart, g24_cart, g33_cart, g34_cart, g44_cart] = sph2cartDiag(theta,phi,g11_sph,g22_sph)
-
-g11_cart = g11_sph;
-
-E = g22_sph;
-
-if abs(phi) == pi/2
- cosPhi = 0;
-else
- cosPhi = cos(phi);
-end
-
-if abs(theta) == pi/2
- cosTheta = 0;
-else
- cosTheta = cos(theta);
-end
-
-
-g22_cart = (E*cosPhi^2*sin(theta)^2 + (cosPhi^2*cosTheta^2)) + sin(phi)^2;
-g33_cart = (E*sin(phi)^2*sin(theta)^2 + (cosTheta^2*sin(phi)^2)) + cosPhi^2;
-g44_cart = (E*cosTheta^2 + sin(theta)^2);
-
-g23_cart = (E*cosPhi*sin(phi)*sin(theta)^2 + (cosPhi*cosTheta^2*sin(phi)) - cosPhi*sin(phi));
-g24_cart = (E*cosPhi*cosTheta*sin(theta) - (cosPhi*cosTheta*sin(theta)));
-g34_cart = (E*cosTheta*sin(phi)*sin(theta) - (cosTheta*sin(phi)*sin(theta)));
-
-% g22_cart = (E*cos(phi)^2*sin(theta)^2 + (cos(phi)^2*cos(theta)^2)) + sin(phi)^2;
-% g33_cart = (E*sin(phi)^2*sin(theta)^2 + (cos(theta)^2*sin(phi)^2)) + cos(phi)^2;
-% g44_cart = (E*cos(theta)^2 + sin(theta)^2);
-%
-% g23_cart = (E*cos(phi)*sin(phi)*sin(theta)^2 + (cos(phi)*cos(theta)^2*sin(phi)) - cos(phi)*sin(phi));
-% g24_cart = (E*cos(phi)*cos(theta)*sin(theta) - (cos(phi)*cos(theta)*sin(theta)));
-% g34_cart = (E*cos(theta)*sin(phi)*sin(theta) - (cos(theta)*sin(phi)*sin(theta)));
-
-end
|
warpfactory
|
nerdswithattitudes
|
MATLAB
|
MATLAB
| 298
| 41
|
WarpFactory is a numerical toolkit for analyzing warp drive spacetimes.
|
nerdswithattitudes_warpfactory
|
NEW_FEAT
|
obvious
|
414d5eaf68bc368d44273125fe64a58102d37850
|
2023-03-24 14:05:08
|
jianchenggu
|
Update README.md
| false
| 20
| 0
| 20
|
--- README.md
@@ -133,11 +133,9 @@ University of Bamberg, Germany
- [Master of Data Science for Public Policy](https://www.hertie-school.org/en/mds), Hertie School, Germany
- [Master Social and Economic Data Science](https://www.polver.uni-konstanz.de/studium/master/master-social-and-economic-data-science/),
University of Konstanz, Germany
-- [M.Sc. Quantitative Data Science Methods: Psychometrics, Econometrics and Machine Learning], University of Tübingen, Germany
- ~~[Master Computational Social Systems](https://www.rwth-aachen.de/cms/root/Studium/Vor-dem-Studium/Studiengaenge/Liste-Aktuelle-Studiengaenge/Studiengangbeschreibung/~sthd/Computational-Social-Systems-M-Sc/)~~ (This course of study is being phased out), RWTH Aachen, Germany
- [Master Politics and Data Science](https://www.ucd.ie/connected_politics/studywithus/), University College Dublin, Ireland
- [MSc Social Data Science](https://hub.ucd.ie/usis/!W_HU_MENU.P_PUBLISH?p_tag=PROG&MAJR=W559), University College Dublin, Ireland
-- [PhD Quantitative and Computational Social Science](https://www.ucd.ie/spire/study/prospectivephdstudents/phdquantitativeandcomputationalsocialscience/), University College Dublin, Ireland
- [MSc/PG Diploma Applied Social Data Science](https://www.tcd.ie/Political_Science/postgraduate/pg-dip-applied-social-data-science/), Trinity College Dublin, Ireland
- [Master Data Science for Economics](https://dse.cdl.unimi.it/en), University of Milan, Italy
- [Master (Research) in Societal Resilience - Big Data for Society](https://vu.nl/en/education/master/societal-resilience/), Vrije Universiteit Amsterdam, Netherlands
@@ -161,8 +159,6 @@ Koç University, Turkey
- [M.A. in Computational Social Science](https://macss.uchicago.edu/), University of Chicago, United States
- [M.S. in Computational Analysis & Public Policy](https://capp.uchicago.edu/), University of Chicago, United States
- [Master of Science in Data Analytics & Computational Social Science](https://www.umass.edu/social-sciences/academics/data-analytics-computational-social-science/ms-dacss), University of Massachusetts Amherst, United States
-- [Master of Arts in Interdisciplinary Studies: Computational Social Science Concentration](https://mais.gmu.edu/programs/la-mais-isin-css), George Mason University, United States
-- [PhD in Computational Social Science](https://science.gmu.edu/academics/departments-units/computational-data-sciences/computational-social-science-phd), George Mason University, United States
- [Master of Science in Data Science for Public Policy](https://mccourt.georgetown.edu/master-of-science-in-data-science-for-public-policy/), Georgetown University, United States
- [Master of Science in Public Policy and Data Science](https://priceschool.usc.edu/mppds/), University of Southern California, United States
- [Master's Degree Applied Urban Science and Informatics](https://cusp.nyu.edu/masters-degree/), New York University, United States
@@ -173,19 +169,14 @@ Koç University, Turkey
## Research Groups
> Ordered alphabetically by country and city
-- [Computational Social Science Lab](https://www.sydney.edu.au/arts/our-research/centres-institutes-and-groups/sydney-social-sciences-and-humanities-advanced-research-centre/research/computational-social-science-lab.html), University of Sydney, Australia
- [CSS Lab TU Graz](https://www.tugraz.at/institute/isds/research/research-groups/computational-social-science-lab-css-lab/), Graz, Austria
- [Digital Humanities Lab at UFBA](https://www.labhd.ufba.br/), Salvador, Brazil
-- [Computational Communication Collaboratory](https://computational-communication.com/), Nanjing, China
- [Copenhagen Center for Social Data Science (SODAS)](https://sodas.ku.dk), Copenhagen, Denmark
- [NEtwoRks, Data, and Society (NERDS)](https://nerds.itu.dk/), Copenhagen, Denmark
-- [Centre for Social Data Science](https://www.helsinki.fi/en/networks/centre-social-data-science), Helsinki, Finland
- [CSS Lab RWTH Aachen](https://www.css-lab.rwth-aachen.de/), Aachen, Germany
- [CSS Department at GESIS](https://www.gesis.org/en/institute/departments/computational-social-science), Cologne, Germany
- [Computational Social Science and Big Data TUM Munich](https://www.hfp.tum.de/css/startseite/), Munich, Germany
- [Department of Digital and Computational Demography](https://www.demogr.mpg.de/en/research_6120/digital_and_computational_demography_zagheni_11666/), Rostock, Germany
-- [Computational Social Sciences and Law Lab](https://www.cityu.edu.hk/cpal/lab_cssl.htm), City University of Hong Kong
-- [Web Mining Lab](http://weblab.com.cityu.edu.hk/blog/), City University of Hong Kong
- [Connected_Politics Lab](https://www.ucd.ie/connected_politics/), Dublin, Ireland
- [Behave Lab](https://behavelab.org/), Milan, Italy
- [Center of Data Science and Complexity for Society (CDCS)](https://cdcs.di.uniroma1.it/index.php), Sapienza University, Rome, Italy
@@ -193,24 +184,14 @@ Koç University, Turkey
- [Mobile and Social Computing Lab (MobS Lab)](https://ict.fbk.eu/units/mobs/), Bruno Kessler Foundation, Trento, Italy
- [CENTAI Institute](https://centai.eu), Turin, Italy
- [Computational Social Science Lab](https://www.colorlessgreen.info/), Tokyo Institute of Technology, Tokyo, Japan
-- [Computational Communication Science Amsterdam](https://ccs.amsterdam), Netherlands
-- [Social and Behavioural Data Science Centre](https://sobedsc.uva.nl/), Amsterdam, Netherlands
-- [ODISSEI (Open Data Infrastructure for Social Science and Economic Innovations)](https://odissei-data.nl/en/), Rotterdam, Netherlands
- [Institute for Cross-Disciplinary Physics and Complex Systems (IFISC)](https://ifisc.uib-csic.es/en/research/dynamics-and-collective-phenomena-social-systems/), Palma, Spain
- [Social Networks Lab](https://sn.ethz.ch/), Zürich, Switzerland
-- [Communication Data and Network Analytics Lab (CDNA)](https://survey.sinica.edu.tw/?page_id=4844&lang=en), Academia Sinica, Taipei, Taiwan
- [Data Science and AI Lab](https://nyuad.nyu.edu/en/research/faculty-labs-and-projects/data-science-and-ai-lab.html), Abu Dhabi, UAE
-- [Social Data Institute](https://www.ucl.ac.uk/social-data/home/social-data-institute), University College London, UK
- [Oxford Internet Institute](https://www.oii.ox.ac.uk/about/), Oxford, UK
- [Observatory on Social Media](https://osome.iu.edu/), Indiana University, Bloomington, USA
-- [Soda (Social Data and AI) Lab](https://soda-labo.github.io/), Indiana University, Bloomington, USA
- [Lazerlab](https://lazerlab.net/), Northeastern University, Boston, USA
-- [Social data science center](https://socialdatascience.umd.edu/), University of Maryland, College Park, USA
- [Laboratory for the Modeling of Biological and Socio-Technical Systems (MOBS Lab)](https://www.mobs-lab.org/), Northeastern University, Boston, USA
- [Computational Social Science Institute at UMass](https://www.cssi.umass.edu), Massachusetts Amherst, USA
-- [Working Group on Computational Social Science](https://datascience.columbia.edu/research/groups/computational-social-science/), Columbia University, New York, USA
-- [Center for Computational Analysis of Social and Organizational Systems (CASOS)](http://www.casos.cs.cmu.edu/index.php), Carnegie Mellon University, Pittsburgh, USA
-- [IRiSS Center for Computational Social Science](https://iriss.stanford.edu/research-centers/computational-social-science), Stanford University, USA
## Journals
@@ -219,7 +200,6 @@ Koç University, Turkey
- [Big Data & Society](https://journals.sagepub.com/home/bds)
- [Computational Communication Research](https://computationalcommunication.org/ccr)
-- [Computational Economics](https://www.springer.com/journal/10614)
- [EPJ Data Science](https://epjdatascience.springeropen.com/)
- [Frontiers in Big Data](https://www.frontiersin.org/journals/big-data)
- [Information, Communication & Society](https://www.tandfonline.com/journals/rics20)
|
awesome-computational-social-science
|
gesiscss
|
R
|
R
| 648
| 83
|
A list of awesome resources for Computational Social Science
|
gesiscss_awesome-computational-social-science
|
DOC_CHANGE
|
changes in readme
|
91c7ed45a312a71b7d3b856228047ddcd51d4651
|
2023-04-20 17:12:47
|
Nathanial Spearing
|
feat(npm): add `npmrd` alias (#11627)
| false
| 4
| 0
| 4
|
--- plugins/npm/README.md
@@ -29,7 +29,6 @@ plugins=(... npm)
| `npmI` | `npm init` | Run npm init |
| `npmi` | `npm info` | Run npm info |
| `npmSe` | `npm search` | Run npm search |
-| `npmrd` | `npm run dev` | Run npm run dev |
## `npm install` / `npm uninstall` toggle
--- plugins/npm/npm.plugin.zsh
@@ -70,9 +70,6 @@ alias npmi="npm info"
# Run npm search
alias npmSe="npm search"
-# Run npm run dev
-alias npmrd="npm run dev"
-
npm_toggle_install_uninstall() {
# Look up to the previous 2 history commands
local line
|
ohmyzsh
|
ohmyzsh
|
Shell
|
Shell
| 176,465
| 26,013
|
🙃 A delightful community-driven (with 2,400+ contributors) framework for managing your zsh configuration. Includes 300+ optional plugins (rails, git, macOS, hub, docker, homebrew, node, php, python, etc), 140+ themes to spice up your morning, and an auto-update tool that makes it easy to keep up with the latest updates from the community.
|
ohmyzsh_ohmyzsh
|
NEW_FEAT
|
obvious
|
1454f490f4f61a828bd36c28e15bf7b0bd9a3103
|
2025-02-28 16:02:15
|
kobewi
|
Don't hard-code setting list in DependencyEditor
| false
| 19
| 24
| 43
|
--- editor/dependency_editor.cpp
@@ -589,19 +589,34 @@ void DependencyRemoveDialog::ok_pressed() {
}
}
- HashMap<String, StringName> setting_path_map;
- for (const StringName &setting : path_project_settings) {
- const String path = ResourceUID::ensure_path(GLOBAL_GET(setting));
- setting_path_map[path] = setting;
- }
-
bool project_settings_modified = false;
for (const String &file : files_to_delete) {
// If the file we are deleting for e.g. the main scene, default environment,
// or audio bus layout, we must clear its definition in Project Settings.
- const StringName *setting_name = setting_path_map.getptr(file);
- if (setting_name) {
- ProjectSettings::get_singleton()->set(*setting_name, "");
+ if (file == ResourceUID::ensure_path(GLOBAL_GET("application/config/icon"))) {
+ ProjectSettings::get_singleton()->set("application/config/icon", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("application/run/main_scene"))) {
+ ProjectSettings::get_singleton()->set("application/run/main_scene", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("application/boot_splash/image"))) {
+ ProjectSettings::get_singleton()->set("application/boot_splash/image", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("rendering/environment/defaults/default_environment"))) {
+ ProjectSettings::get_singleton()->set("rendering/environment/defaults/default_environment", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("display/mouse_cursor/custom_image"))) {
+ ProjectSettings::get_singleton()->set("display/mouse_cursor/custom_image", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("gui/theme/custom"))) {
+ ProjectSettings::get_singleton()->set("gui/theme/custom", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("gui/theme/custom_font"))) {
+ ProjectSettings::get_singleton()->set("gui/theme/custom_font", "");
+ project_settings_modified = true;
+ } else if (file == ResourceUID::ensure_path(GLOBAL_GET("audio/buses/default_bus_layout"))) {
+ ProjectSettings::get_singleton()->set("audio/buses/default_bus_layout", "");
+ project_settings_modified = true;
}
const String path = OS::get_singleton()->get_resource_dir() + file.replace_first("res://", "/");
@@ -701,14 +716,6 @@ DependencyRemoveDialog::DependencyRemoveDialog() {
owners->set_custom_minimum_size(Size2(0, 94) * EDSCALE);
vb_owners->add_child(owners);
owners->set_v_size_flags(Control::SIZE_EXPAND_FILL);
-
- List<PropertyInfo> property_list;
- ProjectSettings::get_singleton()->get_property_list(&property_list);
- for (const PropertyInfo &pi : property_list) {
- if (pi.type == Variant::STRING && pi.hint == PROPERTY_HINT_FILE) {
- path_project_settings.push_back(pi.name);
- }
- }
}
//////////////
--- editor/dependency_editor.h
@@ -120,8 +120,6 @@ class DependencyRemoveDialog : public ConfirmationDialog {
}
};
- LocalVector<StringName> path_project_settings;
-
void _find_files_in_removed_folder(EditorFileSystemDirectory *efsd, const String &p_folder);
void _find_all_removed_dependencies(EditorFileSystemDirectory *efsd, Vector<RemovedDependency> &p_removed);
void _find_localization_remaps_of_removed_files(Vector<RemovedDependency> &p_removed);
|
godot
|
godotengine
|
C++
|
C++
| 94,776
| 21,828
|
Godot Engine – Multi-platform 2D and 3D game engine
|
godotengine_godot
|
CODE_IMPROVEMENT
|
hard-coding of the setting list removed
|
6dfcff6dcfca078fb41d675c1bd29c251eeb8184
|
2024-11-09 02:52:31
|
jbengler
|
Add CRAN badge
| false
| 3
| 0
| 3
|
--- README.Rmd
@@ -23,7 +23,6 @@ knitr::opts_chunk$set(
<!-- badges: start -->
[](https://github.com/jbengler/tidyplots/actions/workflows/R-CMD-check.yaml)
-[](https://CRAN.R-project.org/package=tidyplots)
<!-- badges: end -->
The goal of `tidyplots` is to streamline the creation of publication-ready plots for scientific papers. It allows to gradually add, remove and adjust plot components using a consistent and intuitive syntax.
--- README.md
@@ -6,8 +6,6 @@
<!-- badges: start -->
[](https://github.com/jbengler/tidyplots/actions/workflows/R-CMD-check.yaml)
-[](https://CRAN.R-project.org/package=tidyplots)
<!-- badges: end -->
The goal of `tidyplots` is to streamline the creation of
|
tidyplots
|
jbengler
|
R
|
R
| 495
| 18
|
Tidy Plots for Scientific Papers
|
jbengler_tidyplots
|
DOC_CHANGE
|
changes in readme
|
445d8e71bc588dd7983e92579579ef45178f121c
|
2025-03-20 03:30:01
|
Chuan He
|
Support fusion location in TAC filter. PiperOrigin-RevId: 738542354
| false
| 31
| 5
| 36
|
--- tensorflow/compiler/mlir/lite/experimental/tac/tests/tac-filter.mlir
@@ -62,23 +62,3 @@ module {
func.return
}
}
-
-// -----
-
-// expected-remark@below {{Tac filter (0): filter type: function filter SKIP_TARGET_ANNOTATION, filter_pattern: "^testFunction"}}
-// expected-remark@below {{Tac filter (0) specified but not applied to any op}}
-// expected-remark@below {{Tac filter (1): filter type: function filter INCLUDE_TARGET_ANNOTATION, filter_pattern: "testFunctionInclude"}}
-// expected-remark@below {{Tac filter (1) specified but not applied to any op}}
-// expected-remark@below {{Tac filter (2): filter type: op filter, filter_pattern: "^test_op"}}
-module {
- // CHECK-LABEL: testOpMultipleResults
- // expected-remark@+1 {{all ops filtered by tac filter (2): "tfl.split_v"}}
- func.func @testOpMultipleResults(%arg0: tensor<16x4x4xf32>) -> (tensor<7x4x4xf32>, tensor<3x4x4xf32>, tensor<6x4x4xf32>) {
- %size_splits = arith.constant dense<[7, 3, 6]> : tensor<3xi32>
- %split_dim = arith.constant dense<0> : tensor<i32>
- // CHECK: tfl.split_v
- // CHECK-SAME: tac.skip_target_annotation
- %0, %1, %2 = "tfl.split_v"(%arg0, %size_splits, %split_dim) {num_splits = 3 : i32} : (tensor<16x4x4xf32>, tensor<3xi32>, tensor<i32>) -> (tensor<7x4x4xf32>, tensor<3x4x4xf32>, tensor<6x4x4xf32>) loc("test_op_split"("/tmp/test_model.tflite":0:0))
- func.return %0, %1, %2 : tensor<7x4x4xf32>, tensor<3x4x4xf32>, tensor<6x4x4xf32>
- }
-}
\ No newline at end of file
--- tensorflow/compiler/mlir/lite/experimental/tac/transforms/tac_filter.cc
@@ -157,14 +157,8 @@ void ApplyTacFilter(
OpFilter::MatchType match_type = tac_filter.op_filter().match_type();
OpFilter::DeviceType device_type = tac_filter.op_filter().device_type();
module.walk([&](Operation* op) {
- NameLoc loc;
- if (auto name_loc = mlir::dyn_cast<NameLoc>(op->getLoc())) {
- loc = name_loc;
- } else if (auto fused_loc = mlir::dyn_cast<FusedLoc>(op->getLoc())) {
- loc = dyn_cast<NameLoc>(fused_loc.getLocations().front());
- }
-
- if (!loc) {
+ auto named_loc = mlir::dyn_cast<NameLoc>(op->getLoc());
+ if (!named_loc) {
return;
}
// There can be two kinds of `match_type`:
@@ -177,11 +171,11 @@ void ApplyTacFilter(
//
// The code below maps an op to the appropriate device based on the above
// fields.
- if (op_regex.match(loc.getName())) {
+ if (op_regex.match(named_loc.getName())) {
switch (match_type) {
case OpFilter::MATCH:
if (device_type == OpFilter::CPU) {
- map_op_to_cpu(op, loc.getName().str());
+ map_op_to_cpu(op, named_loc.getName().str());
return;
}
map_op_to_custom_device(op);
@@ -193,7 +187,7 @@ void ApplyTacFilter(
switch (match_type) {
case OpFilter::INVERT_MATCH:
if (device_type == OpFilter::CPU) {
- map_op_to_cpu(op, loc.getName().str());
+ map_op_to_cpu(op, named_loc.getName().str());
return;
}
map_op_to_custom_device(op);
|
tensorflow
|
tensorflow
|
C++
|
C++
| 188,388
| 74,565
|
An Open Source Machine Learning Framework for Everyone
|
tensorflow_tensorflow
|
NEW_FEAT
|
support for fusion location added
|
2fc108699cfe94a9f878b0fdf34c87bc52d3620d
|
2024-10-28 09:39:16
|
hiddenSharp429
|
Feat(QRCode): Enhance 'Scan QR Code From Screen' notifications # #1250 Improve the clarity of notifications in the QR code scanning feature: 1. Permission handling: - Show notification when screen recording permission is missing - Automatically open system settings for permission grant 2. Enhanced scanning status notifications: When no QR code found: - Title: "Scanned X displays" - Subtitle: "No QR codes found" - Body: "Try adjusting the QR code position on your screen" When invalid QR codes found: - Title: "Found X QR code(s)" - Subtitle: "No valid Shadowsocks URLs" - Body: "QR codes found are not Shadowsocks configuration" When valid QR codes found: - Title: "Found X Shadowsocks URL(s)" - Subtitle: "Scanned X displays, found X QR codes" - Body: "Successfully added X server configuration(s)"
| false
| 140
| 55
| 195
|
--- ShadowsocksX-NG/AppDelegate.swift
@@ -627,43 +627,42 @@ class AppDelegate: NSObject, NSApplicationDelegate, NSUserNotificationCenterDele
}
func handleFoundSSURL(_ note: Notification) {
- let sendNotify = { (title: String, subtitle: String, infoText: String) in
+ let sendNotify = {
+ (title: String, subtitle: String, infoText: String) in
+
let userNote = NSUserNotification()
userNote.title = title
userNote.subtitle = subtitle
userNote.informativeText = infoText
userNote.soundName = NSUserNotificationDefaultSoundName
- NSUserNotificationCenter.default.deliver(userNote)
+ NSUserNotificationCenter.default
+ .deliver(userNote);
}
if let userInfo = (note as NSNotification).userInfo {
- // 检查错误
- if let error = userInfo["error"] as? String {
- sendNotify("Scan Failed", "", error.localized)
- return
- }
-
- // 使用新的通知信息
- let title = (userInfo["title"] as? String) ?? ""
- let subtitle = (userInfo["subtitle"] as? String) ?? ""
- let body = (userInfo["body"] as? String) ?? ""
-
let urls: [URL] = userInfo["urls"] as! [URL]
- let addCount = ServerProfileManager.instance.addServerProfileByURL(urls: urls)
+
+ let mgr = ServerProfileManager.instance
+ let addCount = mgr.addServerProfileByURL(urls: urls)
if addCount > 0 {
- sendNotify(
- title.localized,
- subtitle.localized,
- "Successfully added \(addCount) server configuration(s)".localized
- )
+ var subtitle: String = ""
+ if userInfo["source"] as! String == "qrcode" {
+ subtitle = "By scan QR Code".localized
+ } else if userInfo["source"] as! String == "url" {
+ subtitle = "By handle SS URL".localized
+ } else if userInfo["source"] as! String == "pasteboard" {
+ subtitle = "By import from pasteboard".localized
+ }
+
+ sendNotify("Add \(addCount) Shadowsocks Server Profile".localized, subtitle, "")
} else {
- sendNotify(
- title.localized,
- subtitle.localized,
- body.localized
- )
+ if userInfo["source"] as! String == "qrcode" {
+ sendNotify("", "", "Not found valid QRCode of shadowsocks profile".localized)
+ } else if userInfo["source"] as! String == "url" {
+ sendNotify("", "", "Not found valid URL of shadowsocks profile".localized)
+ }
}
}
}
--- ShadowsocksX-NG/Info.plist
@@ -49,7 +49,5 @@
<string>MainMenu</string>
<key>NSPrincipalClass</key>
<string>SWBApplication</string>
- <key>NSScreenCaptureUsageDescription</key>
- <string>ShadowsocksX-NG needs Screen Recording permission to scan QR codes on your screen</string>
</dict>
</plist>
--- ShadowsocksX-NG/Utils.m
@@ -10,153 +10,71 @@
#import <CoreImage/CoreImage.h>
#import <AppKit/AppKit.h>
-void ScanQRCodeOnScreen(void) {
- /* check system version and permission status */
- if (@available(macOS 10.12, *)) {
- BOOL hasPermission = CGPreflightScreenCaptureAccess();
- NSLog(@"Screen Recording Permission Status: %@", hasPermission ? @"Granted" : @"Not Granted");
-
- if (!hasPermission) {
- NSLog(@"Requesting Screen Recording Permission...");
- CGRequestScreenCaptureAccess();
-
- /* check permission status after request */
- hasPermission = CGPreflightScreenCaptureAccess();
- NSLog(@"Screen Recording Permission Status After Request: %@", hasPermission ? @"Granted" : @"Not Granted");
-
- if (!hasPermission) {
- NSLog(@"Screen Recording Permission Denied");
-
- /* send notification about permission missing */
- [[NSNotificationCenter defaultCenter]
- postNotificationName:@"NOTIFY_FOUND_SS_URL"
- object:nil
- userInfo:@{
- @"urls": @[],
- @"source": @"qrcode",
- @"error": @"Screen Recording permission required. Please grant permission in System Preferences and restart ShadowsocksX-NG"
- }];
-
- /* open system privacy settings */
- [[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"x-apple.systempreferences:com.apple.preference.security?Privacy_ScreenCapture"]];
- return;
- }
- }
-
- NSLog(@"Proceeding with screen capture...");
- }
-
+void ScanQRCodeOnScreen(void) {
/* displays[] Quartz display ID's */
CGDirectDisplayID *displays = nil;
- CGDisplayCount dspCount = 0;
- /* variables for collecting scan information */
- NSMutableDictionary *scanInfo = [NSMutableDictionary dictionary];
- NSMutableArray *foundSSUrls = [NSMutableArray array];
- NSMutableArray *foundQRCodes = [NSMutableArray array];
+ CGError err = CGDisplayNoErr;
+ CGDisplayCount dspCount = 0;
/* How many active displays do we have? */
- CGError err = CGGetActiveDisplayList(0, NULL, &dspCount);
-
- if(err != CGDisplayNoErr) {
- [[NSNotificationCenter defaultCenter]
- postNotificationName:@"NOTIFY_FOUND_SS_URL"
- object:nil
- userInfo:@{
- @"urls": @[],
- @"source": @"qrcode",
- @"error": @"Failed to get display list"
- }];
+ err = CGGetActiveDisplayList(0, NULL, &dspCount);
+
+ /* If we are getting an error here then their won't be much to display. */
+ if(err != CGDisplayNoErr)
+ {
+ NSLog(@"Could not get active display count (%d)\n", err);
return;
}
- scanInfo[@"displayCount"] = @(dspCount);
- NSLog(@"Found %d displays", dspCount);
-
/* Allocate enough memory to hold all the display IDs we have. */
displays = calloc((size_t)dspCount, sizeof(CGDirectDisplayID));
// Get the list of active displays
- err = CGGetActiveDisplayList(dspCount, displays, &dspCount);
-
- if(err != CGDisplayNoErr) {
- free(displays);
- [[NSNotificationCenter defaultCenter]
- postNotificationName:@"NOTIFY_FOUND_SS_URL"
- object:nil
- userInfo:@{
- @"urls": @[],
- @"source": @"qrcode",
- @"error": @"Failed to get display information"
- }];
+ err = CGGetActiveDisplayList(dspCount,
+ displays,
+ &dspCount);
+
+ /* More error-checking here. */
+ if(err != CGDisplayNoErr)
+ {
+ NSLog(@"Could not get active display list (%d)\n", err);
return;
}
- CIDetector *detector = [CIDetector detectorOfType:@"CIDetectorTypeQRCode"
- context:nil
- options:@{ CIDetectorAccuracy:CIDetectorAccuracyHigh }];
+ NSMutableArray* foundSSUrls = [NSMutableArray array];
- int totalQRCodesFound = 0;
- int validSSUrlsFound = 0;
+ CIDetector *detector = [CIDetector detectorOfType:@"CIDetectorTypeQRCode"
+ context:nil
+ options:@{ CIDetectorAccuracy:CIDetectorAccuracyHigh }];
- for (unsigned int displaysIndex = 0; displaysIndex < dspCount; displaysIndex++) {
+ for (unsigned int displaysIndex = 0; displaysIndex < dspCount; displaysIndex++)
+ {
+ /* Make a snapshot image of the current display. */
CGImageRef image = CGDisplayCreateImage(displays[displaysIndex]);
NSArray *features = [detector featuresInImage:[CIImage imageWithCGImage:image]];
-
- /* count total QR codes found */
- totalQRCodesFound += (int)features.count;
-
for (CIQRCodeFeature *feature in features) {
- NSLog(@"Found QR Code: %@", feature.messageString);
- [foundQRCodes addObject:feature.messageString];
-
- if ([feature.messageString hasPrefix:@"ss://"]) {
+ NSLog(@"%@", feature.messageString);
+ if ( [feature.messageString hasPrefix:@"ss://"] )
+ {
NSURL *url = [NSURL URLWithString:feature.messageString];
if (url) {
[foundSSUrls addObject:url];
- validSSUrlsFound++;
}
}
}
- CGImageRelease(image);
+ CGImageRelease(image);
}
free(displays);
- /* prepare notification information */
- NSString *notificationTitle;
- NSString *notificationSubtitle;
- NSString *notificationBody;
-
- if (totalQRCodesFound == 0) {
- notificationTitle = [NSString stringWithFormat:@"Scanned %d displays", dspCount];
- notificationSubtitle = @"No QR codes found";
- notificationBody = @"Try adjusting the QR code position on your screen";
- } else if (validSSUrlsFound == 0) {
- notificationTitle = [NSString stringWithFormat:@"Found %d QR code(s)", totalQRCodesFound];
- notificationSubtitle = @"No valid Shadowsocks URLs";
- notificationBody = @"QR codes found are not Shadowsocks configuration";
- } else {
- notificationTitle = [NSString stringWithFormat:@"Found %d Shadowsocks URL(s)", validSSUrlsFound];
- notificationSubtitle = [NSString stringWithFormat:@"Scanned %d displays, found %d QR codes", dspCount, totalQRCodesFound];
- notificationBody = @"Processing Shadowsocks configuration...";
- }
-
[[NSNotificationCenter defaultCenter]
postNotificationName:@"NOTIFY_FOUND_SS_URL"
object:nil
- userInfo:@{
- @"urls": foundSSUrls,
- @"source": @"qrcode",
- @"title": notificationTitle,
- @"subtitle": notificationSubtitle,
- @"body": notificationBody,
- @"scanInfo": @{
- @"displayCount": @(dspCount),
- @"totalQRCodes": @(totalQRCodesFound),
- @"validURLs": @(validSSUrlsFound)
- }
- }];
+ userInfo: @{ @"urls": foundSSUrls,
+ @"source": @"qrcode"
+ }
+ ];
}
NSImage* createQRImage(NSString *string, NSSize size) {
|
shadowsocksx-ng
|
shadowsocks
|
Swift
|
Swift
| 32,651
| 7,935
|
Next Generation of ShadowsocksX
|
shadowsocks_shadowsocksx-ng
|
NEW_FEAT
|
this commit fixes/polishes an earlier feature (auto-start from PR #3609)
|
98bedbe313879d3f3218b424f76c78d361db1ef3
|
2025-03-04 08:12:44
|
engine-flutter-autoroll
|
Roll Skia from 52d06100a044 to 6912d66c0c7a (1 revision) (#164526) https://skia.googlesource.com/skia.git/+log/52d06100a044..6912d66c0c7a 2025-03-04 [email protected] [dawn][headers] Removes use of ifdef for Descriptor change. If this roll has caused a breakage, revert this CL and stop the roller using the controls here: https://autoroll.skia.org/r/skia-flutter-autoroll Please CC [email protected],[email protected],[email protected] on the revert to ensure that a human is aware of the problem. To file a bug in Skia: https://bugs.chromium.org/p/skia/issues/entry To file a bug in Flutter: https://github.com/flutter/flutter/issues/new/choose To report a problem with the AutoRoller itself, please file a bug: https://issues.skia.org/issues/new?component=1389291&template=1850622 Documentation for the AutoRoller is here: https://skia.googlesource.com/buildbot/+doc/main/autoroll/README.md
| false
| 2
| 2
| 4
|
--- DEPS
@@ -14,7 +14,7 @@ vars = {
'flutter_git': 'https://flutter.googlesource.com',
'skia_git': 'https://skia.googlesource.com',
'llvm_git': 'https://llvm.googlesource.com',
- 'skia_revision': '6912d66c0c7a475486fdda75f5f0a738cb6509ad',
+ 'skia_revision': '52d06100a044877d985442dadf5d2f810f7cc74c',
# WARNING: DO NOT EDIT canvaskit_cipd_instance MANUALLY
# See `lib/web_ui/README.md` for how to roll CanvasKit to a new version.
--- engine/src/flutter/ci/licenses_golden/licenses_skia
@@ -1,4 +1,4 @@
-Signature: bca5a4bcc3e034474a3f88a935f5f370
+Signature: abb1614e0262e68aa744d6c4fa9ece53
====================================================================================================
LIBRARY: etc1
|
flutter
|
flutter
|
Dart
|
Dart
| 168,965
| 28,132
|
Flutter makes it easy and fast to build beautiful apps for mobile and beyond
|
flutter_flutter
|
CONFIG_CHANGE
|
dependency version values updated
|
e4c186eacb92a01fb8e32ccf02bce8cd52855f1d
|
2025-01-04 00:20:43
|
Kieran
|
Added IDs as an option for output templates (#543)
| false
| 11
| 5
| 16
|
--- lib/pinchflat/downloading/download_option_builder.ex
@@ -201,9 +201,6 @@ defmodule Pinchflat.Downloading.DownloadOptionBuilder do
source = media_item_with_preloads.source
%{
- "media_item_id" => to_string(media_item_with_preloads.id),
- "source_id" => to_string(source.id),
- "media_profile_id" => to_string(source.media_profile_id),
"source_custom_name" => source.custom_name,
"source_collection_id" => source.collection_id,
"source_collection_name" => source.collection_name,
--- lib/pinchflat_web/controllers/media_profiles/media_profile_html.ex
@@ -57,7 +57,7 @@ defmodule PinchflatWeb.MediaProfiles.MediaProfileHTML do
end
def media_center_custom_output_template_options do
- [
+ %{
season_by_year__episode_by_date: "<code>Season YYYY/sYYYYeMMDD</code>",
season_by_year__episode_by_date_and_index:
"same as the above but it handles dates better. <strong>This is the recommended option</strong>",
@@ -65,11 +65,11 @@ defmodule PinchflatWeb.MediaProfiles.MediaProfileHTML do
"<code>Season 1/s01eXX</code> where <code>XX</code> is the video's position in the playlist. Only recommended for playlists (not channels) that don't change",
static_season__episode_by_date:
"<code>Season 1/s01eYYMMDD</code>. Recommended for playlists that might change or where order isn't important"
- ]
+ }
end
def other_custom_output_template_options do
- [
+ %{
upload_day: nil,
upload_month: nil,
upload_year: nil,
@@ -85,11 +85,8 @@ defmodule PinchflatWeb.MediaProfiles.MediaProfileHTML do
season_episode_index_from_date:
"the upload date formatted as <code>sYYYYeMMDDII</code> where <code>II</code> is an index to prevent date collisions",
media_playlist_index:
- "the place of the media item in the playlist. Do not use with channels. May not work if the playlist is updated",
- media_item_id: "the ID of the media item in Pinchflat's database",
- source_id: "the ID of the source in Pinchflat's database",
- media_profile_id: "the ID of the media profile in Pinchflat's database"
- ]
+ "the place of the media item in the playlist. Do not use with channels. May not work if the playlist is updated"
+ }
end
def common_output_template_options do
|
pinchflat
|
kieraneglin
|
Elixir
|
Elixir
| 2,779
| 59
|
Your next YouTube media manager
|
kieraneglin_pinchflat
|
NEW_FEAT
|
Obvious
|
e59c21f4be1b225f5b8f5a236da16233d1a0c0c1
|
2024-07-03 21:42:33
|
jbengler
|
articles
| false
| 345
| 209
| 554
|
--- R/add-misc.R
@@ -10,7 +10,7 @@
#' @inheritParams ggplot2::geom_boxplot
#'
#' @export
-add_boxplot <- function(plot, dodge_width = NULL, saturation = 0.3, show_whiskers = TRUE, show_outliers = TRUE,
+add_boxplot <- function(plot, dodge_width = NULL, saturation = 0.3, show_whiskers = TRUE, show_outliers = FALSE,
box_width = 0.6, whiskers_width = 0.5, outlier.size = 0.5, coef = 1.5,
outlier.shape = 19, linewidth = 0.25, preserve = "total", ...) {
check_tidyplot(plot)
@@ -21,10 +21,11 @@ add_boxplot <- function(plot, dodge_width = NULL, saturation = 0.3, show_whisker
coef = 0
whiskers_width = box_width
}
+ if (show_outliers == FALSE) outliers = FALSE
plot +
ggplot2::stat_boxplot(geom ='errorbar', width = whiskers_width, position = position,
linewidth = linewidth, coef = coef) +
- ggplot2::geom_boxplot(outliers = show_outliers, outlier.shape = outlier.shape, outlier.size = outlier.size,
+ ggplot2::geom_boxplot(outliers = outliers, outlier.shape = outlier.shape, outlier.size = outlier.size,
width = box_width, position = position, linewidth = linewidth, coef = coef, ...)
}
--- R/add-proportional.R
@@ -11,13 +11,11 @@ ff_pie <- function(.type = "pie") {
if (is_missing(plot, "y")) {
plot <- plot + ggplot2::geom_bar(ggplot2::aes(x = NA), position = ggplot2::position_fill(reverse = reverse),
- width = width, color = NA, ...) +
- ggplot2::ggtitle("count")
+ width = width, color = NA, ...)
} else {
plot <- plot + ggplot2::stat_summary(ggplot2::aes(x = NA), geom = "bar", fun = sum,
position = ggplot2::position_fill(reverse = reverse),
- width = width, color = NA, ...) +
- ggplot2::ggtitle(get_variable(plot, "y"))
+ width = width, color = NA, ...)
}
suppressMessages(
plot <- plot +
--- vignettes/articles/Advanced-plotting.Rmd
@@ -17,7 +17,7 @@ knitr::opts_chunk$set(
```
::: {.lead}
-In this article, we will explore advanced plotting techniques offered in tidyplots. We will cover the rasterizing of individual plot components, data subsetting for highlighting selected data points, and the construction of powerful plotting pipelines. Moreover, we will discuss the visualization of paired and missing data. We will conclude by introducing the concepts of plot orientation, dodging and plot area padding.
+In this article, we will explore advanced plotting techniques offered in tidyplots. We will cover the rasterizing of individual plot components, data subsetting for highlighting selected data points, and the construction of powerful plotting pipelines. Moreover, we will discuss the visualization of paired and missing data. We will conclude by introducing the concepts of plot orientation and plot area padding.
:::
# Raster versus vector
@@ -275,5 +275,3 @@ time_course %>%
# Padding
-# Dodging
-
--- vignettes/articles/Color-schemes.Rmd
@@ -28,32 +28,6 @@ We will conclude by.
library(tidyplots)
```
-```{r}
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute()
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_seaside)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_candy)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_pastel)
-
-energy %>%
- tidyplot(year, power, color = energy_source) %>%
- add_barstack_absolute() %>%
- adjust_colors(colors_discrete_circle)
-```
-
```{r}
colors_continuous_viridis
--- vignettes/articles/Visualizing-data.Rmd
@@ -168,7 +168,7 @@ gene_expression %>%
dplyr::glimpse()
```
-We will start by plotting the `expression` values of each `external_gene_name` across the `sample` variable.
+I used the `dplyr::glimpse()` function from the dplyr package get a nice overview of all variables including in the dataset. We will start by plotting the `expression` values of each `external_gene_name` across the `sample` variable.
```{r}
gene_expression %>%
@@ -196,7 +196,7 @@ gene_expression %>%
Now it much easier to appreciate the dynamics of individual genes across the samples on the x axis.
-However, the rows appear to be mixed. Some having rather high expression in the "Eip" samples while others have high value in the "Hip" samples. Conveniently, in the dataset there is a variable called `direction`, which classifies genes as being either "up" or "down" regulated. Let's use this variable to sort our y axis.
+However, the rows appear to be mixed. Some having rather high expression in the "Eip" samples while others have high value in the "Hip" samples. Conveniently, in the dataset there is a variable called `direction`, which is either "up" or "down". Let's use this variable to sort our y axis.
```{r, fig.asp=0.9}
gene_expression %>%
@@ -208,406 +208,310 @@ gene_expression %>%
# Central tendency
-In cases with multiple data points per experimental group, the `mean` and the `median` are a great way to compute a typical center value for the group, also known as central tendency measure. In tidyplots, these function start with `add_mean_` or `add_median_`.
-```{r}
-study %>%
- tidyplot(x = treatment, y = score) %>%
- add_data_points_beeswarm() %>%
- add_mean_dash()
-```
-
-As for the `count` and `sum` function, the second part of the function name is dedicated to the graphical representation. These include the representation as `bar`, `dash`, `dot`, `value`, `line` or `area`. Of course, these different representations can also be combined. Like in this case `line` and `dot`.
```{r}
-time_course %>%
- tidyplot(x = day, y = score, color = treatment, dodge_width = 0) %>%
- add_mean_line() %>%
- add_mean_dot()
-```
-
-Or in this case `line` and `area`.
-
-```{r}
-time_course %>%
- tidyplot(x = day, y = score, color = treatment, dodge_width = 0) %>%
- add_mean_line(linewidth = 1) %>%
- add_mean_area(alpha = 0.2)
-```
-
-In case you ask yourself why I included `dodge_width = 0` in all `tidyplot()` calls used to show a `line` or `area`, you might want to learn more about dodging in the article about [Advanced plotting](https://jbengler.github.io/tidyplots/articles/Advanced-plotting.html).
+energy %>%
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute()
-Back to representations of central tendency, here is one more example using `bar` and `value`.
+energy %>%
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute() %>%
+ adjust_colors(colors_discrete_seaside)
-```{r}
-study %>%
- tidyplot(x = treatment, y = score) %>%
- add_mean_bar() %>%
- add_mean_value()
-```
+energy %>%
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute() %>%
+ adjust_colors(colors_discrete_candy)
-You could also plot the `mean` and the `median` together to explore in which cases they diverge. In the example below the `mean` is shown in orange and the `median` in purple.
+energy %>%
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute() %>%
+ adjust_colors(colors_discrete_pastel)
-```{r}
-study %>%
- tidyplot(x = treatment, y = score) %>%
- add_data_points_beeswarm() %>%
- add_mean_dash(color = "orange") %>%
- add_median_dash(color = "purple")
+energy %>%
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute() %>%
+ adjust_colors(colors_discrete_circle)
```
# Dispersion & uncertainty
-To complement the central tendency measure, it is often helpful to provide information about the variability or dispersion of the data points. Such measures include the standard `error` of the mean, the standard deviation `sd`, the `range` from the highest to the lowest data point or the 95% confidence interval `ci95`.
-
-A classical representation of dispersion is an error `bar`.
-
-```{r}
-time_course %>%
- tidyplot(x = day, y = score, color = treatment) %>%
- add_mean_line() %>%
- add_mean_dot() %>%
- add_error_bar(width = 2)
-```
-
-Or the use of a semitransparent `ribbon`.
-
```{r}
-time_course %>%
- tidyplot(x = day, y = score, color = treatment) %>%
- add_mean_line() %>%
- add_mean_dot() %>%
- add_error_ribbon()
-```
-Another widely used alternative, especially for not normally distributed data is the use of `violin` or `boxplot`s. Starting with the `violin`, the shape of these plots resembles the underlying distribution of the data points.
-
-```{r}
-study %>%
- tidyplot(x = treatment, y = score, color = treatment) %>%
- add_violin()
```
-These can be further augmented by adding, for example, the 0.5 quantile and the underlying data points.
+# Distribution
```{r}
-study %>%
- tidyplot(x = treatment, y = score, color = treatment) %>%
- add_violin(draw_quantiles = 0.5) %>%
+distributions %>%
+ tidyplot(name, value) %>%
+ add_mean_bar(alpha = 0.3) %>%
+ add_error_bar() %>%
add_data_points_beeswarm()
-```
-
-The `boxplot` is the more classical approach, in which the quantiles are visualized by a box and whiskers.
-```{r}
-study %>%
- tidyplot(x = treatment, y = score, color = treatment) %>%
- add_boxplot()
-```
-
-Finally, while not strictly being a measure of central tendency you can also fit a curve though your data in order derive an abstracted representation of your data.
-
-```{r}
-time_course %>%
- tidyplot(x = day, y = score, color = treatment, dodge_width = 0) %>%
- add_curve_fit()
-```
-
-# Distribution
-
-When looking at a single distribution of values, a classical approach for visualization is a `histogram`.
-
-```{r}
energy %>%
tidyplot(x = power) %>%
add_histogram()
-```
-To represent the density of values along the x axis another approach is a `density_histogram` or `density_curve`, which also can be combined in one plot.
+energy %>%
+ tidyplot(x = power) %>%
+ add_density_curve()
-```{r}
energy %>%
tidyplot(x = power) %>%
add_density_histogram() %>%
add_density_curve()
-```
-If you want to compare multiple distributions, again `violin` or `boxplot` are two potential solutions.
+# multiple distributions
-```{r}
-distributions %>%
- tidyplot(x = name, y = value) %>%
+energy %>%
+ tidyplot(x = energy_type, y = power, color = energy_type) %>%
+ add_violin()
+
+energy %>%
+ tidyplot(x = energy_type, y = power, color = energy_type) %>%
+ add_data_points_beeswarm(jitter_width = 0.8, alpha = 0.3)
+
+energy %>%
+ tidyplot(x = energy_type, y = power, color = energy_type) %>%
add_violin() %>%
- add_data_points_beeswarm()
+ add_data_points_beeswarm(jitter_width = 0.8, alpha = 0.3)
```
# Proportion
-Proportional data provides insights into the proportion or percentage that each individual category contributes to the total. To explore the visualization of proportional data in tidyplots, let's introduce the `energy` dataset.
-
```{r}
energy %>%
- dplyr::glimpse()
-```
+ tidyplot(year, power, color = energy_source) %>%
+ add_barstack_absolute()
-As you might appreciate, this dataset contains the `power` in gigawatt produced from different `energy_source`s in Germany between the `year`s 2002 and 2023. Although this might arguably not be the best plot, let's start with a `pie` plot.
+energy %>%
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(y = power, color = energy_source) %>%
+ add_pie() %>%
+ split_plot(by = year)
-```{r}
energy %>%
- tidyplot(color = energy_type) %>%
- add_pie()
-```
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(y = power, color = energy_source) %>%
+ add_donut() %>%
+ split_plot(by = year)
-The above plot represents the count of values across the different `energy_type` categories.
+# pie and donut
-However, we might be more interested, in the contribution of each `energy_type` to the total `power` production. Therefore, we have to provide the variable `power` as a `y` parameter to the `tidyplots()` function.
+animals %>%
+ tidyplot(y = weight, color = family) %>%
+ add_pie()
-```{r}
-energy %>%
- tidyplot(y = power, color = energy_type) %>%
+animals %>%
+ tidyplot(y = weight, color = family) %>%
add_donut()
-```
-Now we can appreciate the contribution of each energy type. Note that I also changed the `pie` for a `donut` plot, which is basically a pie chart with a white hole in the middle.
+animals %>%
+ tidyplot(color = family) %>%
+ add_pie()
-The main criticism of pie and donut plots stems from difficulty of the human brain to faithfully extract the proportions from the plot. For example, has Fossil or Renewable power a bigger share in the plot?
+animals %>%
+ tidyplot(color = family) %>%
+ add_donut()
-Another graphical representation is a `barstack` plot.
+# add_barstack_absolute
-```{r}
-energy %>%
- tidyplot(y = power, color = energy_type) %>%
+animals %>%
+ tidyplot(color = family) %>%
add_barstack_absolute()
-```
-Here it might be slightly easier to compare the energy types. However to really pinpoint this, we probably need to go back to a classical bar plot.
+animals %>%
+ tidyplot(x = diet, color = family) %>%
+ add_barstack_absolute()
-```{r}
-energy %>%
- tidyplot(x = energy_type, y = power) %>%
- add_sum_bar() %>%
- add_sum_value() %>%
- reorder_x_axis_labels("Renewable")
-```
+## with y supplied
-Now we can appreciate that Renewable power produced the most energy in the time between 2002 and 2023.
+animals %>%
+ tidyplot(y = diet, color = family) %>%
+ add_barstack_absolute()
-However, if we want to visualize proportional data across time or another variable, `barstack` plots can still be informative.
+animals %>%
+ tidyplot(x = diet, color = family) %>%
+ add_barstack_absolute()
-```{r}
-energy %>%
- tidyplot(x = year, y = power, color = energy_type) %>%
+animals %>%
+ tidyplot(x = diet, y = speed, color = family) %>%
add_barstack_absolute()
-```
-Moreover, to see the relative instead of the absolute contribution, we can use the `add_barstack_relative()` function.
+# add_barstack_absolute
-```{r}
-energy %>%
- tidyplot(x = year, y = power, color = energy_type) %>%
+animals %>%
+ tidyplot(x = diet, color = family) %>%
add_barstack_relative()
-```
-A similar plot can be achieved using an `areastack`.
+animals %>%
+ tidyplot(x = diet, color = family) %>%
+ add_barstack_absolute(alpha = 0.3) %>%
+ add_areastack_absolute()
-```{r}
-energy %>%
- tidyplot(x = year, y = power, color = energy_type) %>%
- add_areastack_relative()
-```
+## with y supplied
-In both plots, the increasing contribution of renewable energy to the total energy production becomes apparent.
+animals %>%
+ tidyplot(x = diet, y = speed, color = family) %>%
+ add_barstack_absolute(alpha = 0.3) %>%
+ add_areastack_absolute()
-This can also be shown using donut plots. However, we need to downsample the dataset to 4 representative years.
+animals %>%
+ tidyplot(y = diet, color = family) %>%
+ add_barstack_relative()
+
+animals %>%
+ tidyplot(x = diet, y = speed, color = family) %>%
+ add_barstack_relative()
+```
+
+# Statistical comparison
```{r}
-energy %>%
- # downsample to 4 representative years
- dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
- # start plotting
- tidyplot(y = power, color = energy_type) %>%
- add_donut() %>%
- adjust_colors(new_colors = c("Fossil" = "grey",
- "Nuclear" = "#F6C54D",
- "Renewable" = "#4FAE62",
- "Other" = "#C02D45")) %>%
- split_plot(by = year)
+
```
-Now let's have a look at a related dataset showing one week of the energy data in higher resolution.
+# Annotation
```{r}
-energy_week %>%
- tidyplot(date, power, color = energy_source) %>%
- add_areastack_absolute()
+
```
-In this plot, one can appreciate the higher contribution of solar power during day time in comparison to night time.
-Also this plot can be shown as a relative areastack.
```{r}
+# curves
+
energy_week %>%
tidyplot(date, power, color = energy_source) %>%
- add_areastack_relative()
-```
+ add_line() %>%
+ remove_plot_area_padding()
-This nicely illustrates how wind energy closes the solar gap during the night, however when wind is weak, like on September 10, fossil energy sources need to kick in to fill the gap.
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_area()
-# Statistical comparison
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_areastack_absolute()
-To test for differences between experimental groups, tidyplots offers the funtions `add_stats_asterisks()` and `add_stats_pvalue()`. While the first one includes asterisks for symbolizing significance.
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_areastack_relative()
-```{r}
-study %>%
- tidyplot(x = dose, y = score, color = group) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_asterisks()
-```
+###
-`add_stats_pvalue()` provides the computed _p_ value.
+energy %>%
+ tidyplot(year, power, color = energy_type) %>%
+ add_barstack_absolute()
-```{r}
-study %>%
- tidyplot(x = dose, y = score, color = group) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue()
-```
+energy %>%
+ tidyplot(year, power, color = energy_type) %>%
+ add_barstack_relative()
-As you might have noted, when using these functions, a caption is automatically included that provides details about the statistical testing performed. The default is a Student's _t_ test without multiple comparison adjustment.
+new_colors <- c("Fossil" = "grey",
+ "Nuclear" = "#F6C54D",
+ "Renewable" = "#4FAE62",
+ "Other" = "#C02D45")
-Both can be changed by providing the `method` and `p.adjust.method` parameters.
+energy %>%
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(y = power, color = energy_type) %>%
+ add_pie() %>%
+ adjust_colors(new_colors = new_colors) %>%
+ split_plot(by = year)
-For example, lets perform a Wilcoxon signed-rank test with Benjamini--Hochberg adjustment.
+energy %>%
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(y = power, color = energy_type) %>%
+ add_donut() %>%
+ adjust_colors(new_colors = new_colors) %>%
+ split_plot(by = year)
-```{r}
-study %>%
- tidyplot(x = dose, y = score, color = group) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue(method = "wilcoxon", p.adjust.method = "BH")
-```
+energy %>%
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(x = year, y = power, color = energy_type, dodge_width = 0) %>%
+ add_sum_line() %>%
+ add_sum_dot() %>%
+ adjust_colors(new_colors = new_colors)
-In non paired data, it might also make sense to compare all experimental conditions to a control condition. For example, let's say treatment "A" is our control.
+energy %>%
+ tidyplot(x = year, y = power, color = energy_type) %>%
+ add_areastack_absolute() %>%
+ adjust_colors(new_colors = new_colors)
-```{r}
-study %>%
- tidyplot(x = treatment, y = score, color = treatment) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue(ref.group = "A")
-```
+energy %>%
+ tidyplot(x = year, y = power, color = energy_source) %>%
+ add_areastack_absolute()
-In some scenarios you have a mixture of significant and non-significant _p_ values.
+energy %>%
+ dplyr::filter(year %in% c(2005, 2010, 2015, 2020)) %>%
+ tidyplot(x = year, y = power, color = energy_type) %>%
+ add_areastack_relative() %>%
+ adjust_colors(new_colors = new_colors)
-```{r}
-gene_expression %>%
- # filter to one gene
- dplyr::filter(external_gene_name == "Apol6") %>%
- # start plotting
- tidyplot(x = condition, y = expression, color = sample_type) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue()
-```
+energy_week %>%
+ tidyplot(date, power, color = energy_type) %>%
+ add_sum_line() %>%
+ adjust_colors(new_colors = new_colors) %>%
+ remove_plot_area_padding()
-Here you can choose to hide the non-significant _p_ value using `hide.ns = TRUE`.
+energy_week %>%
+ dplyr::summarise(power = sum(power), .by = c(date, energy_type)) %>%
+ tidyplot(date, power, color = energy_type) %>%
+ add_area() %>%
+ adjust_colors(new_colors = new_colors)
-```{r}
-gene_expression %>%
- # filter to one gene
- dplyr::filter(external_gene_name == "Apol6") %>%
- # start plotting
- tidyplot(x = condition, y = expression, color = sample_type) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue(hide.ns = TRUE)
-```
+energy_week %>%
+ tidyplot(date, power, color = energy_type) %>%
+ add_areastack_absolute() %>%
+ adjust_colors(new_colors = new_colors)
-Finally, if you want to hide the caption with statistical information you can do this by providing `include_info = FALSE`.
+energy_week %>%
+ tidyplot(date, power, color = energy_type) %>%
+ add_areastack_relative() %>%
+ adjust_colors(new_colors = new_colors)
-```{r}
-gene_expression %>%
- # filter to one gene
- dplyr::filter(external_gene_name == "Apol6") %>%
- # start plotting
- tidyplot(x = condition, y = expression, color = sample_type) %>%
- add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_stats_pvalue(hide.ns = TRUE, include_info = FALSE)
```
-There are many more things you can do with statistical comparisons. Just check out the documentation of `add_stats_pvalue()` and the underlying function `ggpubr::geom_pwc()`.
-
-# Annotation
-
-Sometimes you wish to add annotations to provide the reader with important additional information. For example, tidyplots let's you add a `title` and a `caption`.
-
```{r}
study %>%
tidyplot(x = treatment, y = score, color = treatment) %>%
add_mean_dash() %>%
- add_error_bar() %>%
- add_data_points() %>%
- add_title("Interesting study") %>%
- add_caption("Here is some more detail how the study was performed")
-```
-
-In other cases you might want to highlight specific data points or reference values in the plot. Let's take the `animals` dataset and plot `speed` versus `weight`.
+ add_ci95_bar()
-```{r}
-animals %>%
- tidyplot(x = weight, y = speed) %>%
- add_data_points()
-```
+study %>%
+ tidyplot(x = treatment, y = score) %>%
+ add_ci95_ribbon() %>%
+ add_mean_line() %>%
+ add_data_points_beeswarm()
-Here it might be interesting to have closer at the extreme values. First, let's highlight the heaviest and the fastest animal.
+study %>%
+ tidyplot(x = treatment, y = score) %>%
+ add_ci95_ribbon() %>%
+ add_mean_line() %>%
+ add_data_points_beeswarm() %>%
+ adjust_colors("orange")
-```{r}
-animals %>%
- tidyplot(x = weight, y = speed) %>%
- add_data_points() %>%
- add_data_points(data = max_rows(weight, 1), color = "red", shape = 1, size = 2) %>%
- add_data_points(data = max_rows(speed, 1), color = "red", shape = 1, size = 2)
```
-Now it would interesting to know the names of these animals. We can plot the names of all animals.
-
```{r}
-animals %>%
- tidyplot(x = weight, y = speed) %>%
- add_data_points() %>%
- add_text_labels(animal, max.overlaps = Inf)
-```
-This looks a little bit too busy. So let's restrict the labels to the 3 heaviest and the 3 fastest animals.
-
-```{r}
-animals %>%
- tidyplot(x = weight, y = speed) %>%
- add_data_points() %>%
- add_text_labels(data = max_rows(weight, 3), animal) %>%
- add_text_labels(data = max_rows(speed, 3), animal)
-```
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_line() %>%
+ remove_plot_area_padding()
-There is lot tweaking that can be done with text labels. For more details have a look at the documentation of `add_text_labels()` and the underlying function `ggrepel::geom_text_repel()`.
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_area()
-As one last thing, let's add some reference lines, to highlight specific values on the x and y axis.
+energy_week %>%
+ tidyplot(date, power, color = energy_source) %>%
+ add_data_points()
-```{r}
-animals %>%
- tidyplot(x = weight, y = speed) %>%
- add_reference_lines(x = 4000, y = c(100, 200)) %>%
- add_data_points() %>%
- add_text_labels(data = max_rows(weight, 3), animal) %>%
- add_text_labels(data = max_rows(speed, 3), animal)
```
+
--- vignettes/tidyplots.Rmd
@@ -17,12 +17,12 @@ knitr::opts_chunk$set(
```
::: {.lead}
-This getting started guide aims to empower individuals without a programming background to engage in code-based plotting with tidyplots. We will start by covering essential software tools and discussing data preparation. Next, we will introduce the tidyplots workflow, which includes adding, removing, and adjusting plot components. Finally, we will showcase the application of themes and multiplot layouts.
+This getting started guide aims to empower individuals without a programming background to engage in code-based plotting with tidyplots. We will start by covering essential software tools and discussing data preparation. Next, we will introduce the tidyplots workflow, which includes adding, removing, and adjusting plot components, as well as applying themes and multiplot layouts. Finally, we'll conclude with additional resources for further exploration of code-based plotting.
:::
# Prerequisites
-You never generated code-based scientific plots? Great to have you here! To get you started, we will install a couple of software tools to setup your new working environment.
+You never used R for scientific plots? Great to have you here! To get you started, we will install a couple of software tools to setup your new working environment.
## Install R and RStudio Desktop
@@ -356,12 +356,12 @@ Conveniently, `save_plot()` also gives back the plot it received, allowing it to
This getting started guide is meant to give a high level overview of the tidyplots workflow. To dive deeper into more specific aspects of tidyplots, here a couple of resources.
-## tidyplots reference
+## Reference
- [Function reference](https://jbengler.github.io/tidyplots/reference/index.html)
A great overview of all tidyplots functions
-## tidyplots articles
+## Articles
- [Visualizing data](https://jbengler.github.io/tidyplots/articles/Visualizing-data.html)
An article with examples for common data visualizations
@@ -374,14 +374,3 @@ An article about the use of color schemes in tidyplots
- [Design principles](https://jbengler.github.io/tidyplots/articles/Design-principles.html)
An article about the design choices in tidyplots, including notable differences to ggplot2
-
-## Other ressources
-
-- [Hands-On Programming with R](https://rstudio-education.github.io/hopr/)
-Free online book by Garrett Grolemund
-
-- [R for Data Science](https://r4ds.hadley.nz)
-Free online book by Hadley Wickham
-
-- [Fundamentals of Data Visualization](https://clauswilke.com/dataviz/)
-Free online book by Claus O. Wilke
|
tidyplots
|
jbengler
|
R
|
R
| 495
| 18
|
Tidy Plots for Scientific Papers
|
jbengler_tidyplots
|
CODE_IMPROVEMENT
|
probably refactoring, since the default value of an argument changed
|
2a7ed1ada74e9625d8f73eec03dfe560ba978b32
| null |
Andrey Lushnikov
|
fix(firefox): add ExecutionContext to api (#3963)
| false
| 1
| 0
| 1
|
--- api.js
@@ -5,6 +5,7 @@ module.exports = {
ConsoleMessage: require('./Page').ConsoleMessage,
Dialog: require('./Dialog').Dialog,
ElementHandle: require('./JSHandle').ElementHandle,
+ ExecutionContext: require('./ExecutionContext').ExecutionContext,
Frame: require('./Page').Frame,
JSHandle: require('./JSHandle').JSHandle,
Keyboard: require('./Input').Keyboard,
|
puppeteer_puppeteer.json
| null | null | null | null | null | null |
puppeteer_puppeteer.json
|
BUG_FIX
|
5, obvious
|
3e4818d8423539e3af50a7164070e301bb9031e0
|
2022-07-25 14:47:51
|
macro
|
Update RedisCacheAspect.java
| false
| 1
| 1
| 2
|
--- mall-security/src/main/java/com/macro/mall/security/aspect/RedisCacheAspect.java
@@ -22,7 +22,7 @@ import java.lang.reflect.Method;
@Component
@Order(2)
public class RedisCacheAspect {
- private static final Logger LOGGER = LoggerFactory.getLogger(RedisCacheAspect.class);
+ private static Logger LOGGER = LoggerFactory.getLogger(RedisCacheAspect.class);
@Pointcut("execution(public * com.macro.mall.portal.service.*CacheService.*(..)) || execution(public * com.macro.mall.service.*CacheService.*(..))")
public void cacheAspect() {
|
mall
|
macrozheng
|
Java
|
Java
| 79,319
| 29,052
|
mall项目是一套电商系统,包括前台商城系统及后台管理系统,基于Spring Boot+MyBatis实现,采用Docker容器化部署。 前台商城系统包含首页门户、商品推荐、商品搜索、商品展示、购物车、订单流程、会员中心、客户服务、帮助中心等模块。 后台管理系统包含商品管理、订单管理、会员管理、促销管理、运营管理、内容管理、统计报表、财务管理、权限管理、设置等模块。
|
macrozheng_mall
|
CODE_IMPROVEMENT
|
just the keyword 'final' removed
|
a5628f412785e1484148d06b5d260a0abddda205
|
2023-07-01 14:34:18
|
paigeman
|
Update jvm-parameters-intro.md
| false
| 1
| 1
| 2
|
--- docs/java/jvm/jvm-parameters-intro.md
@@ -210,7 +210,7 @@ JVM 具有四种类型的 GC 实现:
- `-server` : 启用“ Server Hotspot VM”; 此参数默认用于 64 位 JVM
- `-XX:+UseStringDeduplication` : _Java 8u20_ 引入了这个 JVM 参数,通过创建太多相同 String 的实例来减少不必要的内存使用; 这通过将重复 String 值减少为单个全局 `char []` 数组来优化堆内存。
- `-XX:+UseLWPSynchronization`: 设置基于 LWP (轻量级进程)的同步策略,而不是基于线程的同步。
-- `-XX:LargePageSizeInBytes`: 设置用于 Java 堆的较大页面大小; 它采用 GB/MB/KB 的参数; 页面大小越大,我们可以更好地利用虚拟内存硬件资源; 然而,这可能会导致 PermGen 的空间大小更大,这反过来又会迫使 Java 堆空间的大小减小。
+- ``-XX:LargePageSizeInBytes `: 设置用于 Java 堆的较大页面大小; 它采用 GB/MB/KB 的参数; 页面大小越大,我们可以更好地利用虚拟内存硬件资源; 然而,这可能会导致 PermGen 的空间大小更大,这反过来又会迫使 Java 堆空间的大小减小。
- `-XX:MaxHeapFreeRatio` : 设置 GC 后, 堆空闲的最大百分比,以避免收缩。
- `-XX:SurvivorRatio` : eden/survivor 空间的比例, 例如`-XX:SurvivorRatio=6` 设置每个 survivor 和 eden 之间的比例为 1:6。
- `-XX:+UseLargePages` : 如果系统支持,则使用大页面内存; 请注意,如果使用这个 JVM 参数,OpenJDK 7 可能会崩溃。
|
javaguide
|
snailclimb
|
Java
|
Java
| 148,495
| 45,728
|
「Java学习+面试指南」一份涵盖大部分 Java 程序员所需要掌握的核心知识。准备 Java 面试,首选 JavaGuide!
|
snailclimb_javaguide
|
DOC_CHANGE
|
changes in md file
|
9c8058c3260d39e37416154a181441839e0570f6
|
2024-02-22 17:23:02
|
Leonard Hecker
|
AtlasEngine: Improve dotted, dashed and curly underlines (#16719) This changeset makes 3 improvements:
* Dotted lines now use a 2:1 ratio between gaps and dots (from 1:1).
This makes the dots a lot easier to spot at small font sizes.
* Dashed lines use a 1:2 ratio and a cells-size independent stride.
By being cell-size independent it works more consistently with a
wider variety of fonts with weird cell aspect ratios.
* Curly lines are now cell-size independent as well and have a
height that equals the double-underline size.
This ensures that the curve isn't cut off anymore and just like
with dashed lines, that it works under weird aspect ratios.
Closes #16712
## Validation Steps Performed
This was tested using RenderingTests using Cascadia Mono, Consolas,
Courier New, Lucida Console and MS Gothic.
| false
| 25
| 45
| 70
|
--- src/renderer/atlas/BackendD3D.cpp
@@ -310,18 +310,29 @@ void BackendD3D::_updateFontDependents(const RenderingPayload& p)
// baseline of curlyline is at the middle of singly underline. When there's
// limited space to draw a curlyline, we apply a limit on the peak height.
{
- const auto cellHeight = static_cast<f32>(font.cellSize.y);
- const auto strokeWidth = static_cast<f32>(font.thinLineWidth);
-
- // This gives it the same position and height as our double-underline. There's no particular reason for that, apart from
- // it being simple to implement and robust against more peculiar fonts with unusually large/small descenders, etc.
- // We still need to ensure though that it doesn't clip out of the cellHeight at the bottom.
- const auto height = std::max(3.0f, static_cast<f32>(font.doubleUnderline[1].position + font.doubleUnderline[1].height - font.doubleUnderline[0].position));
- const auto top = std::min(static_cast<f32>(font.doubleUnderline[0].position), floorf(cellHeight - height - strokeWidth));
+ // initialize curlyline peak height to a desired value. Clamp it to at
+ // least 1.
+ constexpr auto curlyLinePeakHeightEm = 0.075f;
+ _curlyLinePeakHeight = std::max(1.0f, std::roundf(curlyLinePeakHeightEm * font.fontSize));
+
+ // calc the limit we need to apply
+ const auto strokeHalfWidth = std::floor(font.underline.height / 2.0f);
+ const auto underlineMidY = font.underline.position + strokeHalfWidth;
+ const auto maxDrawableCurlyLinePeakHeight = font.cellSize.y - underlineMidY - font.underline.height;
+
+ // if the limit is <= 0 (no height at all), stick with the desired height.
+ // This is how we force a curlyline even when there's no space, though it
+ // might be clipped at the bottom.
+ if (maxDrawableCurlyLinePeakHeight > 0.0f)
+ {
+ _curlyLinePeakHeight = std::min(_curlyLinePeakHeight, maxDrawableCurlyLinePeakHeight);
+ }
- _curlyLineHalfHeight = height * 0.5f;
- _curlyUnderline.position = gsl::narrow_cast<u16>(lrintf(top));
- _curlyUnderline.height = gsl::narrow_cast<u16>(lrintf(height));
+ const auto curlyUnderlinePos = underlineMidY - _curlyLinePeakHeight - font.underline.height;
+ const auto curlyUnderlineWidth = 2.0f * (_curlyLinePeakHeight + font.underline.height);
+ const auto curlyUnderlinePosU16 = gsl::narrow_cast<u16>(lrintf(curlyUnderlinePos));
+ const auto curlyUnderlineWidthU16 = gsl::narrow_cast<u16>(lrintf(curlyUnderlineWidth));
+ _curlyUnderline = { curlyUnderlinePosU16, curlyUnderlineWidthU16 };
}
DWrite_GetRenderParams(p.dwriteFactory.get(), &_gamma, &_cleartypeEnhancedContrast, &_grayscaleEnhancedContrast, _textRenderingParams.put());
@@ -562,8 +573,9 @@ void BackendD3D::_recreateConstBuffer(const RenderingPayload& p) const
DWrite_GetGammaRatios(_gamma, data.gammaRatios);
data.enhancedContrast = p.s->font->antialiasingMode == AntialiasingMode::ClearType ? _cleartypeEnhancedContrast : _grayscaleEnhancedContrast;
data.underlineWidth = p.s->font->underline.height;
- data.thinLineWidth = p.s->font->thinLineWidth;
- data.curlyLineHalfHeight = _curlyLineHalfHeight;
+ data.curlyLineWaveFreq = 2.0f * 3.14f / p.s->font->cellSize.x;
+ data.curlyLinePeakHeight = _curlyLinePeakHeight;
+ data.curlyLineCellOffset = p.s->font->underline.position + p.s->font->underline.height / 2.0f;
p.deviceContext->UpdateSubresource(_psConstantBuffer.get(), 0, nullptr, &data, 0, 0);
}
}
--- src/renderer/atlas/BackendD3D.h
@@ -42,8 +42,9 @@ namespace Microsoft::Console::Render::Atlas
alignas(sizeof(f32x4)) f32 gammaRatios[4]{};
alignas(sizeof(f32)) f32 enhancedContrast = 0;
alignas(sizeof(f32)) f32 underlineWidth = 0;
- alignas(sizeof(f32)) f32 thinLineWidth = 0;
- alignas(sizeof(f32)) f32 curlyLineHalfHeight = 0;
+ alignas(sizeof(f32)) f32 curlyLinePeakHeight = 0;
+ alignas(sizeof(f32)) f32 curlyLineWaveFreq = 0;
+ alignas(sizeof(f32)) f32 curlyLineCellOffset = 0;
#pragma warning(suppress : 4324) // 'PSConstBuffer': structure was padded due to alignment specifier
};
@@ -290,7 +291,7 @@ namespace Microsoft::Console::Render::Atlas
// The bounding rect of _cursorRects in pixels.
til::rect _cursorPosition;
- f32 _curlyLineHalfHeight = 0.0f;
+ f32 _curlyLinePeakHeight = 0.0f;
FontDecorationPosition _curlyUnderline;
bool _requiresContinuousRedraw = false;
--- src/renderer/atlas/shader_ps.hlsl
@@ -12,8 +12,9 @@ cbuffer ConstBuffer : register(b0)
float4 gammaRatios;
float enhancedContrast;
float underlineWidth;
- float thinLineWidth;
- float curlyLineHalfHeight;
+ float curlyLinePeakHeight;
+ float curlyLineWaveFreq;
+ float curlyLineCellOffset;
}
Texture2D<float4> background : register(t0);
@@ -75,25 +76,31 @@ Output main(PSData data) : SV_Target
}
case SHADING_TYPE_DOTTED_LINE:
{
- const bool on = frac(data.position.x / (3.0f * underlineWidth * data.renditionScale.x)) < (1.0f / 3.0f);
+ const bool on = frac(data.position.x / (2.0f * underlineWidth * data.renditionScale.x)) < 0.5f;
color = on * premultiplyColor(data.color);
weights = color.aaaa;
break;
}
case SHADING_TYPE_DASHED_LINE:
{
- const bool on = frac(data.position.x / (6.0f * underlineWidth * data.renditionScale.x)) < (4.0f / 6.0f);
+ const bool on = frac(data.position.x / (backgroundCellSize.x * data.renditionScale.x)) < 0.5f;
color = on * premultiplyColor(data.color);
weights = color.aaaa;
break;
}
case SHADING_TYPE_CURLY_LINE:
{
- const float strokeWidthHalf = thinLineWidth * data.renditionScale.y * 0.5f;
- const float amp = (curlyLineHalfHeight - strokeWidthHalf) * data.renditionScale.y;
- const float freq = data.renditionScale.x / curlyLineHalfHeight * 1.57079632679489661923f;
- const float s = sin(data.position.x * freq) * amp;
- const float d = abs(curlyLineHalfHeight - data.texcoord.y - s);
+ uint cellRow = floor(data.position.y / backgroundCellSize.y);
+ // Use the previous cell when drawing 'Double Height' curly line.
+ cellRow -= data.renditionScale.y - 1;
+ const float cellTop = cellRow * backgroundCellSize.y;
+ const float centerY = cellTop + curlyLineCellOffset * data.renditionScale.y;
+ const float strokeWidthHalf = underlineWidth * data.renditionScale.y / 2.0f;
+ const float amp = curlyLinePeakHeight * data.renditionScale.y;
+ const float freq = curlyLineWaveFreq / data.renditionScale.x;
+
+ const float s = sin(data.position.x * freq);
+ const float d = abs(centerY - (s * amp) - data.position.y);
const float a = 1 - saturate(d - strokeWidthHalf);
color = a * premultiplyColor(data.color);
weights = color.aaaa;
|
terminal
|
microsoft
|
C++
|
C++
| 97,273
| 8,477
|
The new Windows Terminal and the original Windows console host, all in the same place!
|
microsoft_terminal
|
PERF_IMPROVEMENT
|
Obvious
|
51ba576ccbc48f69cc727ec196e0ee17839a1d64
|
2023-02-09 00:15:08
|
Serhiy Mytrovtsiy
|
fix: added intel based support for usage per core bar chart in the popup view (#1304)
| false
| 9
| 5
| 14
|
--- Modules/CPU/popup.swift
@@ -240,7 +240,7 @@ internal class Popup: NSView, Popup_p {
view.addArrangedSubview(separator)
view.addArrangedSubview(lineChartContainer)
- if let cores = SystemKit.shared.device.info.cpu?.logicalCores {
+ if let cores = SystemKit.shared.device.info.cpu?.cores {
let barChartContainer: NSView = {
let box: NSView = NSView(frame: NSRect(x: 0, y: 0, width: self.frame.width, height: 50))
box.heightAnchor.constraint(equalToConstant: box.frame.height).isActive = true
@@ -253,7 +253,7 @@ internal class Popup: NSView, Popup_p {
y: Constants.Popup.spacing,
width: view.frame.width - (Constants.Popup.spacing*2),
height: box.frame.height - (Constants.Popup.spacing*2)
- ), num: Int(cores))
+ ), num: cores.count)
self.barChart = chart
box.addSubview(chart)
@@ -352,17 +352,13 @@ internal class Popup: NSView, Popup_p {
field.stringValue = "\(Int(usage * 100))%"
}
- var usagePerCore: [ColorValue] = []
if let cores = SystemKit.shared.device.info.cpu?.cores, cores.count == value.usagePerCore.count {
+ var list: [ColorValue] = []
for i in 0..<value.usagePerCore.count {
- usagePerCore.append(ColorValue(value.usagePerCore[i], color: cores[i].type == .efficiency ? NSColor.systemTeal : NSColor.systemBlue))
- }
- } else {
- for i in 0..<value.usagePerCore.count {
- usagePerCore.append(ColorValue(value.usagePerCore[i], color: NSColor.systemBlue))
+ list.append(ColorValue(value.usagePerCore[i], color: cores[i].type == .performance ? NSColor.systemBlue : NSColor.systemTeal))
}
+ self.barChart?.setValues(list)
}
- self.barChart?.setValues(usagePerCore)
self.initialized = true
}
|
stats
|
exelban
|
Swift
|
Swift
| 29,655
| 950
|
macOS system monitor in your menu bar
|
exelban_stats
|
BUG_FIX
|
obvious
|
b067016bfac505747588e425d5e3fe598ff52457
|
2023-07-25 14:09:38
|
krahets
|
Update preorder_traversal_iii.
| false
| 48
| 56
| 104
|
--- codes/cpp/chapter_backtracking/preorder_traversal_iii_template.cpp
@@ -37,6 +37,7 @@ void backtrack(vector<TreeNode *> &state, vector<TreeNode *> &choices, vector<ve
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
+ return;
}
// 遍历所有选择
for (TreeNode *choice : choices) {
--- codes/csharp/chapter_backtracking/preorder_traversal_iii_compact.cs
@@ -41,7 +41,7 @@ public class preorder_traversal_iii_compact {
res = new List<List<TreeNode>>();
preOrder(root);
- Console.WriteLine("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点");
+ Console.WriteLine("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点");
foreach (List<TreeNode> path in res) {
PrintUtil.PrintList(path.Select(p => p.val).ToList());
}
--- codes/csharp/chapter_backtracking/preorder_traversal_iii_template.cs
@@ -38,6 +38,7 @@ public class preorder_traversal_iii_template {
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
+ return;
}
// 遍历所有选择
foreach (TreeNode choice in choices) {
--- codes/go/chapter_backtracking/preorder_traversal_test.go
@@ -59,7 +59,7 @@ func TestPreorderTraversalIIICompact(t *testing.T) {
res := make([][]*TreeNode, 0)
preOrderIII(root, &res, &path)
- fmt.Println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点")
+ fmt.Println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点")
for _, path := range res {
for _, node := range path {
fmt.Printf("%v ", node.Val)
@@ -81,7 +81,7 @@ func TestPreorderTraversalIIITemplate(t *testing.T) {
choices = append(choices, root)
backtrackIII(&state, &choices, &res)
- fmt.Println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点")
+ fmt.Println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点")
for _, path := range res {
for _, node := range path {
fmt.Printf("%v ", node.Val)
--- codes/java/chapter_backtracking/preorder_traversal_iii_compact.java
@@ -43,7 +43,7 @@ public class preorder_traversal_iii_compact {
res = new ArrayList<>();
preOrder(root);
- System.out.println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点");
+ System.out.println("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点");
for (List<TreeNode> path : res) {
List<Integer> vals = new ArrayList<>();
for (TreeNode node : path) {
--- codes/java/chapter_backtracking/preorder_traversal_iii_template.java
@@ -41,6 +41,7 @@ public class preorder_traversal_iii_template {
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
+ return;
}
// 遍历所有选择
for (TreeNode choice : choices) {
--- codes/javascript/chapter_backtracking/preorder_traversal_iii_compact.js
@@ -37,7 +37,7 @@ const path = [];
const res = [];
preOrder(root, path, res);
-console.log('\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点');
+console.log('\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点');
res.forEach((path) => {
console.log(path.map((node) => node.val));
});
--- codes/javascript/chapter_backtracking/preorder_traversal_iii_template.js
@@ -38,6 +38,7 @@ function backtrack(state, choices, res) {
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
+ return;
}
// 遍历所有选择
for (const choice of choices) {
--- codes/python/chapter_backtracking/preorder_traversal_iii_compact.py
@@ -39,6 +39,6 @@ if __name__ == "__main__":
res = list[list[TreeNode]]()
pre_order(root)
- print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点")
+ print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点")
for path in res:
print([node.val for node in path])
--- codes/python/chapter_backtracking/preorder_traversal_iii_template.py
@@ -43,6 +43,7 @@ def backtrack(
if is_solution(state):
# 记录解
record_solution(state, res)
+ return
# 遍历所有选择
for choice in choices:
# 剪枝:检查选择是否合法
--- codes/python/modules/__init__.py
@@ -9,7 +9,7 @@ from .list_node import (
linked_list_to_list,
get_list_node,
)
-from .tree_node import TreeNode, list_to_tree, tree_to_list
+from .tree_node import TreeNode, list_to_tree, tree_to_list, get_tree_node
from .vertex import Vertex, vals_to_vets, vets_to_vals
from .print_util import (
print_matrix,
--- codes/python/modules/tree_node.py
@@ -67,3 +67,14 @@ def tree_to_list(root: TreeNode | None) -> list[int]:
res = []
tree_to_list_dfs(root, 0, res)
return res
+
+
+def get_tree_node(root: TreeNode | None, val: int) -> TreeNode | None:
+ """Get a tree node with specific value in a binary tree"""
+ if not root:
+ return
+ if root.val == val:
+ return root
+ left: TreeNode | None = get_tree_node(root.left, val)
+ right: TreeNode | None = get_tree_node(root.right, val)
+ return left if left else right
--- codes/rust/chapter_backtracking/preorder_traversal_iii_compact.rs
@@ -42,7 +42,7 @@ pub fn main() {
let mut res = Vec::new();
pre_order(&mut res, &mut path, root);
- println!("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点");
+ println!("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点");
for path in res {
let mut vals = Vec::new();
for node in path {
--- codes/rust/chapter_backtracking/preorder_traversal_iii_template.rs
@@ -40,6 +40,7 @@ fn backtrack(state: &mut Vec<Rc<RefCell<TreeNode>>>, choices: &mut Vec<Rc<RefCel
if is_solution(state) {
// 记录解
record_solution(state, res);
+ return;
}
// 遍历所有选择
for choice in choices {
--- codes/swift/chapter_backtracking/preorder_traversal_iii_compact.swift
@@ -42,7 +42,7 @@ enum PreorderTraversalIIICompact {
res = []
preOrder(root: root)
- print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点")
+ print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点")
for path in res {
var vals: [Int] = []
for node in path {
--- codes/swift/chapter_backtracking/preorder_traversal_iii_template.swift
@@ -36,6 +36,7 @@ func backtrack(state: inout [TreeNode], choices: [TreeNode], res: inout [[TreeNo
// 检查是否为解
if isSolution(state: state) {
recordSolution(state: state, res: &res)
+ return
}
// 遍历所有选择
for choice in choices {
@@ -64,7 +65,7 @@ enum PreorderTraversalIIITemplate {
var res: [[TreeNode]] = []
backtrack(state: &state, choices: [root].compactMap { $0 }, res: &res)
- print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点")
+ print("\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点")
for path in res {
var vals: [Int] = []
for node in path {
--- codes/typescript/chapter_backtracking/preorder_traversal_iii_compact.ts
@@ -42,7 +42,7 @@ const path: TreeNode[] = [];
const res: TreeNode[][] = [];
preOrder(root, path, res);
-console.log('\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点');
+console.log('\n输出所有根节点到节点 7 的路径,路径中不包含值为 3 的节点,仅包含一个值为 7 的节点');
res.forEach((path) => {
console.log(path.map((node) => node.val));
});
--- codes/typescript/chapter_backtracking/preorder_traversal_iii_template.ts
@@ -43,6 +43,7 @@ function backtrack(
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
+ return;
}
// 遍历所有选择
for (const choice of choices) {
--- docs/chapter_backtracking/backtracking_algorithm.assets/backtrack_remove_return_or_not.png
Binary files a/docs/chapter_backtracking/backtracking_algorithm.assets/backtrack_remove_return_or_not.png and /dev/null differ
--- docs/chapter_backtracking/backtracking_algorithm.md
@@ -201,9 +201,12 @@
!!! question "例题三"
- 在二叉树中搜索所有值为 $7$ 的节点,请返回根节点到这些节点的路径,**并要求路径中不包含值为 $3$ 的节点**。
+ 在二叉树中搜索所有值为 $7$ 的节点,请返回根节点到这些节点的路径,**要求路径中只存在一个值为 $7$ 的节点,并且不允许有值为 $3$ 的节点**。
-为了满足以上约束条件,**我们需要添加剪枝操作**:在搜索过程中,若遇到值为 $3$ 的节点,则提前返回,停止继续搜索。
+在例题二的基础上添加剪枝操作,包括:
+
+- 当遇到值为 $7$ 的节点时,记录解并返回,停止搜索。
+- 当遇到值为 $3$ 的节点时,则直接返回,停止搜索。
=== "Java"
@@ -271,10 +274,27 @@
[class]{}-[func]{preOrder}
```
-剪枝是一个非常形象的名词。在搜索过程中,**我们“剪掉”了不满足约束条件的搜索分支**,避免许多无意义的尝试,从而实现搜索效率的提高。
+剪枝是一个非常形象的名词。在搜索过程中,**我们利用约束条件“剪掉”了不满足约束条件的搜索分支**,避免许多无意义的尝试,从而提升搜索效率。

+## 常用术语
+
+为了更清晰地分析算法问题,我们总结一下回溯算法中常用术语的含义,并对照例题三给出对应示例。
+
+| 名词 | 定义 | 例题三 |
+| ------------------- | -------------------------------------------------------------------------- | -------------------------------------------------------------------- |
+| 解 Solution | 解是满足问题特定条件的答案,可能有一个或多个 | 根节点到节点 $7$ 的满足约束条件的所有路径 |
+| 约束条件 Constraint | 约束条件是问题中限制解的可行性的条件,通常用于剪枝 | 路径中不包含节点 $3$ ,只包含一个节点 $7$ |
+| 状态 State | 状态表示问题在某一时刻的情况,包括已经做出的选择 | 当前已访问的节点路径,即 `path` 节点列表 |
+| 尝试 Attempt | 尝试是根据可用选择来探索解空间的过程,包括做出选择,更新状态,检查是否为解 | 递归访问左(右)子节点,将节点添加进 `path` ,判断节点的值是否为 $7$ |
+| 回退 Backtracking | 回退指遇到不满足约束条件的状态时,撤销前面做出的选择,回到上一个状态 | 当越过叶结点、结束结点访问、遇到值为 $3$ 的节点时终止搜索,函数返回 |
+| 剪枝 Pruning | 剪枝是根据问题特性和约束条件避免无意义的搜索路径的方法,可提高搜索效率 | 当遇到值为 $3$ 的节点时,则终止继续搜索 |
+
+!!! tip
+
+ 问题、解、状态等概念是通用的,在分治、回溯、动态规划、贪心等算法中都有涉及。
+
## 框架代码
接下来,我们尝试将回溯的“尝试、回退、剪枝”的主体框架提炼出来,提升代码的通用性。
@@ -290,7 +310,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -316,7 +335,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -336,13 +354,12 @@
=== "Python"
```python title=""
- def backtrack(state: State, choices: list[choice], res: list[state]):
+ def backtrack(state: State, choices: list[choice], res: list[state]) -> None:
"""回溯算法框架"""
# 判断是否为解
if is_solution(state):
# 记录解
record_solution(state, res)
- # 停止继续搜索
return
# 遍历所有选择
for choice in choices:
@@ -364,7 +381,6 @@
if isSolution(state) {
// 记录解
recordSolution(state, res)
- // 停止继续搜索
return
}
// 遍历所有选择
@@ -390,7 +406,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -416,7 +431,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -442,7 +456,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res, numRes);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -468,7 +481,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -494,7 +506,6 @@
if isSolution(state: state) {
// 记录解
recordSolution(state: state, res: &res)
- // 停止继续搜索
return
}
// 遍历所有选择
@@ -526,7 +537,6 @@
if (isSolution(state)) {
// 记录解
recordSolution(state, res);
- // 停止继续搜索
return;
}
// 遍历所有选择
@@ -543,7 +553,7 @@
}
```
-接下来,我们基于框架代码来解决例题三。状态 `state` 为节点遍历路径,选择 `choices` 为当前节点的左子节点和右子节点,结果 `res` 是路径列表。
+下面,我们基于框架代码来解决例题三:状态 `state` 为节点遍历路径,选择 `choices` 为当前节点的左子节点和右子节点,结果 `res` 是路径列表。
=== "Java"
@@ -721,28 +731,7 @@
[class]{}-[func]{backtrack}
```
-根据题意,当找到值为 7 的节点后应该继续搜索,**因此我们需要将记录解之后的 `return` 语句删除**。下图对比了保留或删除 `return` 语句的搜索过程。
-
-
-
-相比基于前序遍历的代码实现,基于回溯算法框架的代码实现虽然显得啰嗦,但通用性更好。实际上,**许多回溯问题都可以在该框架下解决**。我们只需根据具体问题来定义 `state` 和 `choices` ,并实现框架中的各个方法即可。
-
-## 常用术语
-
-为了更清晰地分析算法问题,我们总结一下回溯算法中常用术语的含义,并对照例题三给出对应示例。
-
-| 名词 | 定义 | 例题三 |
-| ------------------- | -------------------------------------------------------------------------- | -------------------------------------------------------------------- |
-| 解 Solution | 解是满足问题特定条件的答案,可能有一个或多个 | 根节点到节点 $7$ 的满足约束条件的所有路径 |
-| 约束条件 Constraint | 约束条件是问题中限制解的可行性的条件,通常用于剪枝 | 路径中不包含节点 $3$ ,只包含一个节点 $7$ |
-| 状态 State | 状态表示问题在某一时刻的情况,包括已经做出的选择 | 当前已访问的节点路径,即 `path` 节点列表 |
-| 尝试 Attempt | 尝试是根据可用选择来探索解空间的过程,包括做出选择,更新状态,检查是否为解 | 递归访问左(右)子节点,将节点添加进 `path` ,判断节点的值是否为 $7$ |
-| 回退 Backtracking | 回退指遇到不满足约束条件的状态时,撤销前面做出的选择,回到上一个状态 | 当越过叶结点、结束结点访问、遇到值为 $3$ 的节点时终止搜索,函数返回 |
-| 剪枝 Pruning | 剪枝是根据问题特性和约束条件避免无意义的搜索路径的方法,可提高搜索效率 | 当遇到值为 $3$ 的节点时,则终止继续搜索 |
-
-!!! tip
-
- 问题、解、状态等概念是通用的,在分治、回溯、动态规划、贪心等算法中都有涉及。
+相比基于前序遍历的代码实现,基于回溯算法框架的代码实现虽然显得啰嗦,但通用性更好。实际上,**许多回溯问题都可以在该框架下解决**。我们只需根据具体问题来定义 `state` 和 `choices` ,并实现框架中的各个方法。
## 优势与局限性
|
hello-algo
|
krahets
|
Java
|
Java
| 109,696
| 13,651
|
《Hello 算法》:动画图解、一键运行的数据结构与算法教程。支持 Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, Dart 代码。简体版和繁体版同步更新,English version ongoing
|
krahets_hello-algo
|
BUG_FIX
|
probably a bug fix, as the code changes suggest that the previous version wasn't working correctly
|
333b624189a6661bfd9bb759da2505f6041ce8f9
|
2024-08-29 18:36:02
|
Hadley Wickham
|
Need two tries to get one retry
| false
| 1
| 1
| 2
|
--- R/open-ai.R
@@ -2,7 +2,7 @@ open_ai_request <- function(base_url = "https://api.openai.com/v1",
key = open_ai_key()) {
req <- httr2::request(base_url)
req <- httr2::req_auth_bearer_token(req, Sys.getenv("OPENAI_API_KEY"))
- req <- httr2::req_retry(req, max_tries = 2)
+ req <- httr2::req_retry(req, max_tries = 1)
req <- httr2::req_error(req, body = function(resp) {
httr2::resp_body_json(resp)$error$message
})
|
ellmer
|
tidyverse
|
R
|
R
| 401
| 55
|
Call LLM APIs from R
|
tidyverse_ellmer
|
CONFIG_CHANGE
|
Obvious
|
9bab8ccb89466be8bc8c579264f51b194aa89612
| null |
Anatoly Bubenkov
|
add current bookmark to rkj-repos theme
| false
| 1
| 1
| 0
|
--- rkj-repos.zsh-theme
@@ -4,7 +4,7 @@
function hg_prompt_info {
hg prompt --angle-brackets "\
-<hg:%{$fg[magenta]%}<branch>%{$reset_color%}>\
+<hg:%{$fg[magenta]%}<branch>%{$reset_color%}><:%{$fg[magenta]%}<bookmark>%{$reset_color%}>\
</%{$fg[yellow]%}<tags|%{$reset_color%}, %{$fg[yellow]%}>%{$reset_color%}>\
%{$fg[red]%}<status|modified|unknown><update>%{$reset_color%}<
patches: <patches|join( → )|pre_applied(%{$fg[yellow]%})|post_applied(%{$reset_color%})|pre_unapplied(%{$fg_bold[black]%})|post_unapplied(%{$reset_color%})>>" 2>/dev/null
|
ohmyzsh_ohmyzsh.json
| null | null | null | null | null | null |
ohmyzsh_ohmyzsh.json
|
NEW_FEAT
|
5, obvious
|
b1d0ffb6d2deda7ad0a67e1ef49e72b330c2fecb
|
2025-03-11 16:51:18
|
Matej Knopp
|
[Windows] Make lifecycle manager updates atomic (#164872)
Required for multi-window.
On windows the `LifecycleManager` currently sends the lifecycle event as soon as windows message is processed. This however causes problems when changing focus between application windows. When switching focus from HWND1 to HWND2, HWND1 first gets unfocused, followed by HWND2 getting focused. After HWND1 gets unfocused, `LifecycleManager` immediately notifies the framework that the application is inactive, which is wrong as the application never went into inactive state, followed by subsequent call to put the application in resumed state when HWND2 is focused. Because this happens very quickly, sometimes focus manager gets into inconsistent state.
To resolve this `LifecycleManager` should gather the all the changes while window sends the messages and then notify the framework atomically in next run loop turn.
This PR also simplifies the logic in `LifecycleManager` through which the application state is derived from window states.
This PR removes engine forcing `resumed` lifecycle state at startup. I'm not entirely sure what the point of this was - the state can and should be determined solely from window states, this just seems to muddy the state logic. Also it happens before the framework is even listening to state changes.
The mutex in `WindowsLifecycleManager` is removed. Not sure why it was there.
## Pre-launch Checklist
- [x] I read the [Contributor Guide] and followed the process outlined there for submitting PRs.
- [x] I read the [Tree Hygiene] wiki page, which explains my responsibilities.
- [x] I read and followed the [Flutter Style Guide], including [Features we expect every widget to implement].
- [x] I signed the [CLA].
- [x] I listed at least one issue that this PR fixes in the description above.
- [x] I updated/added relevant documentation (doc comments with `///`).
- [x ] I added new tests to check the change I am making, or this PR is [test-exempt].
- [x] I followed the [breaking change policy] and added [Data Driven Fixes] where supported.
- [x] All existing and new tests are passing.
If you need help, consider asking for advice on the #hackers-new channel on [Discord].
<!-- Links -->
[Contributor Guide]: https://github.com/flutter/flutter/blob/main/docs/contributing/Tree-hygiene.md#overview
[Tree Hygiene]: https://github.com/flutter/flutter/blob/main/docs/contributing/Tree-hygiene.md
[test-exempt]: https://github.com/flutter/flutter/blob/main/docs/contributing/Tree-hygiene.md#tests
[Flutter Style Guide]: https://github.com/flutter/flutter/blob/main/docs/contributing/Style-guide-for-Flutter-repo.md
[Features we expect every widget to implement]: https://github.com/flutter/flutter/blob/main/docs/contributing/Style-guide-for-Flutter-repo.md#features-we-expect-every-widget-to-implement
[CLA]: https://cla.developers.google.com/
[flutter/tests]: https://github.com/flutter/tests
[breaking change policy]: https://github.com/flutter/flutter/blob/main/docs/contributing/Tree-hygiene.md#handling-breaking-changes
[Discord]: https://github.com/flutter/flutter/blob/main/docs/contributing/Chat.md
[Data Driven Fixes]: https://github.com/flutter/flutter/blob/main/docs/contributing/Data-driven-Fixes.md
---------
Co-authored-by: Matthew Kosarek <[email protected]>
Co-authored-by: Harlen Batagelo <[email protected]>
Co-authored-by: Loïc Sharma <[email protected]>
| false
| 120
| 74
| 194
|
--- engine/src/flutter/shell/platform/windows/flutter_windows_engine.cc
@@ -489,6 +489,7 @@ bool FlutterWindowsEngine::Run(std::string_view entrypoint) {
displays.data(), displays.size());
SendSystemLocales();
+ SetLifecycleState(flutter::AppLifecycleState::kResumed);
settings_plugin_->StartWatching();
settings_plugin_->SendSettings();
@@ -801,6 +802,12 @@ void FlutterWindowsEngine::SetNextFrameCallback(fml::closure callback) {
this);
}
+void FlutterWindowsEngine::SetLifecycleState(flutter::AppLifecycleState state) {
+ if (lifecycle_manager_) {
+ lifecycle_manager_->SetLifecycleState(state);
+ }
+}
+
void FlutterWindowsEngine::SendSystemLocales() {
std::vector<LanguageInfo> languages =
GetPreferredLanguageInfo(*windows_proc_table_);
--- engine/src/flutter/shell/platform/windows/flutter_windows_engine.h
@@ -347,6 +347,9 @@ class FlutterWindowsEngine {
// system changes.
void SendSystemLocales();
+ // Sends the current lifecycle state to the framework.
+ void SetLifecycleState(flutter::AppLifecycleState state);
+
// Create the keyboard & text input sub-systems.
//
// This requires that a view is attached to the engine.
--- engine/src/flutter/shell/platform/windows/flutter_windows_engine_unittests.cc
@@ -800,6 +800,7 @@ TEST_F(FlutterWindowsEngineTest, TestExit) {
modifier.SetImplicitView(&view);
modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState(AppLifecycleState::kResumed));
EXPECT_CALL(*handler, Quit)
.WillOnce([&finished](std::optional<HWND> hwnd,
std::optional<WPARAM> wparam,
@@ -836,6 +837,7 @@ TEST_F(FlutterWindowsEngineTest, TestExitCancel) {
modifier.SetImplicitView(&view);
modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState(AppLifecycleState::kResumed));
EXPECT_CALL(*handler, IsLastWindowOfProcess).WillRepeatedly(Return(true));
EXPECT_CALL(*handler, Quit).Times(0);
modifier.SetLifecycleManager(std::move(handler));
@@ -883,6 +885,7 @@ TEST_F(FlutterWindowsEngineTest, TestExitSecondCloseMessage) {
modifier.SetImplicitView(&view);
modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState(AppLifecycleState::kResumed));
EXPECT_CALL(*handler, IsLastWindowOfProcess).WillOnce(Return(true));
EXPECT_CALL(*handler, Quit)
.WillOnce([handler_ptr = handler.get()](
@@ -942,6 +945,7 @@ TEST_F(FlutterWindowsEngineTest, TestExitCloseMultiWindow) {
modifier.SetImplicitView(&view);
modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState(AppLifecycleState::kResumed));
EXPECT_CALL(*handler, IsLastWindowOfProcess).WillOnce([&finished]() {
finished = true;
return false;
@@ -1019,7 +1023,7 @@ TEST_F(FlutterWindowsEngineTest, ApplicationLifecycleExternalWindow) {
engine->lifecycle_manager()->ExternalWindowMessage(0, WM_CLOSE, 0, 0);
}
-TEST_F(FlutterWindowsEngineTest, LifecycleStateTransition) {
+TEST_F(FlutterWindowsEngineTest, AppStartsInResumedState) {
FlutterWindowsEngineBuilder builder{GetContext()};
auto engine = builder.Build();
@@ -1030,45 +1034,38 @@ TEST_F(FlutterWindowsEngineTest, LifecycleStateTransition) {
EngineModifier modifier(engine.get());
modifier.SetImplicitView(&view);
modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
+ auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState(AppLifecycleState::kResumed))
+ .Times(1);
+ modifier.SetLifecycleManager(std::move(handler));
engine->Run();
+}
- engine->window_proc_delegate_manager()->OnTopLevelWindowProc(
- (HWND)1, WM_SIZE, SIZE_RESTORED, 0);
-
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
- EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
- AppLifecycleState::kInactive);
+TEST_F(FlutterWindowsEngineTest, LifecycleStateTransition) {
+ FlutterWindowsEngineBuilder builder{GetContext()};
- engine->lifecycle_manager()->OnWindowStateEvent((HWND)1,
- WindowStateEvent::kFocus);
+ auto engine = builder.Build();
+ auto window_binding_handler =
+ std::make_unique<::testing::NiceMock<MockWindowBindingHandler>>();
+ MockFlutterWindowsView view(engine.get(), std::move(window_binding_handler));
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
+ EngineModifier modifier(engine.get());
+ modifier.SetImplicitView(&view);
+ modifier.embedder_api().RunsAOTCompiledDartCode = []() { return false; };
+ engine->Run();
+ engine->window_proc_delegate_manager()->OnTopLevelWindowProc(
+ (HWND)1, WM_SIZE, SIZE_RESTORED, 0);
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kResumed);
engine->window_proc_delegate_manager()->OnTopLevelWindowProc(
(HWND)1, WM_SIZE, SIZE_MINIMIZED, 0);
-
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kHidden);
engine->window_proc_delegate_manager()->OnTopLevelWindowProc(
(HWND)1, WM_SIZE, SIZE_RESTORED, 0);
-
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kInactive);
}
@@ -1093,10 +1090,6 @@ TEST_F(FlutterWindowsEngineTest, ExternalWindowMessage) {
engine->ProcessExternalWindowMessage(reinterpret_cast<HWND>(1), WM_SHOWWINDOW,
FALSE, NULL);
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kHidden);
}
@@ -1124,20 +1117,12 @@ TEST_F(FlutterWindowsEngineTest, InnerWindowHidden) {
view.OnWindowStateEvent(inner, WindowStateEvent::kShow);
view.OnWindowStateEvent(inner, WindowStateEvent::kFocus);
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kResumed);
// Hide Flutter window, but not top level window.
view.OnWindowStateEvent(inner, WindowStateEvent::kHide);
- while (engine->lifecycle_manager()->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
// The top-level window is still visible, so we ought not enter hidden state.
EXPECT_EQ(engine->lifecycle_manager()->GetLifecycleState(),
AppLifecycleState::kInactive);
@@ -1259,6 +1244,7 @@ TEST_F(FlutterWindowsEngineTest, ChannelListenedTo) {
bool lifecycle_began = false;
auto handler = std::make_unique<MockWindowsLifecycleManager>(engine.get());
+ EXPECT_CALL(*handler, SetLifecycleState).Times(1);
handler->begin_processing_callback = [&]() { lifecycle_began = true; };
modifier.SetLifecycleManager(std::move(handler));
--- engine/src/flutter/shell/platform/windows/flutter_windows_unittests.cc
@@ -535,7 +535,7 @@ TEST_F(WindowsTest, Lifecycle) {
modifier.SetLifecycleManager(std::move(lifecycle_manager));
EXPECT_CALL(*lifecycle_manager_ptr,
- SetLifecycleState(AppLifecycleState::kInactive))
+ SetLifecycleState(AppLifecycleState::kResumed))
.WillOnce([lifecycle_manager_ptr](AppLifecycleState state) {
lifecycle_manager_ptr->WindowsLifecycleManager::SetLifecycleState(
state);
@@ -548,12 +548,10 @@ TEST_F(WindowsTest, Lifecycle) {
state);
});
- FlutterDesktopViewControllerProperties properties = {0, 0};
-
// Create a controller. This launches the engine and sets the app lifecycle
// to the "resumed" state.
ViewControllerPtr controller{
- FlutterDesktopEngineCreateViewController(engine.get(), &properties)};
+ FlutterDesktopViewControllerCreate(0, 0, engine.release())};
FlutterDesktopViewRef view =
FlutterDesktopViewControllerGetView(controller.get());
@@ -567,17 +565,6 @@ TEST_F(WindowsTest, Lifecycle) {
// "hidden" app lifecycle event.
::MoveWindow(hwnd, /* X */ 0, /* Y */ 0, /* nWidth*/ 100, /* nHeight*/ 100,
/* bRepaint*/ false);
-
- while (lifecycle_manager_ptr->IsUpdateStateScheduled()) {
- PumpMessage();
- }
-
- // Resets the view, simulating the window being hidden.
- controller.reset();
-
- while (lifecycle_manager_ptr->IsUpdateStateScheduled()) {
- PumpMessage();
- }
}
TEST_F(WindowsTest, GetKeyboardStateHeadless) {
--- engine/src/flutter/shell/platform/windows/windows_lifecycle_manager.cc
@@ -205,48 +205,49 @@ void WindowsLifecycleManager::SetLifecycleState(AppLifecycleState state) {
}
}
-void WindowsLifecycleManager::UpdateState() {
- AppLifecycleState new_state = AppLifecycleState::kResumed;
- if (visible_windows_.empty()) {
- new_state = AppLifecycleState::kHidden;
- } else if (focused_windows_.empty()) {
- new_state = AppLifecycleState::kInactive;
- }
- SetLifecycleState(new_state);
-}
-
void WindowsLifecycleManager::OnWindowStateEvent(HWND hwnd,
WindowStateEvent event) {
- // Instead of updating the state immediately, remember all
- // changes to individual window and then update the state in next run loop
- // turn. Otherwise the application would be temporarily deactivated when
- // switching focus between windows for example.
- if (!update_state_scheduled_) {
- update_state_scheduled_ = true;
- // Task runner will be destroyed together with engine so it is safe
- // to keep reference to it.
- engine_->task_runner()->PostTask([this]() {
- update_state_scheduled_ = false;
- UpdateState();
- });
+ // Synthesize an unfocus event when a focused window is hidden.
+ if (event == WindowStateEvent::kHide &&
+ focused_windows_.find(hwnd) != focused_windows_.end()) {
+ OnWindowStateEvent(hwnd, WindowStateEvent::kUnfocus);
}
+ std::lock_guard guard(state_update_lock_);
switch (event) {
case WindowStateEvent::kShow: {
- visible_windows_.insert(hwnd);
+ bool first_shown_window = visible_windows_.empty();
+ auto pair = visible_windows_.insert(hwnd);
+ if (first_shown_window && pair.second &&
+ state_ == AppLifecycleState::kHidden) {
+ SetLifecycleState(AppLifecycleState::kInactive);
+ }
break;
}
case WindowStateEvent::kHide: {
- visible_windows_.erase(hwnd);
- focused_windows_.erase(hwnd);
+ bool present = visible_windows_.erase(hwnd);
+ bool empty = visible_windows_.empty();
+ if (present && empty &&
+ (state_ == AppLifecycleState::kResumed ||
+ state_ == AppLifecycleState::kInactive)) {
+ SetLifecycleState(AppLifecycleState::kHidden);
+ }
break;
}
case WindowStateEvent::kFocus: {
- focused_windows_.insert(hwnd);
+ bool first_focused_window = focused_windows_.empty();
+ auto pair = focused_windows_.insert(hwnd);
+ if (first_focused_window && pair.second &&
+ state_ == AppLifecycleState::kInactive) {
+ SetLifecycleState(AppLifecycleState::kResumed);
+ }
break;
}
case WindowStateEvent::kUnfocus: {
- focused_windows_.erase(hwnd);
+ if (focused_windows_.erase(hwnd) && focused_windows_.empty() &&
+ state_ == AppLifecycleState::kResumed) {
+ SetLifecycleState(AppLifecycleState::kInactive);
+ }
break;
}
}
--- engine/src/flutter/shell/platform/windows/windows_lifecycle_manager.h
@@ -63,16 +63,17 @@ class WindowsLifecycleManager {
// message to the framework notifying it of the state change.
virtual void SetLifecycleState(AppLifecycleState state);
- // Respond to a change in window state.
- // Saves the state for the HWND and schedules UpdateState to be called
- // if it is not already scheduled.
+ // Respond to a change in window state. Transitions as follows:
+ // When the only visible window is hidden, transition from resumed or
+ // inactive to hidden.
+ // When the only focused window is unfocused, transition from resumed to
+ // inactive.
+ // When a window is focused, transition from inactive to resumed.
+ // When a window is shown, transition from hidden to inactive.
virtual void OnWindowStateEvent(HWND hwnd, WindowStateEvent event);
AppLifecycleState GetLifecycleState() { return state_; }
- // Used in tests to wait until the state is updated.
- bool IsUpdateStateScheduled() const { return update_state_scheduled_; }
-
// Called by the engine when a non-Flutter window receives an event that may
// alter the lifecycle state. The logic for external windows must differ from
// that used for FlutterWindow instances, because:
@@ -113,20 +114,12 @@ class WindowsLifecycleManager {
bool process_exit_ = false;
std::set<HWND> visible_windows_;
- std::set<HWND> focused_windows_;
- // Transitions the application state. If any windows are focused,
- // the application is considered resumed. If no windows are focused
- // but there are visible windows, application is considered inactive.
- // Otherwise, if there are no visible window, application is considered
- // hidden.
- void UpdateState();
+ std::set<HWND> focused_windows_;
- // Whether update state is scheduled to be called in next run loop turn.
- // This is needed to provide atomic updates of the state.
- bool update_state_scheduled_ = false;
+ std::mutex state_update_lock_;
- AppLifecycleState state_ = AppLifecycleState::kDetached;
+ flutter::AppLifecycleState state_;
};
} // namespace flutter
--- engine/src/flutter/shell/platform/windows/windows_lifecycle_manager_unittests.cc
@@ -4,7 +4,6 @@
#include "flutter/shell/platform/windows/windows_lifecycle_manager.h"
-#include "flutter/shell/platform/windows/testing/flutter_windows_engine_builder.h"
#include "flutter/shell/platform/windows/testing/windows_test.h"
#include "gtest/gtest.h"
@@ -13,70 +12,48 @@ namespace testing {
class WindowsLifecycleManagerTest : public WindowsTest {};
-static void WaitUntilUpdated(const WindowsLifecycleManager& manager) {
- while (manager.IsUpdateStateScheduled()) {
- ::MSG msg;
- if (::GetMessage(&msg, nullptr, 0, 0)) {
- ::TranslateMessage(&msg);
- ::DispatchMessage(&msg);
- }
- }
-}
-
TEST_F(WindowsLifecycleManagerTest, StateTransitions) {
- FlutterWindowsEngineBuilder builder{GetContext()};
- std::unique_ptr<FlutterWindowsEngine> engine = builder.Build();
-
- WindowsLifecycleManager manager{engine.get()};
+ WindowsLifecycleManager manager(nullptr);
HWND win1 = reinterpret_cast<HWND>(1);
HWND win2 = reinterpret_cast<HWND>(2);
// Hidden to inactive upon window shown.
manager.SetLifecycleState(AppLifecycleState::kHidden);
manager.OnWindowStateEvent(win1, WindowStateEvent::kShow);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kInactive);
// Showing a second window does not change state.
manager.OnWindowStateEvent(win2, WindowStateEvent::kShow);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kInactive);
// Inactive to resumed upon window focus.
manager.OnWindowStateEvent(win2, WindowStateEvent::kFocus);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kResumed);
// Showing a second window does not change state.
manager.OnWindowStateEvent(win1, WindowStateEvent::kFocus);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kResumed);
// Unfocusing one window does not change state while another is focused.
manager.OnWindowStateEvent(win1, WindowStateEvent::kUnfocus);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kResumed);
// Unfocusing final remaining focused window transitions to inactive.
manager.OnWindowStateEvent(win2, WindowStateEvent::kUnfocus);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kInactive);
// Hiding one of two visible windows does not change state.
manager.OnWindowStateEvent(win2, WindowStateEvent::kHide);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kInactive);
// Hiding only visible window transitions to hidden.
manager.OnWindowStateEvent(win1, WindowStateEvent::kHide);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kHidden);
// Transition directly from resumed to hidden when the window is hidden.
manager.OnWindowStateEvent(win1, WindowStateEvent::kShow);
manager.OnWindowStateEvent(win1, WindowStateEvent::kFocus);
manager.OnWindowStateEvent(win1, WindowStateEvent::kHide);
- WaitUntilUpdated(manager);
EXPECT_EQ(manager.GetLifecycleState(), AppLifecycleState::kHidden);
}
| flutter | flutter | Dart | Dart | 168,965 | 28,132 | Flutter makes it easy and fast to build beautiful apps for mobile and beyond | flutter_flutter | BUG_FIX | correcting display behavior under Wayland |
| 79364dbbdcc305cf411719d0ea53345b5048d53c | 2022-04-20 06:07:02 | mleers | Fix # 650: Replace broken weighted round robin link (#651) | false | 1 | 1 | 2 |
--- README.md
@@ -597,7 +597,7 @@ DNS is hierarchical, with a few authoritative servers at the top level. Your ro
Services such as [CloudFlare](https://www.cloudflare.com/dns/) and [Route 53](https://aws.amazon.com/route53/) provide managed DNS services. Some DNS services can route traffic through various methods:
-* [Weighted round robin](https://www.jscape.com/blog/load-balancing-algorithms)
+* [Weighted round robin](https://www.g33kinfo.com/info/round-robin-vs-weighted-round-robin-lb)
* Prevent traffic from going to servers under maintenance
* Balance between varying cluster sizes
* A/B testing
| system-design-primer | donnemartin | Python | Python | 290,909 | 48,355 | Learn how to design large-scale systems. Prep for the system design interview. Includes Anki flashcards. | donnemartin_system-design-primer | BUG_FIX | Matched \bfix(e[ds]|ing)?\b in message |
| 249b98e289d909b3311302fcc0f097d0a6d86cd4 | 2024-07-19 20:09:33 | Jaida Wu | Update release dates | false | 3 | 3 | 6 |
--- status.md
@@ -39,8 +39,8 @@
| Xiaomi MIX Fold | cetus | 2021-03-30 | MIUI Fold | ✔️ Opening |
| Xiaomi MIX Fold 2 | zizhan | 2022-08-11 | MIUI Fold | ✔️ Opening |
| Xiaomi MIX Fold 3 | babylon | 2023-08-14 | MIUI Fold | ✔️ Opening |
-| Xiaomi MIX Fold 4 | goku | 2024-07-19 | Xiaomi HyperOS | ❌ Blocked |
-| Xiaomi MIX Flip | ruyi | 2024-07-19 | Xiaomi HyperOS | ❌ Blocked |
+| Xiaomi MIX Fold 4 | goku | 2024-XX-XX | Xiaomi HyperOS | ❌ Blocked |
+| Xiaomi MIX Flip | ruyi | 2024-XX-XX | Xiaomi HyperOS | ❌ Blocked |
| Xiaomi Civi | mona | 2021-09-27 | MIUI | ✔️ Opening |
| Xiaomi Civi 1S | zijin | 2022-04-21 | MIUI | ✔️ Opening |
| Xiaomi Civi 2 | ziyi | 2022-09-27 | MIUI | ✔️ Opening |
@@ -97,7 +97,7 @@
| Redmi K70E | duchamp | 2023-11-29 | Xiaomi HyperOS | ❌ Blocked |
| Redmi K70 | vermeer | 2023-11-29 | Xiaomi HyperOS | ❌ Blocked |
| Redmi K70 Pro | manet | 2023-11-29 | Xiaomi HyperOS | ❌ Blocked |
-| Redmi K70 Ultra | rothko | 2024-07-19 | Xiaomi HyperOS | ❌ Blocked |
+| Redmi K70 Ultra | rothko | 2024-XX-XX | Xiaomi HyperOS | ❌ Blocked |
| Redmi Pad SE | xun | 2023-09-21 | MIUI Pad | ✔️ Opening |
| Redmi Pad | yunluo | 2022-10-27 | MIUI Pad | ✔️ Opening |
| Redmi Pad Pro | dizi | 2024-04-10 | Xiaomi HyperOS | ❌ Blocked |
| xiaomi-hyperos-bootloader-bypass | mlgmxyysd | PHP | PHP | 3,496 | 367 | A PoC that exploits a vulnerability to bypass the Xiaomi HyperOS community restrictions of BootLoader unlocked account bindings. | mlgmxyysd_xiaomi-hyperos-bootloader-bypass | DOC_CHANGE | changes in md file |
| 9faf3963b6b567421ab29d263e00f0e6e59a39f5 | 2024-12-13 20:18:47 | Patrick Steinhardt | t: introduce compatibility options to clar-based tests Our unit tests that don't yet use the clar unit testing framework ignore any option that they do not understand. It is thus fine to just pass test options we set up globally to those unit tests as they are simply ignored. This makes our life easier because we don't have to special case those options with Meson, where test options are set up globally via `meson test --test-args=`. But our clar-based unit testing framework is way stricter here and will fail in case it is passed an unknown option. Stub out these options with no-ops to make our life a bit easier. Note that this also requires us to remove the `-x` short option for `--exclude`. This is because `-x` has another meaning in our integration tests, as it enables shell tracing. I doubt there are a lot of people out there using it as we only got a small hand full of clar tests in the first place. So better change it now so that we can in the long run improve compatibility between the two different test drivers. Signed-off-by: Patrick Steinhardt <[email protected]> Signed-off-by: Junio C Hamano <[email protected]> | false | 30 | 1 | 31 |
--- parse-options.h
@@ -353,18 +353,6 @@ struct option {
.callback = parse_opt_noop_cb, \
}
-static char *parse_options_noop_ignored_value MAYBE_UNUSED;
-#define OPT_NOOP_ARG(s, l) { \
- .type = OPTION_CALLBACK, \
- .short_name = (s), \
- .long_name = (l), \
- .value = &parse_options_noop_ignored_value, \
- .argh = "ignored", \
- .help = N_("no-op (backward compatibility)"), \
- .flags = PARSE_OPT_HIDDEN, \
- .callback = parse_opt_noop_cb, \
-}
-
#define OPT_ALIAS(s, l, source_long_name) { \
.type = OPTION_ALIAS, \
.short_name = (s), \
--- t/unit-tests/unit-test.c
@@ -18,25 +18,8 @@ int cmd_main(int argc, const char **argv)
N_("immediately exit upon the first failed test")),
OPT_STRING_LIST('r', "run", &run_args, N_("suite[::test]"),
N_("run only test suite or individual test <suite[::test]>")),
- OPT_STRING_LIST(0, "exclude", &exclude_args, N_("suite"),
+ OPT_STRING_LIST('x', "exclude", &exclude_args, N_("suite"),
N_("exclude test suite <suite>")),
- /*
- * Compatibility wrappers so that we don't have to filter
- * options understood by integration tests.
- */
- OPT_NOOP_NOARG('d', "debug"),
- OPT_NOOP_NOARG(0, "github-workflow-markup"),
- OPT_NOOP_NOARG(0, "no-bin-wrappers"),
- OPT_NOOP_ARG(0, "root"),
- OPT_NOOP_ARG(0, "stress"),
- OPT_NOOP_NOARG(0, "tee"),
- OPT_NOOP_NOARG(0, "with-dashes"),
- OPT_NOOP_ARG(0, "valgrind"),
- OPT_NOOP_ARG(0, "valgrind-only"),
- OPT_NOOP_NOARG('v', "verbose"),
- OPT_NOOP_NOARG('V', "verbose-log"),
- OPT_NOOP_ARG(0, "verbose-only"),
- OPT_NOOP_NOARG('x', NULL),
OPT_END(),
};
struct strvec args = STRVEC_INIT;
| git | null | C | C | null | null | Version control | _git | CODE_IMPROVEMENT | I guess everything was working fine, but changes in code to ease the unit testing process |
| 31f2fbeb531455fd39ae7acf21c7a125a0176163 | 2024-08-02 15:44:19 | dependabot[bot] | Bump the production-dependencies group with 7 updates (#195) | false | 12 | 12 | 24 |
--- mix.exs
@@ -48,16 +48,16 @@ defmodule Zoonk.MixProject do
{:ex_aws, "~> 2.5.4"},
{:ex_aws_s3, "~> 2.5.3"},
{:finch, "~> 0.18"},
- {:flame, "~> 0.3.0"},
+ {:flame, "~> 0.2.0"},
{:floki, "~> 0.36.2", only: :test},
{:gettext, "~> 0.24.0"},
{:hackney, "~> 1.20.1"},
{:image, "~> 0.53.0"},
{:jason, "~> 1.2"},
{:mix_audit, "~> 2.1.2", only: [:dev, :test], runtime: false},
- {:money, "~> 1.13.1"},
+ {:money, "~> 1.12.4"},
{:mox, "~> 1.1.0", only: :test},
- {:oban, "~> 2.18.0"},
+ {:oban, "~> 2.17.12"},
{:phoenix_ecto, "~> 4.6.1"},
{:phoenix_html, "~> 4.1.1"},
{:phoenix_live_dashboard, "~> 0.8.4"},
--- mix.lock
@@ -1,5 +1,5 @@
%{
- "bandit": {:hex, :bandit, "1.5.6", "688a68be4246c5c28a457ab2886585205025a7c86285ab030ee0993ed14a0239", [:mix], [{:hpax, "~> 1.0.0", [hex: :hpax, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:thousand_island, "~> 1.0", [hex: :thousand_island, repo: "hexpm", optional: false]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "26b5e81b917e8353df2b61b831c223b78c7ebecf9607ed80f5a594f99b19d722"},
+ "bandit": {:hex, :bandit, "1.5.5", "df28f1c41f745401fe9e85a6882033f5f3442ab6d30c8a2948554062a4ab56e0", [:mix], [{:hpax, "~> 0.2.0", [hex: :hpax, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:thousand_island, "~> 1.0", [hex: :thousand_island, repo: "hexpm", optional: false]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "f21579a29ea4bc08440343b2b5f16f7cddf2fea5725d31b72cf973ec729079e1"},
"bcrypt_elixir": {:hex, :bcrypt_elixir, "3.1.0", "0b110a9a6c619b19a7f73fa3004aa11d6e719a67e672d1633dc36b6b2290a0f7", [:make, :mix], [{:comeonin, "~> 5.3", [hex: :comeonin, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.6", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "2ad2acb5a8bc049e8d5aa267802631912bb80d5f4110a178ae7999e69dca1bf7"},
"bunt": {:hex, :bunt, "1.0.0", "081c2c665f086849e6d57900292b3a161727ab40431219529f13c4ddcf3e7a44", [:mix], [], "hexpm", "dc5f86aa08a5f6fa6b8096f0735c4e76d54ae5c9fa2c143e5a1fc7c1cd9bb6b5"},
"castore": {:hex, :castore, "1.0.8", "dedcf20ea746694647f883590b82d9e96014057aff1d44d03ec90f36a5c0dc6e", [:mix], [], "hexpm", "0b2b66d2ee742cb1d9cb8c8be3b43c3a70ee8651f37b75a8b982e036752983f1"},
@@ -21,25 +21,25 @@
"expo": {:hex, :expo, "0.5.2", "beba786aab8e3c5431813d7a44b828e7b922bfa431d6bfbada0904535342efe2", [:mix], [], "hexpm", "8c9bfa06ca017c9cb4020fabe980bc7fdb1aaec059fd004c2ab3bff03b1c599c"},
"file_system": {:hex, :file_system, "1.0.0", "b689cc7dcee665f774de94b5a832e578bd7963c8e637ef940cd44327db7de2cd", [:mix], [], "hexpm", "6752092d66aec5a10e662aefeed8ddb9531d79db0bc145bb8c40325ca1d8536d"},
"finch": {:hex, :finch, "0.18.0", "944ac7d34d0bd2ac8998f79f7a811b21d87d911e77a786bc5810adb75632ada4", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "69f5045b042e531e53edc2574f15e25e735b522c37e2ddb766e15b979e03aa65"},
- "flame": {:hex, :flame, "0.3.0", "6938cd027d4de6c4599145d23c8242afd35a20ddda278e8cf816ce49fd0e068d", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: true]}, {:jason, ">= 0.0.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm", "263ffb2f8eaffdcaa3241072e515cb6af86c0280c763a3986934b039cac36300"},
+ "flame": {:hex, :flame, "0.2.0", "6a87cfc9fde4d51899a90db209254479b1b9ee14e9e6080027497662187bcd83", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: true]}, {:jason, ">= 0.0.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "eba9c68d9804fb6f437a76b077455886f33c34cc2ccb880af61ba7facfafd093"},
"floki": {:hex, :floki, "0.36.2", "a7da0193538c93f937714a6704369711998a51a6164a222d710ebd54020aa7a3", [:mix], [], "hexpm", "a8766c0bc92f074e5cb36c4f9961982eda84c5d2b8e979ca67f5c268ec8ed580"},
"gettext": {:hex, :gettext, "0.24.0", "6f4d90ac5f3111673cbefc4ebee96fe5f37a114861ab8c7b7d5b30a1108ce6d8", [:mix], [{:expo, "~> 0.5.1", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "bdf75cdfcbe9e4622dd18e034b227d77dd17f0f133853a1c73b97b3d6c770e8b"},
"hackney": {:hex, :hackney, "1.20.1", "8d97aec62ddddd757d128bfd1df6c5861093419f8f7a4223823537bad5d064e2", [:rebar3], [{:certifi, "~> 2.12.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "fe9094e5f1a2a2c0a7d10918fee36bfec0ec2a979994cff8cfe8058cd9af38e3"},
- "hpax": {:hex, :hpax, "1.0.0", "28dcf54509fe2152a3d040e4e3df5b265dcb6cb532029ecbacf4ce52caea3fd2", [:mix], [], "hexpm", "7f1314731d711e2ca5fdc7fd361296593fc2542570b3105595bb0bc6d0fad601"},
+ "hpax": {:hex, :hpax, "0.2.0", "5a58219adcb75977b2edce5eb22051de9362f08236220c9e859a47111c194ff5", [:mix], [], "hexpm", "bea06558cdae85bed075e6c036993d43cd54d447f76d8190a8db0dc5893fa2f1"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"image": {:hex, :image, "0.53.0", "77ba25c41992a2f230ef991040e110a0558badc22d83cb9a0faf9b68209a3961", [:mix], [{:bumblebee, "~> 0.3", [hex: :bumblebee, repo: "hexpm", optional: true]}, {:evision, "~> 0.1.33 or ~> 0.2", [hex: :evision, repo: "hexpm", optional: true]}, {:exla, "~> 0.5", [hex: :exla, repo: "hexpm", optional: true]}, {:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: true]}, {:kino, "~> 0.13", [hex: :kino, repo: "hexpm", optional: true]}, {:nx, "~> 0.7", [hex: :nx, repo: "hexpm", optional: true]}, {:nx_image, "~> 0.1", [hex: :nx_image, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 2.1 or ~> 3.2 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.13", [hex: :plug, repo: "hexpm", optional: true]}, {:req, "~> 0.4", [hex: :req, repo: "hexpm", optional: true]}, {:rustler, "> 0.0.0", [hex: :rustler, repo: "hexpm", optional: true]}, {:scholar, "~> 0.3", [hex: :scholar, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.7", [hex: :sweet_xml, repo: "hexpm", optional: false]}, {:vix, "~> 0.23", [hex: :vix, repo: "hexpm", optional: false]}], "hexpm", "ce06fff64b4dcf34a64c2b0dc907d0ce51cca85566912a9d5b8ec3378fe3e902"},
- "jason": {:hex, :jason, "1.4.4", "b9226785a9aa77b6857ca22832cffa5d5011a667207eb2a0ad56adb5db443b8a", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "c5eb0cab91f094599f94d55bc63409236a8ec69a21a67814529e8d5f6cc90b3b"},
+ "jason": {:hex, :jason, "1.4.3", "d3f984eeb96fe53b85d20e0b049f03e57d075b5acda3ac8d465c969a2536c17b", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "9a90e868927f7c777689baa16d86f4d0e086d968db5c05d917ccff6d443e58a3"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mime": {:hex, :mime, "2.0.6", "8f18486773d9b15f95f4f4f1e39b710045fa1de891fada4516559967276e4dc2", [:mix], [], "hexpm", "c9945363a6b26d747389aac3643f8e0e09d30499a138ad64fe8fd1d13d9b153e"},
"mimerl": {:hex, :mimerl, "1.3.0", "d0cd9fc04b9061f82490f6581e0128379830e78535e017f7780f37fea7545726", [:rebar3], [], "hexpm", "a1e15a50d1887217de95f0b9b0793e32853f7c258a5cd227650889b38839fe9d"},
"mint": {:hex, :mint, "1.6.2", "af6d97a4051eee4f05b5500671d47c3a67dac7386045d87a904126fd4bbcea2e", [:mix], [{:castore, "~> 0.1.0 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1 or ~> 0.2.0 or ~> 1.0", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "5ee441dffc1892f1ae59127f74afe8fd82fda6587794278d924e4d90ea3d63f9"},
"mix_audit": {:hex, :mix_audit, "2.1.4", "0a23d5b07350cdd69001c13882a4f5fb9f90fbd4cbf2ebc190a2ee0d187ea3e9", [:make, :mix], [{:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: false]}, {:yaml_elixir, "~> 2.11", [hex: :yaml_elixir, repo: "hexpm", optional: false]}], "hexpm", "fd807653cc8c1cada2911129c7eb9e985e3cc76ebf26f4dd628bb25bbcaa7099"},
- "money": {:hex, :money, "1.13.1", "b437196bf698f85d2cad33ac3f65e1bc43a94673fddbf65605fe3a77922e208a", [:mix], [{:decimal, "~> 1.2 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}, {:ecto, "~> 2.1 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 2.0 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "d9719b775652b6249fa80aeecc5e505a5eeeab73f4f30ecd3f696c6830281dc5"},
+ "money": {:hex, :money, "1.12.4", "9d9817aa79d1317871f6b006721c264bf1910fb28ba2af50746514f0d7e8ddbe", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}, {:ecto, "~> 1.0 or ~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 2.0 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "87e4bb907df1da184cb4640569d8df99ee6d88c84ce4f5da03cb2fab8d433eb9"},
"mox": {:hex, :mox, "1.1.0", "0f5e399649ce9ab7602f72e718305c0f9cdc351190f72844599545e4996af73c", [:mix], [], "hexpm", "d44474c50be02d5b72131070281a5d3895c0e7a95c780e90bc0cfe712f633a13"},
"nimble_options": {:hex, :nimble_options, "1.1.1", "e3a492d54d85fc3fd7c5baf411d9d2852922f66e69476317787a7b2bb000a61b", [:mix], [], "hexpm", "821b2470ca9442c4b6984882fe9bb0389371b8ddec4d45a9504f00a66f650b44"},
"nimble_ownership": {:hex, :nimble_ownership, "0.3.1", "99d5244672fafdfac89bfad3d3ab8f0d367603ce1dc4855f86a1c75008bce56f", [:mix], [], "hexpm", "4bf510adedff0449a1d6e200e43e57a814794c8b5b6439071274d248d272a549"},
"nimble_pool": {:hex, :nimble_pool, "1.1.0", "bf9c29fbdcba3564a8b800d1eeb5a3c58f36e1e11d7b7fb2e084a643f645f06b", [:mix], [], "hexpm", "af2e4e6b34197db81f7aad230c1118eac993acc0dae6bc83bac0126d4ae0813a"},
- "oban": {:hex, :oban, "2.18.0", "092d20bfd3d70c7ecb70960f8548d300b54bb9937c7f2e56b388f3a9ed02ec68", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "aace1eff6f8227ae38d4274af967d96f051c2f0a5152f2ef9809dd1f97866745"},
+ "oban": {:hex, :oban, "2.17.12", "33fb0cbfb92b910d48dd91a908590fe3698bb85eacec8cd0d9bc6aa13dddd6d6", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7a647d6cd6bb300073db17faabce22d80ae135da3baf3180a064fa7c4fa046e3"},
"parse_trans": {:hex, :parse_trans, "3.4.1", "6e6aa8167cb44cc8f39441d05193be6e6f4e7c2946cb2759f015f8c56b76e5ff", [:rebar3], [], "hexpm", "620a406ce75dada827b82e453c19cf06776be266f5a67cff34e1ef2cbb60e49a"},
"phoenix": {:hex, :phoenix, "1.7.14", "a7d0b3f1bc95987044ddada111e77bd7f75646a08518942c72a8440278ae7825", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "c7859bc56cc5dfef19ecfc240775dae358cbaa530231118a9e014df392ace61a"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.6.2", "3b83b24ab5a2eb071a20372f740d7118767c272db386831b2e77638c4dcc606d", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.1", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "3f94d025f59de86be00f5f8c5dd7b5965a3298458d21ab1c328488be3b5fcd59"},
@@ -52,15 +52,15 @@
"plug": {:hex, :plug, "1.16.1", "40c74619c12f82736d2214557dedec2e9762029b2438d6d175c5074c933edc9d", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a13ff6b9006b03d7e33874945b2755253841b238c34071ed85b0e86057f8cddc"},
"plug_crypto": {:hex, :plug_crypto, "2.1.0", "f44309c2b06d249c27c8d3f65cfe08158ade08418cf540fd4f72d4d6863abb7b", [:mix], [], "hexpm", "131216a4b030b8f8ce0f26038bc4421ae60e4bb95c5cf5395e1421437824c4fa"},
"postgrex": {:hex, :postgrex, "0.18.0", "f34664101eaca11ff24481ed4c378492fed2ff416cd9b06c399e90f321867d7e", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "a042989ba1bc1cca7383ebb9e461398e3f89f868c92ce6671feb7ef132a252d1"},
- "req": {:hex, :req, "0.5.4", "e375e4812adf83ffcf787871d7a124d873e983e3b77466e6608b973582f7f837", [:mix], [{:brotli, "~> 0.3.1", [hex: :brotli, repo: "hexpm", optional: true]}, {:ezstd, "~> 1.0", [hex: :ezstd, repo: "hexpm", optional: true]}, {:finch, "~> 0.17", [hex: :finch, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mime, "~> 2.0.6 or ~> 2.1", [hex: :mime, repo: "hexpm", optional: false]}, {:nimble_csv, "~> 1.0", [hex: :nimble_csv, repo: "hexpm", optional: true]}, {:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "a17998ffe2ef54f79bfdd782ef9f4cbf987d93851e89444cbc466a6a25eee494"},
+ "req": {:hex, :req, "0.5.2", "70b4976e5fbefe84e5a57fd3eea49d4e9aa0ac015301275490eafeaec380f97f", [:mix], [{:brotli, "~> 0.3.1", [hex: :brotli, repo: "hexpm", optional: true]}, {:ezstd, "~> 1.0", [hex: :ezstd, repo: "hexpm", optional: true]}, {:finch, "~> 0.17", [hex: :finch, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mime, "~> 2.0.6 or ~> 2.1", [hex: :mime, repo: "hexpm", optional: false]}, {:nimble_csv, "~> 1.0", [hex: :nimble_csv, repo: "hexpm", optional: true]}, {:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "0c63539ab4c2d6ced6114d2684276cef18ac185ee00674ee9af4b1febba1f986"},
"resend": {:hex, :resend, "0.4.2", "52f27c83068fbc558cc372df95978fe369da6fb2c7efc83ae316d41afa51bc0b", [:mix], [{:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:swoosh, "~> 1.3", [hex: :swoosh, repo: "hexpm", optional: false]}, {:tesla, "~> 1.5", [hex: :tesla, repo: "hexpm", optional: false]}], "hexpm", "bfec17802467c4a81190c576edcae816e882224c0e90573adae9fb45c29b6528"},
"sentry": {:hex, :sentry, "10.6.2", "a867ab728d424e187ccb2bccc388170a740a79bc0ddccabd72d303b203acbe0e", [:mix], [{:hackney, "~> 1.8", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:nimble_options, "~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_ownership, "~> 0.3.0", [hex: :nimble_ownership, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.6", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_live_view, "~> 0.20", [hex: :phoenix_live_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.6", [hex: :plug, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "31bb84247274f9262fd300df0e3eb73302e4849cc6b7a6560bb2465f03fbd446"},
"sobelow": {:hex, :sobelow, "0.13.0", "218afe9075904793f5c64b8837cc356e493d88fddde126a463839351870b8d1e", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "cd6e9026b85fc35d7529da14f95e85a078d9dd1907a9097b3ba6ac7ebbe34a0d"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.7", "354c321cf377240c7b8716899e182ce4890c5938111a1296add3ec74cf1715df", [:make, :mix, :rebar3], [], "hexpm", "fe4c190e8f37401d30167c8c405eda19469f34577987c76dde613e838bbc67f8"},
"styler": {:hex, :styler, "1.0.0-rc.2", "3fb6949ef1fa07128415631827d6331523923ca9d38c2f113bea5f6c98428440", [:mix], [], "hexpm", "35f8f0eca03c9547f8f1319aa6a95105c6a23586177dbb132e1a5c31d8c708dd"},
"sweet_xml": {:hex, :sweet_xml, "0.7.4", "a8b7e1ce7ecd775c7e8a65d501bc2cd933bff3a9c41ab763f5105688ef485d08", [:mix], [], "hexpm", "e7c4b0bdbf460c928234951def54fe87edf1a170f6896675443279e2dbeba167"},
- "swoosh": {:hex, :swoosh, "1.16.10", "04be6e2eb1a31aa0aa21a731175c81cc3998189456a92daf13d44a5c754afcf5", [:mix], [{:bandit, ">= 1.0.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mua, "~> 0.2.3", [hex: :mua, repo: "hexpm", optional: true]}, {:multipart, "~> 0.4", [hex: :multipart, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.5 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "756be04db173c0cbe318f1dfe2bcc88aa63aed78cf5a4b02b61b36ee11fc716a"},
- "tabler_icons": {:git, "https://github.com/tabler/tabler-icons.git", "efa07c00285dabe522803946fc0c662005e02460", [branch: "main", sparse: "icons"]},
+ "swoosh": {:hex, :swoosh, "1.16.9", "20c6a32ea49136a4c19f538e27739bb5070558c0fa76b8a95f4d5d5ca7d319a1", [:mix], [{:bandit, ">= 1.0.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mua, "~> 0.2.0", [hex: :mua, repo: "hexpm", optional: true]}, {:multipart, "~> 0.4", [hex: :multipart, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.5 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "878b1a7a6c10ebbf725a3349363f48f79c5e3d792eb621643b0d276a38acc0a6"},
+ "tabler_icons": {:git, "https://github.com/tabler/tabler-icons.git", "efa07c00285dabe522803946fc0c662005e02460", [branch: "main", sparse: "icons", depth: 1]},
"tailwind": {:hex, :tailwind, "0.2.3", "277f08145d407de49650d0a4685dc062174bdd1ae7731c5f1da86163a24dfcdb", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}], "hexpm", "8e45e7a34a676a7747d04f7913a96c770c85e6be810a1d7f91e713d3a3655b5d"},
"tailwind_formatter": {:hex, :tailwind_formatter, "0.4.0", "2aa6f391b3b6fe52a18fe75b75c76d541c3478a31bcfc1ebe987819fa32653bf", [:mix], [], "hexpm", "6465afc3e864937fcb477524be45a2d9d73a1b1ee904d85f1592b80dbcb0b741"},
"telemetry": {:hex, :telemetry, "1.2.1", "68fdfe8d8f05a8428483a97d7aab2f268aaff24b49e0f599faa091f1d4e7f61c", [:rebar3], [], "hexpm", "dad9ce9d8effc621708f99eac538ef1cbe05d6a874dd741de2e689c47feafed5"},
|
uneebee
|
zoonk
|
Elixir
|
Elixir
| 1,339
| 83
|
Platform for creating interactive courses.
|
zoonk_uneebee
|
CONFIG_CHANGE
|
version updates are done
|
2d79aeb117a4532761ffd26c4093f6cbf4a8d337
|
2023-06-04 22:13:35
|
Romain Vimont
|
Simplify command in documentation If --no-video is passed, --no-playback is equivalent to --no-audio-playback.
| false
| 1
| 1
| 2
|
--- doc/audio.md
@@ -56,7 +56,7 @@ For example, to use the device as a dictaphone and record a capture directly on
the computer:
```
-scrcpy --audio-source=mic --no-video --no-playback --record=file.opus
+scrcpy --audio-source=mic --no-video --no-audio-playback --record=file.opus
```
| scrcpy | genymobile | C | C | 118,486 | 11,201 | Display and control your Android device | genymobile_scrcpy | DOC_CHANGE | changes in md file |
| bc0a0a86516fc47c362b90042a154c82305828ae | null | Jerry Nieuviarts | Clone commande do not get submodules recursively (#320) | false | 1 | 1 | 0 |
--- README.md
@@ -99,7 +99,7 @@ Once you completed the above steps, you can build and test SyntaxNet with the
following commands:
```shell
- git clone --recursive https://github.com/tensorflow/models.git
+ git clone --recursive --recurse-submodules https://github.com/tensorflow/models.git
cd models/syntaxnet/tensorflow
./configure
cd ..
| tensorflow_models.json | null | null | null | null | null | null | tensorflow_models.json | CONFIG_CHANGE | 5, obvious |
| 728de0e2a75e9b2358ba7c8902cf3a28453c5f4e | 2023-02-07 07:31:20 | Jon Shier | Move CI to Xcode 14.2 (#3685) ### Goals :soccer: This PR updates the CI runner to use Xcode 14.2. | false | 13 | 6 | 19 |
--- .github/workflows/ci.yml
@@ -31,10 +31,6 @@ jobs:
fail-fast: false
matrix:
include:
- - xcode: "Xcode_14.2.app"
- runsOn: macOS-12
- name: "macOS 12, Xcode 14.2, Swift 5.7.2"
- testPlan: "macOS"
- xcode: "Xcode_14.1.app"
runsOn: macOS-12
name: "macOS 12, Xcode 14.1, Swift 5.7.1"
@@ -73,7 +69,7 @@ jobs:
name: Test Catalyst
runs-on: macOS-12
env:
- DEVELOPER_DIR: /Applications/Xcode_14.2.app/Contents/Developer
+ DEVELOPER_DIR: /Applications/Xcode_14.1.app/Contents/Developer
timeout-minutes: 10
steps:
- uses: actions/checkout@v3
@@ -85,13 +81,13 @@ jobs:
name: Test Latest (iOS, tvOS, watchOS)
runs-on: macOS-12
env:
- DEVELOPER_DIR: "/Applications/Xcode_14.2.app/Contents/Developer"
+ DEVELOPER_DIR: "/Applications/Xcode_14.1.app/Contents/Developer"
timeout-minutes: 10
strategy:
fail-fast: false
matrix:
include:
- - destination: "OS=16.2,name=iPhone 14 Pro"
+ - destination: "OS=16.1,name=iPhone 14 Pro"
name: "iOS"
scheme: "Alamofire iOS"
- destination: "OS=16.1,name=Apple TV"
@@ -110,7 +106,7 @@ jobs:
name: "Test Old iOS"
runs-on: firebreak
env:
- DEVELOPER_DIR: "/Applications/Xcode_14.2.app/Contents/Developer"
+ DEVELOPER_DIR: "/Applications/Xcode_14.1.app/Contents/Developer"
timeout-minutes: 10
strategy:
fail-fast: false
@@ -138,7 +134,7 @@ jobs:
name: Test Old tvOS
runs-on: firebreak
env:
- DEVELOPER_DIR: /Applications/Xcode_14.2.app/Contents/Developer
+ DEVELOPER_DIR: /Applications/Xcode_14.1.app/Contents/Developer
timeout-minutes: 10
strategy:
fail-fast: false
@@ -166,7 +162,7 @@ jobs:
name: Test Old watchOS
runs-on: firebreak
env:
- DEVELOPER_DIR: /Applications/Xcode_14.2.app/Contents/Developer
+ DEVELOPER_DIR: /Applications/Xcode_14.1.app/Contents/Developer
timeout-minutes: 10
strategy:
fail-fast: false
@@ -194,9 +190,6 @@ jobs:
fail-fast: false
matrix:
include:
- - xcode: "Xcode_14.2.app"
- runsOn: macOS-12
- name: "macOS 12, SPM 5.7.2 Test"
- xcode: "Xcode_14.1.app"
runsOn: macOS-12
name: "macOS 12, SPM 5.7.1 Test"
|
alamofire
|
alamofire
|
Swift
|
Swift
| 41,720
| 7,598
|
Elegant HTTP Networking in Swift
|
alamofire_alamofire
|
CODE_IMPROVEMENT
|
Code change: type annotation added
|
9138035320657b66f63f111793509991e50c1d7d
|
2023-11-07 22:34:28
|
boldkoala4615
|
Clarify handling of snapshots in README (#141) * docs: clarify that snapshots have to be decompressed
* Update README.md
docs: clarify that a volume has to be mounted for snapshots
| false
| 7
| 3
| 10
|
--- README.md
@@ -59,7 +59,7 @@ If you encounter problems with your node, please open a [GitHub issue](https://g
docker compose up --build
```
-4. You should now be able to `curl` your Base node:
+3. You should now be able to `curl` your Base node:
```
curl -d '{"id":0,"jsonrpc":"2.0","method":"eth_getBlockByNumber","params":["latest",false]}' \
@@ -68,7 +68,7 @@ curl -d '{"id":0,"jsonrpc":"2.0","method":"eth_getBlockByNumber","params":["late
Note: Some L1 nodes (e.g. Erigon) do not support fetching storage proofs. You can work around this by specifying `--l1.trustrpc` when starting op-node (add it in `op-node-entrypoint` and rebuild the docker image with `docker compose build`.) Do not do this unless you fully trust the L1 node provider.
-5. Map a local data directory for `op-geth` by adding a volume mapping to the `docker-compose.yaml`:
+You can map a local data directory for `op-geth` by adding a volume mapping to the `docker-compose.yaml`:
```yaml
services:
@@ -78,8 +78,6 @@ services:
- $HOME/data/base:/data
```
-This is where your node data will be stored. This is for example where you would extract your [snapshot](#snapshots) to.
-
#### Running in single container with `supervisord`
If you'd like to run the node in a single container instead of `docker-compose`, you can use the `supervisord` entrypoint.
@@ -93,7 +91,7 @@ docker run --env-file .env.goerli -e OP_NODE_L2_ENGINE_RPC=ws://localhost:8551 -
### Snapshots
-If you're a prospective or current Base Node operator and would like to restore from a snapshot to save time on the initial sync, it's always possible to download and decompress the latest available snapshot of the Base chain on mainnet and/or testnet by using the following CLI commands. The snapshots are updated every hour.
+If you're a prospective or current Base Node operator and would like to restore from a snapshot to save time on the initial sync, it's possible to always get the latest available snapshot of the Base chain on mainnet and/or testnet by using the following CLI commands. The snapshots are updated every hour.
**Mainnet**
@@ -107,8 +105,6 @@ wget https://base-mainnet-archive-snapshots.s3.us-east-1.amazonaws.com/$(curl ht
wget https://base-goerli-archive-snapshots.s3.us-east-1.amazonaws.com/$(curl https://base-goerli-archive-snapshots.s3.us-east-1.amazonaws.com/latest)
```
-Use `tar -xvf` to decompress the downloaded archive to the local data directory you previously configured a volume mapping for.
-
### Syncing
Sync speed depends on your L1 node, as the majority of the chain is derived from data submitted to the L1. You can check your syncing status using the `optimism_syncStatus` RPC on the `op-node` container. Example:
| node | base | Shell | Shell | 68,555 | 2,658 | Everything required to run your own Base node | base_node | DOC_CHANGE | changes in readme |
| 2f7a40204088b5c5c9e9eee5e0e7b4445105c28d | null | Joseph C. Miller, II | Use yellow instead of red for termination message | false | 1 | 1 | 0 |
--- main.py
@@ -186,7 +186,7 @@ def construct_prompt():
if config.ai_name:
logger.typewriter_log(
f"Welcome back! ",
- Fore.GREEN,
+ Fore.YELLOW,
f"Would you like me to return to being {config.ai_name}?",
speak_text=True)
should_continue = utils.clean_input(f"""Continue with the last settings?
| Significant-Gravitas_AutoGPT.json | null | null | null | null | null | null | Significant-Gravitas_AutoGPT.json | NEW_FEAT | 5, obvious |
| 8bd31758b9afff8043d36b5087a300a92d3ae359 | 2024-09-22 13:45:02 | dependabot[bot] | build(deps): bump org.apache.maven.plugins:maven-surefire-plugin (#3031) Bumps [org.apache.maven.plugins:maven-surefire-plugin](https://github.com/apache/maven-surefire) from 3.3.1 to 3.5.0. - [Release notes](https://github.com/apache/maven-surefire/releases) - [Commits](https://github.com/apache/maven-surefire/compare/surefire-3.3.1...surefire-3.5.0) --- updated-dependencies: - dependency-name: org.apache.maven.plugins:maven-surefire-plugin dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] <[email protected]> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> | false | 1 | 1 | 2 |
--- pom.xml
@@ -44,7 +44,7 @@
<gson.version>2.11.0</gson.version>
<guice.version>6.0.0</guice.version>
<system-lambda.version>1.1.0</system-lambda.version>
- <maven-surefire-plugin.version>3.5.0</maven-surefire-plugin.version>
+ <maven-surefire-plugin.version>3.3.1</maven-surefire-plugin.version>
<maven-checkstyle-plugin.version>3.4.0</maven-checkstyle-plugin.version>
<license-maven-plugin.version>4.5</license-maven-plugin.version>
<urm-maven-plugin.version>2.1.1</urm-maven-plugin.version>
| java-design-patterns | iluwatar | Java | Java | 90,911 | 26,831 | Design patterns implemented in Java | iluwatar_java-design-patterns | CONFIG_CHANGE | Version/release update |
| 53eaf3e4e444e5cf26e70a94d09f60af976edf2b | 2025-03-04 04:55:46 | Riccardo Cipolleschi | Back out "fix: avoid race condition crash in [RCTDataRequestHandler invalidate]" (#49797) Summary: Pull Request resolved: https://github.com/facebook/react-native/pull/49797 Backing D70314889 as it was breaking some internal tests. I verified that before the backout the tests were failing and after the backout they were not. ## Changelog: [iOS][Changed] - Reverted fix: avoid race condition crash in [RCTDataRequestHandler invalidate]. Reviewed By: Abbondanzo Differential Revision: D70511155 fbshipit-source-id: 276f6947aa6bb648c9c9eeb5c342f336acc8a26f | false | 11 | 21 | 32 |
--- packages/react-native/Libraries/Network/RCTDataRequestHandler.mm
@@ -49,8 +49,13 @@ - (NSOperation *)sendRequest:(NSURLRequest *)request withDelegate:(id<RCTURLRequ
_queue.maxConcurrentOperationCount = 2;
}
- __weak __block NSBlockOperation *weakOp;
- __block NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
+ __weak NSBlockOperation *weakOp;
+ NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
+ NSBlockOperation *strongOp = weakOp; // Strong reference to avoid deallocation during execution
+ if (strongOp == nil || [strongOp isCancelled]) {
+ return;
+ }
+
// Get mime type
NSRange firstSemicolon = [request.URL.resourceSpecifier rangeOfString:@";"];
NSString *mimeType =
@@ -62,15 +67,15 @@ - (NSOperation *)sendRequest:(NSURLRequest *)request withDelegate:(id<RCTURLRequ
expectedContentLength:-1
textEncodingName:nil];
- [delegate URLRequest:weakOp didReceiveResponse:response];
+ [delegate URLRequest:strongOp didReceiveResponse:response];
// Load data
NSError *error;
NSData *data = [NSData dataWithContentsOfURL:request.URL options:NSDataReadingMappedIfSafe error:&error];
if (data) {
- [delegate URLRequest:weakOp didReceiveData:data];
+ [delegate URLRequest:strongOp didReceiveData:data];
}
- [delegate URLRequest:weakOp didCompleteWithError:error];
+ [delegate URLRequest:strongOp didCompleteWithError:error];
}];
weakOp = op;
--- packages/react-native/Libraries/Network/RCTFileRequestHandler.mm
@@ -53,14 +53,19 @@ - (NSOperation *)sendRequest:(NSURLRequest *)request withDelegate:(id<RCTURLRequ
_fileQueue.maxConcurrentOperationCount = 4;
}
- __weak __block NSBlockOperation *weakOp;
- __block NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
+ __weak NSBlockOperation *weakOp;
+ NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
+ NSBlockOperation *strongOp = weakOp; // Strong reference to avoid deallocation during execution
+ if (strongOp == nil || [strongOp isCancelled]) {
+ return;
+ }
+
// Get content length
NSError *error = nil;
NSFileManager *fileManager = [NSFileManager new];
NSDictionary<NSString *, id> *fileAttributes = [fileManager attributesOfItemAtPath:request.URL.path error:&error];
if (!fileAttributes) {
- [delegate URLRequest:weakOp didCompleteWithError:error];
+ [delegate URLRequest:strongOp didCompleteWithError:error];
return;
}
@@ -77,14 +82,14 @@ - (NSOperation *)sendRequest:(NSURLRequest *)request withDelegate:(id<RCTURLRequ
expectedContentLength:[fileAttributes[NSFileSize] ?: @-1 integerValue]
textEncodingName:nil];
- [delegate URLRequest:weakOp didReceiveResponse:response];
+ [delegate URLRequest:strongOp didReceiveResponse:response];
// Load data
NSData *data = [NSData dataWithContentsOfURL:request.URL options:NSDataReadingMappedIfSafe error:&error];
if (data) {
- [delegate URLRequest:weakOp didReceiveData:data];
+ [delegate URLRequest:strongOp didReceiveData:data];
}
- [delegate URLRequest:weakOp didCompleteWithError:error];
+ [delegate URLRequest:strongOp didCompleteWithError:error];
}];
weakOp = op;
| react-native | facebook | C++ | C++ | 120,863 | 24,536 | A framework for building native applications using React | facebook_react-native | BUG_FIX | Obvious |
| e0c76d95abfc1621259864adb3d101cf6f1f90fc | 2024-12-11 21:28:51 | Kir Kolyshkin | syscall: remove a wrong comment in Clearenv The comment being removed was added by commit ff3173849e (which predates Gerrit and Rietveld, so no CL link), and at the time it made sense. Since CL 148370043 (and up to the current implementation of Clearenv) the env map, which is populated by copyenv, is actually used, so the comment is no longer valid. It is also misleading, so it's best to remove it. Change-Id: I8bd2e8bca6262759538e5bcbd396f0c71cca6a4c Reviewed-on: https://go-review.googlesource.com/c/go/+/635078 Reviewed-by: Carlos Amedee <[email protected]> Reviewed-by: Ian Lance Taylor <[email protected]> LUCI-TryBot-Result: Go LUCI <[email protected]> Auto-Submit: Ian Lance Taylor <[email protected]> | false | 1 | 1 | 2 |
--- src/syscall/env_unix.go
@@ -124,7 +124,7 @@ func Setenv(key, value string) error {
}
func Clearenv() {
- envOnce.Do(copyenv)
+ envOnce.Do(copyenv) // prevent copyenv in Getenv/Setenv
envLock.Lock()
defer envLock.Unlock()
| go | golang | Go | Go | 126,191 | 17,926 | The Go programming language | golang_go | CONFIG_CHANGE | Very small changes |
| d73e0e1e5a2d1b785a51c479f6c7dcd0e49937f5 | 2024-07-24 16:11:52 | dignow | fix: video service (#8812) 1. Unset refresh flag if just refreshed. 2. Reduce the scope of the lock. Signed-off-by: dignow <[email protected]> | false | 10 | 4 | 14 |
--- src/flutter.rs
@@ -1063,7 +1063,7 @@ impl FlutterHandler {
}
// We need `is_sent` here. Because we use texture render for multi-displays session.
//
- // Eg. We have two windows, one is display 1, the other is displays 0&1.
+ // Eg. We have to windows, one is display 1, the other is displays 0&1.
// When image of display 0 is received, we will not send the event.
//
// 1. "display 1" will not send the event.
--- src/server/video_service.rs
@@ -517,22 +517,16 @@ fn run(vs: VideoService) -> ResultType<()> {
drop(video_qos);
if sp.is_option_true(OPTION_REFRESH) {
- if LAST_REFRESH_TIME
- .lock()
- .unwrap()
+ let mut last_refresh_lock = LAST_REFRESH_TIME.lock().unwrap();
+ if last_refresh_lock
.get(&vs.idx)
.map(|x| x.elapsed().as_millis() > REFRESH_MIN_INTERVAL_MILLIS)
.unwrap_or(true)
{
let _ = try_broadcast_display_changed(&sp, display_idx, &c, true);
- LAST_REFRESH_TIME
- .lock()
- .unwrap()
- .insert(vs.idx, Instant::now());
+ last_refresh_lock.insert(vs.idx, Instant::now());
log::info!("switch to refresh");
bail!("SWITCH");
- } else {
- sp.set_option_bool(OPTION_REFRESH, false);
}
}
if codec_format != Encoder::negotiated_codec() {
| rustdesk | rustdesk | Rust | Rust | 83,345 | 11,693 | An open-source remote desktop application designed for self-hosting, as an alternative to TeamViewer. | rustdesk_rustdesk | BUG_FIX | obvious |
| 4b93ccef852da48993b1d6467fb30f22086ed1d2 | 2024-12-29 06:53:34 | Rawal Ritesh | fix(curriculum): fix typo for hint in meta description video (#57815) | false | 3 | 3 | 6 |
--- curriculum/challenges/english/25-front-end-development/lecture-html-fundamentals/67083952f800051a8a21fcfd.md
@@ -94,7 +94,7 @@ Inside the `figure` element.
### --feedback--
-Refer back to the middle of the video where there are examples showing where the page's descriptions typically show up.
+Refer back to the middle of the video where there are examples showing where the page's descriptions typically shows up.
---
@@ -102,7 +102,7 @@ Inside the `footer` element.
### --feedback--
-Refer back to the middle of the video where there are examples showing where the page's descriptions typically show up.
+Refer back to the middle of the video where there are examples showing where the page's descriptions typically shows up.
---
@@ -114,7 +114,7 @@ In a popup alert message.
### --feedback--
-Refer back to the middle of the video where there are examples showing where the page's descriptions typically show up.
+Refer back to the middle of the video where there are examples showing where the page's descriptions typically shows up.
## --video-solution--
| freecodecamp | freecodecamp | TypeScript | TypeScript | 410,748 | 39,092 | freeCodeCamp.org's open-source codebase and curriculum. Learn to code for free. | freecodecamp_freecodecamp | BUG_FIX | Matched \bfix(e[ds]|ing)?\b in message |
| 73d480d40171b101e37ad04d7195893f01c72052 | 2024-09-05 13:38:27 | John Kleinschmidt | build: fix telemetry error when using autoninja (#43563) | false | 4 | 0 | 4 |
--- appveyor-woa.yml
@@ -95,8 +95,6 @@ for:
- git clone --depth=1 https://chromium.googlesource.com/chromium/tools/depot_tools.git
- ps: New-Item -Name depot_tools\.disable_auto_update -ItemType File
- depot_tools\bootstrap\win_tools.bat
- - ps: |
- Set-Content -Path $pwd\depot_tools\build_telemetry.cfg -Value '{"user": "[email protected]", "status": "opt-out", "countdown": 10, "version": 1}'
- ps: $env:PATH="$pwd\depot_tools;$env:PATH"
- ps: >-
if (Test-Path -Path "$pwd\src\electron") {
--- appveyor.yml
@@ -93,8 +93,6 @@ for:
- git clone --depth=1 https://chromium.googlesource.com/chromium/tools/depot_tools.git
- ps: New-Item -Name depot_tools\.disable_auto_update -ItemType File
- depot_tools\bootstrap\win_tools.bat
- - ps: |
- Set-Content -Path $pwd\depot_tools\build_telemetry.cfg -Value '{"user": "[email protected]", "status": "opt-out", "countdown": 10, "version": 1}'
- ps: $env:PATH="$pwd\depot_tools;$env:PATH"
- ps: >-
if (Test-Path -Path "$pwd\src\electron") {
|
electron
|
electron
|
C++
|
C++
| 115,677
| 15,852
|
:electron: Build cross-platform desktop apps with JavaScript, HTML, and CSS
|
electron_electron
|
BUG_FIX
|
obvious
|
e771bd9159f7f9e5826c249547fed5876678c6c2
|
2023-07-15 14:37:28
|
Zhanyu Guo
|
finish d_star
| false
| 214
| 29
| 243
|
--- examples/simulation_global.mlx
Binary files a/examples/simulation_global.mlx and b/examples/simulation_global.mlx differ
--- examples/simulation_global_interactive.mlx
Binary files /dev/null and b/examples/simulation_global_interactive.mlx differ
--- examples/simulation_global_replan.mlx
Binary files a/examples/simulation_global_replan.mlx and /dev/null differ
--- examples/simulation_local.mlx
Binary files a/examples/simulation_local.mlx and b/examples/simulation_local.mlx differ
--- examples/simulation_total.mlx
Binary files a/examples/simulation_total.mlx and b/examples/simulation_total.mlx differ
--- global_planner/graph_search/d_star.m
@@ -3,7 +3,7 @@
% @breif: D* motion planning
% @paper: Optimal and Efficient Path Planning for Partially-Known Environments.
% @author: Zhanyu Guo
-% @update: 2023.7.15
+% @update: 2023.7.13
%
% ========= MAP =========
@@ -13,81 +13,42 @@
%
% initialize
-path = [];
-flag = false;
-cost = 0;
EXPAND = [];
siz = size(map);
-persistent MAP; % static variable
-if isempty(MAP) % first plan
- MAP = zeros(siz(1) * siz(2), 7);
- for y = 1:siz(2)
- for x = 1:siz(1)
- cur_ind = sub2ind(siz, x, y);
- MAP(cur_ind, 1) = x; % x
- MAP(cur_ind, 2) = y; % y
- MAP(cur_ind, 3) = 0; % tag = NEW
- MAP(cur_ind, 4) = Inf; % h
- MAP(cur_ind, 5) = Inf; % key
- end
+MAP = zeros(siz(1) * siz(2), 7);
+for y = 1:cols
+ for x = 1:rows
+ ind = sub2ind(siz, x, y);
+ MAP(ind, 1) = x; % x
+ MAP(ind, 2) = y; % y
+ MAP(ind, 3) = 0; % t
+ MAP(ind, 4) = Inf; % h
+ MAP(ind, 5) = Inf; % k
end
- start_ind = sub2ind(siz, start(1), start(2));
- goal_ind = sub2ind(siz, goal(1), goal(2));
- MAP = insert(MAP, goal_ind, 0);
-
- while 1
- [MAP, EXPAND, k_min] = processState(MAP, EXPAND, siz, map);
- if k_min == -1
- return
- end
+end
- if MAP(start_ind, 3) == 2
- flag = true;
- break
- end
- end
- % extract path
- [path, cost] = extract_path(MAP, siz, start_ind, map);
-else
- start_ind = sub2ind(siz, start(1), start(2));
- cur_ind = start_ind;
+MAP = insert(goal, 0, MAP, siz);
- while 1
- if isequal(MAP(cur_ind, 6:7), [0, 0])
- break
- end
- par_ind = sub2ind(siz, MAP(cur_ind, 6), MAP(cur_ind, 7));
- if isCollision(MAP(cur_ind, :), MAP(par_ind, :), map)
- [MAP, EXPAND] = modify(cur_ind, par_ind, MAP, siz, map, EXPAND);
- continue
- end
- cur_ind = par_ind;
- end
- [path, cost] = extract_path(MAP, siz, start_ind, map);
-end
+start_ind = sub2ind(siz, start(1), start(2));
-i = 1;
-expand_num = size(EXPAND, 1);
-while i <= expand_num
- if map(EXPAND(i, 1), EXPAND(i, 2)) == 2
- EXPAND(i, :) = [];
- expand_num = expand_num - 1;
- continue
+while 1
+ [MAP, k_min] = processState(MAP, siz);
+ if k_min == -1 || MAP(start_ind, 3) == 2
+ break
end
- i = i + 1;
end
-end
%%
-function MAP = insert(MAP, ind, h_new)
+function MAP = insert(node, h_new, MAP, siz)
%
% ========= MAP =========
% [x, y, t, h, k, px, py]
% =======================
% NEW = 0, OPEN = 1, CLOSED = 2
%
+ind = sub2ind(siz, node(1), node(2));
if MAP(ind, 3) == 0
MAP(ind, 5) = h_new;
elseif MAP(ind, 3) == 1
@@ -98,13 +59,11 @@
MAP(ind, 4) = h_new;
MAP(ind, 3) = 1;
-
end
-function [MAP, EXPAND, k_min] = processState(MAP, EXPAND, siz, map)
-% get open list
+function [MAP, k_min] = processState(MAP, siz)
+% get open
OPEN = MAP(MAP(:, 3) == 1, :);
-
if isempty(OPEN)
k_min = -1;
return
@@ -112,80 +71,22 @@
% get node with min k in open
[k_old, open_ind] = min(OPEN(:, 5));
-cur_ind = sub2ind(siz, OPEN(open_ind, 1), OPEN(open_ind, 2));
+map_ind = sub2ind(siz, OPEN(open_ind, 1), OPEN(open_ind, 2));
% set to closed
-MAP(cur_ind, 3) = 2;
-
-% add to expand
-if ~loc_list(MAP(cur_ind, :), EXPAND, [1, 2])
- EXPAND = [EXPAND; MAP(cur_ind, 1:2)];
-end
+MAP(map_ind, 3) = 2;
% get neighbors
-motion = [-1, -1; ...
- 0, -1; ...
- 1, -1; ...
- -1, 0; ...
- 1, 0; ...
- -1, 1; ...
- 0, 1; ...
- 1, 1];
-
-motion_num = size(motion, 1);
-neighbors = zeros(motion_num, 4);
-
-for i = 1:motion_num
- neighbors(i, 1) = MAP(cur_ind, 1) + motion(i, 1);
- neighbors(i, 2) = MAP(cur_ind, 2) + motion(i, 2);
+% TODO
- neb_ind = sub2ind(siz, neighbors(i, 1), neighbors(i, 2));
- neighbors(i, 3) = neb_ind;
- neighbors(i, 4) = getCost(MAP(cur_ind, :), MAP(neb_ind, :), map);
+if k_old < MAP(map_ind, 4)
+
end
-if k_old < MAP(cur_ind, 4)
- for i = 1:motion_num
- neb_ind = neighbors(i, 3);
- if MAP(neb_ind, 3) ~= 0 ...
- && MAP(neb_ind, 4) <= k_old ...
- && MAP(cur_ind, 4) > MAP(neb_ind, 4) + neighbors(i, 4)
- MAP(cur_ind, 6) = MAP(neb_ind, 1);
- MAP(cur_ind, 7) = MAP(neb_ind, 2);
- MAP(cur_ind, 4) = MAP(neb_ind, 4) + neighbors(i, 4);
- end
- end
-end
+if k_old == MAP(map_ind, 4)
-if k_old == MAP(cur_ind, 4)
- for i = 1:motion_num
- neb_ind = neighbors(i, 3);
- if MAP(neb_ind, 3) == 0 ...
- || ((MAP(neb_ind, 6) == MAP(cur_ind, 1) && MAP(neb_ind, 7) == MAP(cur_ind, 2)) && MAP(neb_ind, 4) ~= MAP(cur_ind, 4) + neighbors(i, 4)) ...
- || ((MAP(neb_ind, 6) ~= MAP(cur_ind, 1) || MAP(neb_ind, 7) ~= MAP(cur_ind, 2)) && MAP(neb_ind, 4) > MAP(cur_ind, 4) + neighbors(i, 4))
- MAP(neb_ind, 6) = MAP(cur_ind, 1);
- MAP(neb_ind, 7) = MAP(cur_ind, 2);
- MAP = insert(MAP, neb_ind, MAP(cur_ind, 4) + neighbors(i, 4));
- end
- end
else
- for i = 1:motion_num
- neb_ind = neighbors(i, 3);
- if MAP(neb_ind, 3) == 0 ...
- || ((MAP(neb_ind, 6) == MAP(cur_ind, 1) && MAP(neb_ind, 7) == MAP(cur_ind, 2)) && MAP(neb_ind, 4) ~= MAP(cur_ind, 4) + neighbors(i, 4))
- MAP(neb_ind, 6) = MAP(cur_ind, 1);
- MAP(neb_ind, 7) = MAP(cur_ind, 2);
- MAP = insert(MAP, neb_ind, MAP(cur_ind, 4) + neighbors(i, 4));
- elseif (MAP(neb_ind, 6) ~= MAP(cur_ind, 1) || MAP(neb_ind, 7) ~= MAP(cur_ind, 2)) ...
- && MAP(neb_ind, 4) > MAP(cur_ind, 4) + neighbors(i, 4)
- MAP = insert(MAP, cur_ind, MAP(cur_ind, 4));
- elseif (MAP(neb_ind, 6) ~= MAP(cur_ind, 1) || MAP(neb_ind, 7) ~= MAP(cur_ind, 2)) ...
- && MAP(cur_ind, 4) > MAP(neb_ind, 4) + neighbors(i, 4) ...
- && MAP(neb_ind, 3) == 2 ...
- && MAP(neb_ind, 4) > k_old
- MAP = insert(MAP, neb_ind, MAP(neb_ind, 4));
- end
- end
+
end
% get open
@@ -199,85 +100,3 @@
[k_min, ~] = min(OPEN(:, 5));
end
-
-function index = loc_list(node, list, range)
-% @breif: locate the node in given list
-num = size(list);
-index = 0;
-if ~num(1)
- return
-else
- for i = 1:num(1)
- if isequal(node(range), list(i, range))
- index = i;
- return
- end
- end
-end
-
-end
-
-function flag = isCollision(node1, node2, map)
-flag = false;
-
-if map(node1(1), node1(2)) == 2
- flag = true;
-end
-
-if map(node2(1), node2(2)) == 2
- flag = true;
-end
-
-end
-
-function cost = getCost(node1, node2, map)
-if isCollision(node1, node2, map)
- cost = Inf;
-else
- if abs(node1(1) - node2(1)) + abs(node1(2) - node2(2)) > 1
- cost = 1.414;
- else
- cost = 1;
- end
-end
-
-end
-
-function [path, cost] = extract_path(MAP, siz, start_ind, map)
-% @breif: Extract the path based on the CLOSED set.
-path = [];
-cost = 0;
-
-cur_ind = start_ind;
-path = [path; MAP(cur_ind, 1:2)];
-while 1
- if isequal(MAP(cur_ind, 6:7), [0, 0])
- break
- end
-
- par_ind = sub2ind(siz, MAP(cur_ind, 6), MAP(cur_ind, 7));
- cost = cost + getCost(MAP(cur_ind, :), MAP(par_ind, :), map);
-
- cur_ind = par_ind;
- path = [path; MAP(cur_ind, 1:2)];
-end
-
-end
-
-function [MAP, EXPAND] = modify(cur_ind, par_ind, MAP, siz, map, EXPAND)
-if MAP(cur_ind, 3) == 2
- MAP = insert(MAP, cur_ind, MAP(cur_ind, 4));
-end
-
-if MAP(par_ind, 3) == 2
- MAP = insert(MAP, par_ind, MAP(par_ind, 4));
-end
-
-while 1
- [MAP, EXPAND, k_min] = processState(MAP, EXPAND, siz, map);
- if k_min >= MAP(cur_ind, 4)
- break
- end
-end
-
-end
--- utils/plot/plot_expand.m
@@ -9,8 +9,7 @@ function plot_expand(expand, map_size, G, planner_name)
if strcmp(planner_name, 'a_star') || ...
strcmp(planner_name, 'gbfs') || ...
strcmp(planner_name, 'dijkstra') || ...
- strcmp(planner_name, 'jps') || ...
- strcmp(planner_name, 'd_star')
+ strcmp(planner_name, 'jps')
plot_square(expand, map_size, G, "#ddd");
end
--- utils/plot/plot_square.m
@@ -6,9 +6,6 @@
% @update: 2023.1.29
%%
- if isempty(pts)
- return
- end
[ptsX, ptsY] = index_to_map(pts(:, 1) + map_size(1) * (pts(:, 2) - 1), map_size, G);
ptsNum = length(ptsX);
for i=1:ptsNum
|
matlab_motion_planning
|
ai-winter
|
MATLAB
|
MATLAB
| 419
| 66
|
Motion planning and Navigation of AGV/AMR:matlab implementation of Dijkstra, A*, Theta*, JPS, D*, LPA*, D* Lite, RRT, RRT*, RRT-Connect, Informed RRT*, ACO, Voronoi, PID, LQR, MPC, APF, RPP, DWA, DDPG, Bezier, B-spline, Dubins, Reeds-Shepp etc.
|
ai-winter_matlab_motion_planning
|
NEW_FEAT
|
probably new functionality added
|
7d9443d3ead0178f0c360ef96ac7e24145139be0
|
2023-03-08 20:17:59
|
Richard McElreath
|
lecture 20 slides
| false
| 0
| 0
| 0
|
--- slides/Lecture_20-horoscopes.pdf
Binary files a/slides/Lecture_20-horoscopes.pdf and /dev/null differ
|
stat_rethinking_2024
|
rmcelreath
|
R
|
R
| 1,474
| 151
| null |
rmcelreath_stat_rethinking_2024
|
DOC_CHANGE
|
Obvious
|
81f13b2c54184706c361beb85b4a59f26cc3848a
|
2024-06-28 17:31:15
|
Toshiaki Takeuchi
|
Updated the method to detec LLMs with MATLAB
| false
| 0
| 0
| 0
|
--- MatGPT.mlapp
Binary files a/MatGPT.mlapp and b/MatGPT.mlapp differ
|
matgpt
|
toshiakit
|
MATLAB
|
MATLAB
| 218
| 33
|
MATLAB app to access ChatGPT API from OpenAI
|
toshiakit_matgpt
|
NEW_FEAT
|
new feature to detect LLMs
|