chore: merge main (#5991)

### Definition of Ready

- [ ] I am happy with the code
- [ ] Short description of the feature/issue is added in the PR description
- [ ] PR is linked to the corresponding user story
- [ ] Acceptance criteria are met
- [ ] All open todos and follow-ups are defined in a new ticket and justified
- [ ] Deviations from the acceptance criteria and design are agreed with the PO and documented.
- [ ] No debug or dead code
- [ ] My code has no repetitions
- [ ] Critical parts are tested automatically
- [ ] Where possible E2E tests are implemented
- [ ] Documentation/examples are up-to-date
- [ ] All non-functional requirements are met
- [ ] Functionality of the acceptance criteria is checked manually on the dev system.
Silvan 2023-06-08 10:14:43 +02:00 committed by GitHub
commit eee4450a05
110 changed files with 3883 additions and 465 deletions


@@ -1,7 +1,7 @@
 name: Bug Report
 description: "Create a bug report to help us improve ZITADEL. Click [here](https://github.com/zitadel/zitadel/blob/main/CONTRIBUTING.md#product-management) to see how we process your issue."
 title: "[Bug]: "
-labels: ["type: bug", "state: triage"]
+labels: ["bug"]
 body:
 - type: markdown
 attributes:
@@ -27,11 +27,16 @@ body:
 - Self-hosted
 validations:
 required: true
-- type: textarea
-id: description
+- type: input
+id: version
 attributes:
-label: Describe the bug
-description: A clear and concise description of what the bug is.
+label: Version
+description: Which version of ZITADEL are you using.
+- type: textarea
+id: impact
+attributes:
+label: Describe the problem caused by this bug
+description: A clear and concise description of the problem you have and what the bug is.
 validations:
 required: true
 - type: textarea
@@ -57,11 +62,6 @@
 attributes:
 label: Expected behavior
 description: A clear and concise description of what you expected to happen.
-- type: input
-id: version
-attributes:
-label: Version
-description: Which version of ZITADEL are you using.
 - type: textarea
 id: os
 attributes:

.github/ISSUE_TEMPLATE/docs.yaml (new file)

@@ -0,0 +1,31 @@
+name: 📄 Documentation
+description: Create an issue for missing or wrong documentation.
+title:
+labels: ["docs"]
+body:
+- type: markdown
+attributes:
+value: |
+Thanks for taking the time to fill out this issue.
+- type: checkboxes
+id: preflight
+attributes:
+label: Preflight Checklist
+options:
+- label:
+I could not find a solution in the existing issues, docs, nor discussions
+required: true
+- label:
+I have joined the [ZITADEL chat](https://zitadel.com/chat)
+- type: textarea
+id: docs
+attributes:
+label: Describe the docs you are missing or that are wrong
+placeholder: As a [type of user], I want [some goal] so that [some reason].
+validations:
+required: true
+- type: textarea
+id: additional
+attributes:
+label: Additional Context
+description: Please add any other infos that could be useful.

.github/ISSUE_TEMPLATE/improvement.yaml (new file)

@@ -0,0 +1,55 @@
+name: 🛠️ Improvement
+description:
+title:
+labels: ["improvement"]
+body:
+- type: markdown
+attributes:
+value: |
+Thanks for taking the time to fill out this improvement request
+- type: checkboxes
+id: preflight
+attributes:
+label: Preflight Checklist
+options:
+- label:
+I could not find a solution in the existing issues, docs, nor discussions
+required: true
+- label:
+I have joined the [ZITADEL chat](https://zitadel.com/chat)
+- type: textarea
+id: problem
+attributes:
+label: Describe your problem
+description: Please describe your problem this improvement is supposed to solve.
+placeholder: Describe the problem you have
+validations:
+required: true
+- type: textarea
+id: solution
+attributes:
+label: Describe your ideal solution
+description: Which solution do you propose?
+placeholder: As a [type of user], I want [some goal] so that [some reason].
+validations:
+required: true
+- type: input
+id: version
+attributes:
+label: Version
+description: Which version of ZITADEL are you using.
+- type: dropdown
+id: environment
+attributes:
+label: Environment
+description: How do you use ZITADEL?
+options:
+- ZITADEL Cloud
+- Self-hosted
+validations:
+required: true
+- type: textarea
+id: additional
+attributes:
+label: Additional Context
+description: Please add any other infos that could be useful.

.github/ISSUE_TEMPLATE/proposal.yaml (new file)

@@ -0,0 +1,55 @@
+name: 💡 Proposal / Feature request
+description:
+title:
+labels: ["enhancement"]
+body:
+- type: markdown
+attributes:
+value: |
+Thanks for taking the time to fill out this proposal / feature request
+- type: checkboxes
+id: preflight
+attributes:
+label: Preflight Checklist
+options:
+- label:
+I could not find a solution in the existing issues, docs, nor discussions
+required: true
+- label:
+I have joined the [ZITADEL chat](https://zitadel.com/chat)
+- type: textarea
+id: problem
+attributes:
+label: Describe your problem
+description: Please describe your problem this proposal / feature is supposed to solve.
+placeholder: Describe the problem you have.
+validations:
+required: true
+- type: textarea
+id: solution
+attributes:
+label: Describe your ideal solution
+description: Which solution do you propose?
+placeholder: As a [type of user], I want [some goal] so that [some reason].
+validations:
+required: true
+- type: input
+id: version
+attributes:
+label: Version
+description: Which version of ZITADEL are you using.
+- type: dropdown
+id: environment
+attributes:
+label: Environment
+description: How do you use ZITADEL?
+options:
+- ZITADEL Cloud
+- Self-hosted
+validations:
+required: true
+- type: textarea
+id: additional
+attributes:
+label: Additional Context
+description: Please add any other infos that could be useful.


@@ -1,14 +0,0 @@
----
-name: User Story
-about: A user story is a brief description of a feature that has to be implemented from the perspective of the end user.
-title: ''
-assignees: ''
----
-As a [type of user], I want [some goal] so that [some reason].
-```[tasklist]
-### Acceptance Criteria
-- [ ] ...
-```


@@ -43,7 +43,7 @@ jobs:
 go run main.go init --config internal/integration/config/zitadel.yaml --config internal/integration/config/${INTEGRATION_DB_FLAVOR}.yaml
 go run main.go setup --masterkeyFromEnv --config internal/integration/config/zitadel.yaml --config internal/integration/config/${INTEGRATION_DB_FLAVOR}.yaml
 - name: Run integration tests
-run: go test -tags=integration -race -parallel 1 -v -coverprofile=profile.cov -coverpkg=./internal/...,./cmd/... ./internal/integration ./internal/api/grpc/...
+run: go test -tags=integration -race -p 1 -v -coverprofile=profile.cov -coverpkg=./internal/...,./cmd/... ./internal/integration ./internal/api/grpc/...
 - name: Publish go coverage
 uses: codecov/codecov-action@v3.1.0
 with:


@@ -208,7 +208,7 @@ export INTEGRATION_DB_FLAVOR="cockroach" ZITADEL_MASTERKEY="MasterkeyNeedsToHave
 docker compose -f internal/integration/config/docker-compose.yaml up --wait ${INTEGRATION_DB_FLAVOR}
 go run main.go init --config internal/integration/config/zitadel.yaml --config internal/integration/config/${INTEGRATION_DB_FLAVOR}.yaml
 go run main.go setup --masterkeyFromEnv --config internal/integration/config/zitadel.yaml --config internal/integration/config/${INTEGRATION_DB_FLAVOR}.yaml
-go test -tags=integration -race -parallel 1 ./internal/integration ./internal/api/grpc/...
+go test -count 1 -tags=integration -race -p 1 ./internal/integration ./internal/api/grpc/...
 docker compose -f internal/integration/config/docker-compose.yaml down
 ```
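The switch from `-parallel 1` to `-p 1` changes what is serialized: `-parallel` limits parallel subtests within one package, while `-p 1` runs the listed test packages one at a time, which matters when they all share the same database. The `-tags=integration` flag only compiles test files guarded by the matching build tag; a minimal, hypothetical example of such a file (file and test names are illustrative, not from the repository) looks like this:

```go
//go:build integration

package integration_test

import "testing"

// TestPing is only compiled and run when the test binary is built with
// -tags=integration, e.g. `go test -count 1 -tags=integration -race -p 1 ./...`.
func TestPing(t *testing.T) {
	// Real integration tests would talk to the running ZITADEL instance here.
	t.Log("integration test executed")
}
```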


@@ -155,7 +155,9 @@ Use [Console](https://zitadel.com/docs/guides/manage/console/overview) or our [A
 ## Security
-See the policy [here](./SECURITY.md)
+See the policy [here](./SECURITY.md).
+[Technical Advisories](https://zitadel.com/docs/support/technical_advisory) are published regarding major issues with the ZITADEL platform that could potentially impact security or stability in production environments.
 ## License


@@ -13,7 +13,7 @@
 <th class="availability" mat-header-cell *matHeaderCellDef>
 <span>{{ 'IDP.AVAILABILITY' | translate }}</span>
 </th>
-<td class="availability" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer availability" mat-cell *matCellDef="let idp">
 <i
 matTooltip="{{ 'IDP.AVAILABLE' | translate }}"
 *ngIf="isEnabled(idp) && idp.state === IDPState.IDP_STATE_ACTIVE"
@@ -29,14 +29,14 @@
 <ng-container matColumnDef="name">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.NAME' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 <span>{{ idp?.name }}</span>
 </td>
 </ng-container>
 <ng-container matColumnDef="type">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.TYPE' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 <div [ngSwitch]="idp.type">
 <div class="idp-table-provider-type" *ngSwitchCase="ProviderType.PROVIDER_TYPE_AZURE_AD">
 <img class="idp-logo" src="./assets/images/idp/ms.svg" alt="azure ad" />
@@ -87,7 +87,7 @@
 <ng-container matColumnDef="state">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.STATE' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 <span
 class="state"
 [ngClass]="{
@@ -101,21 +101,21 @@
 <ng-container matColumnDef="creationDate">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.CREATIONDATE' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 <span>{{ idp.details.creationDate | timestampToDate | localizedDate : 'dd. MMM, HH:mm' }}</span>
 </td>
 </ng-container>
 <ng-container matColumnDef="changeDate">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.CHANGEDATE' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 <span>{{ idp.details.changeDate | timestampToDate | localizedDate : 'dd. MMM, HH:mm' }}</span>
 </td>
 </ng-container>
 <ng-container matColumnDef="owner">
 <th mat-header-cell *matHeaderCellDef>{{ 'IDP.OWNER' | translate }}</th>
-<td class="pointer" [routerLink]="routerLinkForRow(idp)" mat-cell *matCellDef="let idp">
+<td class="pointer" mat-cell *matCellDef="let idp">
 {{ 'IDP.OWNERTYPES.' + idp.owner | translate }}
 </td>
 </ng-container>
@@ -140,7 +140,7 @@
 "
 mat-icon-button
 matTooltip="{{ 'IDP.SETAVAILABLE' | translate }}"
-(click)="addIdp(idp)"
+(click)="addIdp(idp); $event.stopPropagation()"
 >
 <i class="las la-check-circle"></i>
 </button>
@@ -160,7 +160,7 @@
 "
 mat-icon-button
 matTooltip="{{ 'IDP.SETUNAVAILABLE' | translate }}"
-(click)="removeIdp(idp)"
+(click)="removeIdp(idp); $event.stopPropagation()"
 >
 <i class="las la-times-circle"></i>
 </button>
@@ -185,7 +185,7 @@
 mat-icon-button
 color="warn"
 matTooltip="{{ 'ACTIONS.REMOVE' | translate }}"
-(click)="deleteIdp(idp)"
+(click)="deleteIdp(idp); $event.stopPropagation()"
 >
 <i class="las la-trash"></i>
 </button>
@@ -194,7 +194,7 @@
 </ng-container>
 <tr mat-header-row *matHeaderRowDef="displayedColumns"></tr>
-<tr class="highlight" mat-row *matRowDef="let row; columns: displayedColumns"></tr>
+<tr class="highlight" (click)="navigateToIDP(row)" mat-row *matRowDef="let row; columns: displayedColumns"></tr>
 </table>
 </div>


@@ -2,7 +2,7 @@ import { SelectionModel } from '@angular/cdk/collections';
 import { Component, EventEmitter, Input, OnInit, Output, ViewChild } from '@angular/core';
 import { MatLegacyDialog as MatDialog } from '@angular/material/legacy-dialog';
 import { MatLegacyTableDataSource as MatTableDataSource } from '@angular/material/legacy-table';
-import { RouterLink } from '@angular/router';
+import { Router, RouterLink } from '@angular/router';
 import { TranslateService } from '@ngx-translate/core';
 import { Duration } from 'google-protobuf/google/protobuf/duration_pb';
 import { BehaviorSubject, Observable } from 'rxjs';
@@ -31,6 +31,8 @@ import { AdminService } from 'src/app/services/admin.service';
 import { ManagementService } from 'src/app/services/mgmt.service';
 import { ToastService } from 'src/app/services/toast.service';
+import { OverlayWorkflowService } from 'src/app/services/overlay/overlay-workflow.service';
+import { ContextChangedWorkflowOverlays } from 'src/app/services/overlay/workflows';
 import { PageEvent, PaginatorComponent } from '../paginator/paginator.component';
 import { PolicyComponentServiceType } from '../policies/policy-component-types.enum';
 import { WarnDialogComponent } from '../warn-dialog/warn-dialog.component';
@@ -60,7 +62,13 @@ export class IdpTableComponent implements OnInit {
 public IDPStylingType: any = IDPStylingType;
 public loginPolicy!: LoginPolicy.AsObject;
-constructor(public translate: TranslateService, private toast: ToastService, private dialog: MatDialog) {
+constructor(
+private workflowService: OverlayWorkflowService,
+public translate: TranslateService,
+private toast: ToastService,
+private dialog: MatDialog,
+private router: Router,
+) {
 this.selection.changed.subscribe(() => {
 this.changedSelection.emit(this.selection.selected);
 });
@@ -241,6 +249,16 @@
 }
 }
+navigateToIDP(row: Provider.AsObject) {
+this.router.navigate(this.routerLinkForRow(row)).then(() => {
+if (this.serviceType === PolicyComponentServiceType.MGMT && row.owner === IDPOwnerType.IDP_OWNER_TYPE_SYSTEM) {
+setTimeout(() => {
+this.workflowService.startWorkflow(ContextChangedWorkflowOverlays, null);
+}, 1000);
+}
+});
+}
 private async getIdps(): Promise<IDPLoginPolicyLink.AsObject[]> {
 switch (this.serviceType) {
 case PolicyComponentServiceType.MGMT:


@@ -66,3 +66,17 @@ export const OrgContextChangedWorkflowOverlays: CnslOverlay[] = [
 },
 },
 ];
+export const ContextChangedWorkflowOverlays: CnslOverlay[] = [
+{
+id: 'contextswitcher',
+origin: 'orgbutton',
+toHighlight: ['orgbutton'],
+content: {
+i18nText: 'OVERLAYS.SWITCHEDTOINSTANCE.TEXT',
+},
+requirements: {
+permission: ['iam.read'],
+},
+},
+];


@@ -204,6 +204,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "Achtung! Soeben wurde die Organisation gewechselt."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "Soeben wurde die Ansicht auf Instanz gewechselt."
 }
 },
 "FILTER": {


@@ -204,7 +204,10 @@
 "TEXT": "This navigation changes based on your selected organization above or your instance"
 },
 "CONTEXTCHANGED": {
-"TEXT": "Attention! The organization context has changed."
+"TEXT": "The organization context has changed."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "The view just changed to instance!"
 }
 },
 "FILTER": {


@@ -205,6 +205,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "¡Atención! El contexto de la organización ha cambiado."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "¡La vista acaba de cambiar a instancia!"
 }
 },
 "FILTER": {


@@ -204,6 +204,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "Attention ! Le contexte de l'organisation a changé."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "La vue vient de changer en instance !"
 }
 },
 "FILTER": {


@@ -204,6 +204,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "Attenzione! L'organizzazione è appena stata cambiata."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "La visualizzazione è appena stata modificata in istanza!"
 }
 },
 "FILTER": {


@@ -205,6 +205,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "注意! 組織のコンテキストが変更されました。"
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "ビューがインスタンスに変更されました。"
 }
 },
 "FILTER": {


@@ -204,6 +204,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "Uwaga! Kontekst organizacji uległ zmianie."
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "Widok właśnie zmienił się na instancję!"
 }
 },
 "FILTER": {


@@ -204,6 +204,9 @@
 },
 "CONTEXTCHANGED": {
 "TEXT": "注意!组织环境发生了变化。"
+},
+"SWITCHEDTOINSTANCE": {
+"TEXT": "视图刚刚更改为实例!"
 }
 },
 "FILTER": {


@@ -1,5 +1,5 @@
 ---
-title: Endpoints
+title: OpenID Connect Endpoints
 ---
 import Tabs from "@theme/Tabs";
@@ -101,7 +101,7 @@ no additional parameters required
 | state | Opaque value used to maintain state between the request and the callback. Used for Cross-Site Request Forgery (CSRF) mitigation as well, therefore highly **recommended**. |
 | ui_locales | Spaces delimited list of preferred locales for the login UI, e.g. `de-CH de en`. If none is provided or matches the possible locales provided by the login UI, the `accept-language` header of the browser will be taken into account. |
-### Successful Code Response
+### Successful code response
 When your `response_type` was `code` and no error occurred, the following response will be returned:
@@ -110,7 +110,7 @@ When your `response_type` was `code` and no error occurred, the following respon
 | code | Opaque string which will be necessary to request tokens on the token endpoint |
 | state | Unmodified `state` parameter from the request |
-### Successful Implicit Response
+### Successful implicit response
 When your `response_type` was either `it_token` or `id_token token` and no error occurred, the following response will be returned:
@@ -123,7 +123,7 @@ When your `response_type` was either `it_token` or `id_token token` and no error
 | scope | Scopes of the `access_token`. These might differ from the provided `scope` parameter. |
 | state | Unmodified `state` parameter from the request |
-### Error Response
+### Error response
 Regardless of the authorization flow chosen, if an error occurs the following response will be returned to the redirect_uri.
@@ -158,11 +158,11 @@ The token_endpoint will as the name suggests return various tokens (access, id a
 When using [`authorization_code`](#authorization-code-grant-code-exchange) flow call this endpoint after receiving the code from the authorization_endpoint.
 When using [`refresh_token`](#authorization-code-grant-code-exchange) or [`urn:ietf:params:oauth:grant-type:jwt-bearer` (JWT Profile)](#jwt-profile-grant) you will call this endpoint directly.
-### Authorization Code Grant (Code Exchange)
+### Authorization code grant (Code Exchange)
 As mention above, when using `authorization_code` grant, this endpoint will be your second request for authorizing a user with its user agent (browser).
-#### Required request Parameters
+#### Required request parameters
 | Parameter | Description |
 | ------------ | ------------------------------------------------------------------------------------------------------------- |
@@ -229,9 +229,9 @@ Send a client assertion as JWT for us to validate the signature against the regi
 | refresh_token | An opaque token. Only returned if `offline_access` scope was requested |
 | token_type | Type of the `access_token`. Value is always `Bearer` |
-### JWT Profile Grant
+### JWT profile grant
-#### Required request Parameters
+#### Required request parameters
 | Parameter | Description |
 | ---------- | ----------------------------------------------------------------------------------------------------------------------- |
@@ -247,7 +247,7 @@ curl --request POST \
 --data assertion=eyJhbGciOiJSUzI1Ni...
 ```
-#### Successful JWT Profile response {#token-jwt-response}
+#### Successful JWT profile response {#token-jwt-response}
 | Property | Description |
 | ------------ | ------------------------------------------------------------------------------------- |
@@ -257,12 +257,12 @@ curl --request POST \
 | scope | Scopes of the `access_token`. These might differ from the provided `scope` parameter. |
 | token_type | Type of the `access_token`. Value is always `Bearer` |
-### Refresh Token Grant
+### Refresh token grant
 To request a new `access_token` without user interaction, you can use the `refresh_token` grant.
 See [offline_access Scope](scopes#standard-scopes) for how to request a `refresh_token` in the authorization request.
-#### Required request Parameters
+#### Required request parameters
 | Parameter | Description |
 | ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ |
@@ -325,9 +325,9 @@ Send a `client_assertion` as JWT for us to validate the signature against the re
 | refresh_token | An new opaque refresh_token. |
 | token_type | Type of the `access_token`. Value is always `Bearer` |
-### Client Credentials Grant
+### Client credentials grant
-#### Required request Parameters
+#### Required request parameters
 | Parameter | Description |
 | ---------- | ----------------------------------------------------------------------------------------------------------------------- |
@@ -363,7 +363,7 @@ curl --request POST \
 --data scope=openid profile
 ```
-#### Successful Client Credentials response {#token-client-credentials-response}
+#### Successful client credentials response {#token-client-credentials-response}
 | Property | Description |
 | ------------ | ------------------------------------------------------------------------------------- |
@@ -584,8 +584,27 @@ If both parameters are provided, they must be equal.
 {your_domain}/oauth/v2/keys
-> Be aware that these keys can be rotated without any prior notice. We will however make sure that a proper `kid` is set with each key!
+The endpoint returns a JSON Web Key Set (JWKS) containing the public keys that can be used to locally validate JWTs you received from ZITADEL.
+The alternative would be to validate tokens with the [introspection endpoint](#introspection_endpoint).
-## OAuth 2.0 Metadata
+### Key rotation
+Keys are automatically rotated on a regular basis or on demand, meaning keys can change in irregular intervals.
+ZITADEL ensures that a proper `kid` is set with each key.
+:::info Keys rotate without prior notice
+Be aware that these keys can be rotated without any prior notice.
+:::
+### Caching
+You can optimize performance of your clients by caching the response from the keys endpoint.
+We recommend to regularly update the cached response, since the [keys can be rotated without prior notice](#key-rotation).
+You could also combine caching with a risk-based on-demand refresh when a critical operation is executed.
+Without caching you will call this endpoint on each request.
+This might result in being rate limited for a large number of requests that come from the same backend.
+## OAuth 2.0 metadata
 **ZITADEL** does not yet provide a OAuth 2.0 Metadata endpoint but instead provides a [OpenID Connect Discovery Endpoint](https://openid.net/specs/openid-connect-discovery-1_0.html).
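For illustration, a minimal Go sketch of the caching idea described above (the endpoint URL and the one-hour TTL are assumptions; a production client would additionally parse the JWKS with a JOSE library and refresh on demand when it encounters an unknown `kid`):

```go
package jwkscache

import (
	"io"
	"net/http"
	"sync"
	"time"
)

// Cache stores the raw JWKS document fetched from the keys endpoint so that
// not every incoming request triggers a call to {your_domain}/oauth/v2/keys.
type Cache struct {
	URL string        // e.g. "https://my-instance.example.com/oauth/v2/keys" (placeholder)
	TTL time.Duration // how long a fetched response is considered fresh, e.g. time.Hour

	mu        sync.Mutex
	cached    []byte
	fetchedAt time.Time
}

// Get returns the cached JWKS while it is fresh and re-fetches it otherwise.
// If a refresh fails and a stale copy exists, the stale copy is returned.
func (c *Cache) Get() ([]byte, error) {
	c.mu.Lock()
	defer c.mu.Unlock()

	if c.cached != nil && time.Since(c.fetchedAt) < c.TTL {
		return c.cached, nil
	}

	resp, err := http.Get(c.URL)
	if err != nil {
		if c.cached != nil {
			return c.cached, nil // serve stale rather than fail hard
		}
		return nil, err
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return nil, err
	}
	c.cached = body
	c.fetchedAt = time.Now()
	return body, nil
}
```

A risk-based on-demand refresh, as suggested above, could be as simple as resetting `fetchedAt` to the zero value when token validation fails with an unknown key ID, forcing the next `Get` call to re-fetch.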


@@ -1,8 +1,8 @@
 ---
-title: Endpoints
+title: SAML endpoints
 ---
-## SAML 2.0 Metadata
+## SAML 2.0 metadata
 The SAML Metadata is located within the issuer domain. This would give us {your_domain}/saml/v2/metadata.
@@ -11,14 +11,14 @@ This metadata contains all the information defined in the spec.
 **Link to
 spec.** [Metadata for the OASIS Security Assertion Markup Language (SAML) V2.0 Errata Composite](https://www.oasis-open.org/committees/download.php/35391/sstc-saml-metadata-errata-2.0-wd-04-diff.pdf)
-## Certificate Endpoint
+## Certificate endpoint
 {your_domain}/saml/v2/certificate
 The certificate endpoint provides the certificate which is used to sign the responses for download, for easier use with
 different service providers which want the certificate separately instead of inside the metadata.
-## SSO Endpoint
+## SSO endpoint
 {your_domain}/saml/v2/SSO
@@ -40,7 +40,7 @@ spec.** [Bindings for the OASIS Security Assertion Markup Language (SAML) V2.0
 | SigAlg | Algorithm used to sign the request, only if binding is 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' as signature has to be provided es separate parameter. (base64 encoded) |
 | Signature | Signature of the request as parameter with 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' binding. (base64 encoded) |
-### Successful Response
+### Successful response
 Depending on the content of the request the response comes back in the requested binding, but the content is the same.
@@ -51,7 +51,7 @@ Depending on the content of the request the response comes back in the requested
 | SigAlg | Algorithm used to sign the response, only if binding is 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' as signature has to be provided es separate parameter. (base64 encoded) |
 | Signature | Signature of the response as parameter with 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' binding. (base64 encoded) |
-### Error Response
+### Error response
 Regardless of the error, the used http error code will be '200', which represents a successful request. Whereas the
 response will contain a StatusCode include a message which provides more information if an error occurred.


@@ -147,5 +147,5 @@ The storage layer of ZITADEL is responsible for multiple things. For example:
 ZITADEL currently supports CockroachDB as first choice of storage due to its perfect match for ZITADELs needs.
 Postgres is currently in [Beta](/docs/support/software-release-cycles-support#beta) and will be [Enterprise Supported](/docs/support/software-release-cycles-support#partially-supported) afterwards.
 Beta state will be removed as soon as [automated tests](https://github.com/zitadel/zitadel/issues/5741) are implemented.
-Make sure to read our [Production Guide](./self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.
+Make sure to read our [Production Guide](/docs/self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.


@@ -11,7 +11,7 @@ Depending on your projects needs our general recommendation is to run ZITADEL an
 Consult the [CockroachDB documentation](https://www.cockroachlabs.com/docs/) for more details or use the [CockroachCloud Service](https://www.cockroachlabs.com/docs/cockroachcloud/create-an-account.html)
 Postgres is currently in [Beta](/docs/support/software-release-cycles-support#beta) and will be [Enterprise Supported](/docs/support/software-release-cycles-support#partially-supported) afterwards.
 Beta state will be removed as soon as [automated tests](https://github.com/zitadel/zitadel/issues/5741) are implemented.
-Make sure to read our [Production Guide](./self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.
+Make sure to read our [Production Guide](/self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.
 ## Scalability


@@ -4,7 +4,7 @@ title: Audit Trail
 ZITADEL provides you with an built-in audit trail to track all changes and events over an unlimited period of time.
 Most other solutions replace a historic record and track changes in a separate log when information is updated.
-ZITADEL only ever appends data in an [Eventstore](https://zitadel.com/docs/concepts/eventstore), keeping all historic record.
+ZITADEL only ever appends data in an [Eventstore](/docs/concepts/eventstore/overview), keeping all historic record.
 The audit trail itself is identical to the state, since ZITADEL calculates the state from all the past changes.
 ![Example of events that happen for a profile change and a login](/img/concepts/audit-trail/audit-log-events.png)


@@ -94,5 +94,4 @@ You might want to check out the following links to find a good library:
 - [awesome-auth](https://github.com/casbin/awesome-auth)
 - [OpenID General References](https://openid.net/developers/libraries/)
-- [OpenID certified libraries](https://openid.net/developers/certified/)
+- [OpenID certified developer tools](https://openid.net/certified-open-id-developer-tools/)
-- [OpenID uncertified libraries](https://openid.net/developers/uncertified/)


@@ -8,7 +8,7 @@ The individual guides in this section should give you an overview of things to c
 When moving from a previous auth solution to ZITADEL, it is important to note that some decisions and features are unique to ZITADEL.
 Without duplicating too much content here are some important features and patterns to consider in terms of solution architecture.
-You can read more about the basic structure and important concepts of ZITADEL in our [concepts section](https://zitadel.com/docs/concepts/introduction).
+You can read more about the basic structure and important concepts of ZITADEL in our [concepts section](/docs/concepts/).
 ## Multi-Tenancy Architecture


@@ -3,8 +3,6 @@ title: Acceptable Use Policy
 custom_edit_url: null
 ---
-## Introduction
 This policy is an annex to the [Terms of Service](terms-of-service) and clarifies your obligations while using our Services.
 ## Use


@@ -0,0 +1,62 @@
+---
+title: Account Lockout Policy
+custom_edit_url: null
+---
+This policy is an annex to the [Terms of Service](../terms-of-service) that clarifies your obligations and our procedure handling requests where you can't get access to your ZITADEL Cloud services and data. This policy is applicable to situations where we, ZITADEL, need to restore your access for an otherwise available service and not in cases where the services are unavailable.
+## Why do we have this policy?
+Users may not be able to access our services anymore due to loss of credentials or misconfiguration.
+In certain circumstances it might not be possible to recover the credentials through a self-service flow (eg, loss of 2FA credentials) or access the system to undo the configuration that caused the issue.
+These cases might require help from our support, so you can regain access to your data.
+We will require some initial information and conditions to be able to assist you, and will require further information to handle the request.
+We also keep the right to refuse any such request without providing a reason, in case you can't provide the requested information.
+## Scope
+In scope of this policy are requests to recover
+- ZITADEL Cloud account (customer portal)
+- Manager accounts to a specific instance
+- Undo configuration changes resulting in lockout (eg, misconfigured Action)
+Out of scope are requests to recover access
+- Where you have the option to ask another Admin/Manager
+- by end-users who should ask an Admin/Manager instead
+- self-hosted instances
+## Process
+Before you send a request to restore access to your account, please make sure that you can't ask your manager/admin or another manager/admin to recover access.
+### ZITADEL Cloud account
+If you need to recover your ZITADEL Cloud account for the customer portal, please send an email to [support@zitadel.com](mailto:support@zitadel.com?subject=ZITADEL%20Cloud%20account%20lockout):
+- State clearly in the subject line that this is related to an account lockout for a ZITADEL Cloud account
+- The sender's email address must match the verified email address of the account owner
+- State the reason why you're not able to recover the account yourself
+Please allow us time to validate your request.
+Our support will get back to you to request additional information for verification.
+### Manager access to an Instance
+If you need to recover a Manager account to an instance, please make sure you can't recover the account via another user or service user with Manager permissions.
+Please visit the [support page in the customer portal](https://zitadel.cloud/admin/support):
+- State clearly in the subject line that this is related to an account lockout of the affected instance
+- State the reason why you're not able to recover the account yourself
+Please allow us time to validate your request.
+Our support will get back to you to request additional information for verification.
+## Entry into force
+This policy is valid from May 31, 2023.
+Last revised May 31, 2023


@@ -2,7 +2,6 @@
 title: Rate Limit Policy
 custom_edit_url: null
 ---
-## Introduction
 This policy is an annex to the [Terms of Service](terms-of-service) and clarifies your obligations while using our Services, specifically how we will use rate limiting to enforce certain aspects of our [Acceptable Use Policy](acceptable-use-policy).


@@ -3,8 +3,6 @@ title: Vulnerability Disclosure Policy
 custom_edit_url: null
 ---
-## Introduction
 At ZITADEL we are extremely grateful for security aware people who disclose vulnerabilities to us and the open source community.
 All reports will be investigated by our team and we will work with you closely to validate and fix vulnerabilities reported to us.
@@ -91,6 +89,6 @@ In case we have confirmed your report, we may compensate you, given prior writte
 ## Entry into force
-This privacy policy is valid from March 16, 2023.
+This policy is valid from March 16, 2023.
 Last revised March 16, 2023


@@ -10,49 +10,61 @@ Installation and configuration details are described in the [open source ZITADEL
 By default, the chart installs a secure and highly available ZITADEL instance.
 For running an easily testable, insecure, non-HA ZITADEL instance, run the following commands.
-## Helm
-### Add the helm repositories for CockroachDB and ZITADEL
+## Add the Helm Repositories for CockroachDB and ZITADEL
 ```bash
 helm repo add cockroachdb https://charts.cockroachdb.com/
 helm repo add zitadel https://charts.zitadel.com
 ```
-### Install zitadel
+After you have your repositories added,
+you can setup ZITADEL and either
+- initialize an [IAM owner who is a human user](#setup-zitadel-and-a-human-admin) or
+- initialize an [IAM owner who is a service account](#setup-zitadel-and-a-service-account-admin)
-#### Install an insecure cockroachdb and zitadel release that works with localhost
+## Setup ZITADEL and a Human Admin
 ```bash
-# CockroachDB
+# Install CockroachDB
 helm install crdb cockroachdb/cockroachdb \
 --set fullnameOverride=crdb \
 --set single-node=true \
 --set statefulset.replicas=1
-# ZITADEL
+# Install ZITADEL
 helm install my-zitadel zitadel/zitadel \
 --set zitadel.masterkey="MasterkeyNeedsToHave32Characters" \
 --set zitadel.configmapConfig.ExternalSecure=false \
 --set zitadel.configmapConfig.TLS.Enabled=false \
 --set zitadel.secretConfig.Database.cockroach.User.Password="a-zitadel-db-user-password" \
 --set replicaCount=1
+# Make ZITADEL locally accessible
+kubectl port-forward svc/my-zitadel 8080
 ```
 <DefaultUser components={props.components} />
-#### Install an insecure zitadel release that works with localhost with a service account
+## Setup ZITADEL and a Service Account Admin
-!!!Caution!!! With this setup you only get a service account with a key and no admin account where you can login directly into ZITADEL.
+With this setup, you don't create a human user that has the IAM_OWNER role.
+Instead, you create a service account that has the IAM_OWNER role.
+ZITADEL will also create a key for you, with which you can authenticate to the ZITADEL API.
+For example, you can install ZITADEL and seamlessly provision ZITADEL resources after installation using [Terraform](/docs/guides/manage/terraform/basics.md).
+:::caution
+With this setup you only get a key for a service account. Logging in at ZITADEL using the login screen is not possible until you create a user with the ZITADEL API.
+:::
 ```bash
-# CockroachDB
+# Install CockroachDB
 helm install crdb cockroachdb/cockroachdb \
 --set fullnameOverride=crdb \
 --set single-node=true \
 --set statefulset.replicas=1
-# ZITADEL
+# Install ZITADEL
 helm install --namespace zitadel --create-namespace my-zitadel zitadel/zitadel \
 --set zitadel.masterkey="MasterkeyNeedsToHave32Characters" \
 --set zitadel.configmapConfig.ExternalSecure=false \
@@ -63,20 +75,15 @@ helm install --namespace zitadel --create-namespace my-zitadel zitadel/zitadel \
 --set zitadel.configmapConfig.FirstInstance.Org.Machine.Machine.Username="zitadel-admin-sa" \
 --set zitadel.configmapConfig.FirstInstance.Org.Machine.Machine.Name="Admin" \
 --set zitadel.configmapConfig.FirstInstance.Org.Machine.MachineKey.Type=1
+# Make ZITADEL locally accessible
+kubectl port-forward svc/my-zitadel 8080
 ```
-When helm is done, you get a command to retrieve your machine key, which is saved as a kubernetes secret, for example:
+When Helm is done, you can print your service account key from a Kubernetes secret:
 ```bash
 kubectl -n zitadel get secret zitadel-admin-sa -o jsonpath='{ .data.zitadel-admin-sa\.json }' | base64 -D
 ```
-This key can be used to provision resources with for example [Terraform](/docs/guides/manage/terraform/basics.md).
-### Forward the ZITADEL service port to your local machine
-```bash
-kubectl port-forward svc/my-zitadel 8080:8080
-```
 <Next components={props.components} />
 <Disclaimer components={props.components} />


@@ -14,7 +14,7 @@ Choose your platform and run ZITADEL with the most minimal configuration possibl
 ## Prerequisites
 - For test environments, ZITADEL does not need many resources, 1 CPU and 512MB memory are more than enough. (With more CPU, the password hashing might be faster)
-- A CockroachDB or Postgresql as only needed storage. Make sure to read our [Production Guide](./self-hosting/manage/production#prefer-cockroachdb) before you decide to use Postgresql.
+- A CockroachDB or Postgresql as only needed storage. Make sure to read our [Production Guide](/docs/self-hosting/manage/production#prefer-cockroachdb) before you decide to use Postgresql.
 )
 ## Releases


@@ -10,7 +10,7 @@ Be aware that PostgreSQL is only [Enterprise Supported](/docs/support/software-r
 :::
 If you want to use a PostgreSQL database instead of CockroachDB you can [overwrite the default configuration](../configure/configure.mdx).
-Make sure to read our [Production Guide](./self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.
+Make sure to read our [Production Guide](/docs/self-hosting/manage/production#prefer-cockroachdb) before you decide to use it.
 Currently versions >= 14 are supported.


@@ -9,13 +9,15 @@ To apply best practices to your production setup we created a step by step check
 - [ ] Make use of configuration management tools such as Terraform to provision all of the below
 - [ ] Use a secrets manager to store your confidential information
 - [ ] Reduce the manual interaction with your platform to an absolute minimum
 #### HA Setup
 - [ ] High Availability for ZITADEL containers
 - [ ] Use a container orchestrator such as Kubernetes
 - [ ] Use serverless platform such as Knative or a hyperscaler equivalent (e.g. CloudRun from Google)
 - [ ] Split `zitadel init` and `zitadel setup` for fast start-up times when [scaling](/docs/self-hosting/manage/updating_scaling) ZITADEL
 - [ ] High Availability for database
 - [ ] Follow the [Production Checklist](https://www.cockroachlabs.com/docs/stable/recommended-production-settings.html) for CockroachDB if you selfhost the database or use [CockroachDB cloud](https://www.cockroachlabs.com/docs/cockroachcloud/create-an-account.html)
 - [ ] Configure backups on a regular basis for the database
 - [ ] Test the restore scenarios before going live
@@ -26,12 +28,14 @@ To apply best practices to your production setup we created a step by step check
 - [ ] Web Application Firewall
 #### Networking
 - [ ] Use a Layer 7 Web Application Firewall to secure ZITADEL that supports **[HTTP/2](/docs/self-hosting/manage/http2)**
 - [ ] Limit the access by IP addresses if needed
-- [ ] Secure the access by rate limits for specific endpoints (e.g. API vs frontend) to secure availability on high load. See the [ZITADEL Cloud rate limits](https://zitadel.com/docs/apis/ratelimits) for reference.
+- [ ] Secure the access by rate limits for specific endpoints (e.g. API vs frontend) to secure availability on high load. See the [ZITADEL Cloud rate limits](/docs/legal/rate-limit-policy) for reference.
-- [ ] Check that your firewall also filters IPv6 traffic```
+- [ ] Check that your firewall also filters IPv6 traffic
 ### ZITADEL configuration
 - [ ] Configure a valid [SMTP Server](/docs/guides/manage/console/instance-settings#smtp) and test the email delivery
 - [ ] Add [Custom Branding](/docs/guides/manage/customize/branding) if required
 - [ ] Configure a valid [SMS Service](/docs/guides/manage/console/instance-settings#sms) such as Twilio if needed
@@ -40,12 +44,14 @@ To apply best practices to your production setup we created a step by step check
 - [ ] Declare and apply zitadel configuration using the zitadel terraform [provider](https://github.com/zitadel/terraform-provider-zitadel)
 ### Security
 - [ ] Use a FQDN and a trusted valid certificate for external [TLS](/docs/self-hosting/manage/tls_modes#http2) connections
 - [ ] Create service accounts for applications that interact with ZITADEL's APIs
 - [ ] Make use of a CDN service to decrease the load for static assets served by ZITADEL
 - [ ] Make use of a [security scanner](https://owasp.org/www-community/Vulnerability_Scanning_Tools) to test your application and deployment environment
 ### Monitoring
 Use an appropriate monitoring solution to have an overview about your ZITADEL instance. In particular you may want to watch out for things like:
 - [ ] CPU and memory of ZITADEL and the database

View File

@ -28,6 +28,13 @@ We understand that these advisories may include breaking changes, and we aim to
</tr> </tr>
</table> </table>
## Subscribe to our Mailing List
If you want to stay up to date on our technical advisories, we recommend subscribing to the mailing list.
Go to <a href="https://zitadel.com/technical-advisory">the subscription form</a> and add your email address.
As a ZITADEL Cloud customer, you can also log in to the <a href="https://zitadel.cloud">ZITADEL Customer Portal</a> and enable the Technical Advisory <a href="https://zitadel.cloud/admin/notifications">Notifications</a> in your settings.
## Categories ## Categories
### Breaking Behaviour Change ### Breaking Behaviour Change

View File

@ -37,6 +37,42 @@ If you're self hosting with a custom domain, you need to instruct ZITADEL to use
You can find further instructions in our guide about [custom domains](https://zitadel.com/docs/self-hosting/manage/custom-domain). You can find further instructions in our guide about [custom domains](https://zitadel.com/docs/self-hosting/manage/custom-domain).
We also provide a guide on how to [configure](https://zitadel.com/docs/self-hosting/manage/configure) ZITADEL with variables from files or environment variables. We also provide a guide on how to [configure](https://zitadel.com/docs/self-hosting/manage/configure) ZITADEL with variables from files or environment variables.
## Invalid audience
`invalid audience (APP-Zxfako)`
This error message refers to the audience claim (`aud`) of your token.
This claim identifies the audience, i.e. the resource server, that this token is intended for.
If a resource server does not identify itself with a value in the "aud" claim when this claim is present, then the token must be rejected (see [RFC7519](https://www.rfc-editor.org/rfc/rfc7519#section-4.1.3) for more details).
You might encounter this error message from ZITADEL, typically when you authenticated with a client in one project and are trying to access an application in another project.
You need to add a specific [reserved scope](https://zitadel.com/docs/apis/openidoauth/scopes#reserved-scopes) to include the projectID in the audience of the access token.
The following two scenarios should help you troubleshoot this issue:
### Frontend to Backend
You have one project for your frontend application and one project for your backend application.
End-users authenticate to an application in your frontend project.
The frontend then sends requests to the backend; the backend validates the token with ZITADEL's introspection endpoint and returns a payload to the frontend.
The backend returns the error `invalid audience (APP-Zxfako)`.
You must add the scope `urn:zitadel:iam:org:project:id:{projectId}:aud` to the auth request that is sent from the frontend, as sketched below.
Replace `{projectId}` with the project ID of your backend project.
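For illustration, a minimal sketch of such an auth request from a single-page frontend that redirects to ZITADEL's authorization endpoint; the domain, client ID, redirect URI, and project ID below are placeholders for your own setup:
```js
// Sketch only: all identifiers below are placeholders.
const authority = "https://my-zitadel.example.com"; // your ZITADEL instance
const backendProjectId = "123456789012345678"; // project ID of the backend project

const params = new URLSearchParams({
  client_id: "frontend-client-id",
  redirect_uri: "https://frontend.example.com/callback",
  response_type: "code",
  // The reserved scope adds the backend project to the access token's audience.
  scope: `openid profile urn:zitadel:iam:org:project:id:${backendProjectId}:aud`,
});

window.location.href = `${authority}/oauth/v2/authorize?${params.toString()}`;
```
With this scope in place, the token introspected by the backend carries the backend project in its `aud` claim, and the introspection no longer rejects it.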
### Accessing ZITADEL's APIs
You have a project for a frontend application.
The application should also access the API of your ZITADEL instance, for example to pull a list of all users and display them on a user page.
End-users authenticate to the application in the frontend project, but when the application calls the management API you get the error `invalid audience (APP-Zxfako)`.
You must add the scope `urn:zitadel:iam:org:project:id:zitadel:aud` to the auth request that is sent from the frontend.
When you access your ZITADEL instance's APIs, they act as a resource server.
You can verify in the Console or via the API that your default organization contains a project "ZITADEL", which holds an application for each API and for the Console.
As in the scenario above, the access token must contain an `aud` claim that includes the "ZITADEL" project; see the sketch below.
Instead of `urn:zitadel:iam:org:project:id:zitadel:aud` you could also use `urn:zitadel:iam:org:project:id:{projectId}:aud`, where `{projectId}` is the project ID of the "ZITADEL" project.
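As an illustration only, a decoded access token requested with this scope could then contain an audience along these lines (issuer and all IDs are made-up placeholders):
```js
// Hypothetical decoded access token payload; issuer and IDs are placeholders.
const accessTokenPayload = {
  iss: "https://my-zitadel.example.com",
  sub: "225860274437906671", // the authenticated end-user
  aud: [
    "224803365715868033", // project ID of the frontend project
    "224802544582263937", // project ID of the "ZITADEL" project, added by the reserved scope
  ],
  exp: 1686230400,
};
```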
## WebFinger requirement for Tailscale ## WebFinger requirement for Tailscale
The WebFinger requirement and setup is a step a user has to take outside of their IdP set-up. WebFinger is a protocol which supports the ability for OIDC issuer discovery, and we use it to prove that the user has administrative control over the domain and to retrieve the issuer. This is a requirement we have in place for all users, regardless of their IdP, who use custom OIDC with Tailscale. The WebFinger requirement and setup is a step a user has to take outside of their IdP set-up. WebFinger is a protocol which supports the ability for OIDC issuer discovery, and we use it to prove that the user has administrative control over the domain and to retrieve the issuer. This is a requirement we have in place for all users, regardless of their IdP, who use custom OIDC with Tailscale.
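As a rough sketch of what such an issuer lookup can look like (domain and account are placeholders; see RFC 7033 and the OIDC Discovery specification for the authoritative details):
```js
// Sketch of a WebFinger query for OIDC issuer discovery; all values are placeholders.
const resource = "acct:admin@example.com";
const rel = "http://openid.net/specs/connect/1.0/issuer";

const response = await fetch(
  `https://example.com/.well-known/webfinger` +
    `?resource=${encodeURIComponent(resource)}&rel=${encodeURIComponent(rel)}`
);
const jrd = await response.json();
// The matching link entry carries the issuer, e.g.
// { rel: "http://openid.net/specs/connect/1.0/issuer", href: "https://my-zitadel.example.com" }
const issuer = jrd.links.find((link) => link.rel === rel)?.href;
```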

View File

@ -150,6 +150,23 @@ module.exports = {
"guides/integrate/logout", "guides/integrate/logout",
], ],
}, },
{
type: "category",
label: "Authenticate service users",
link: {
type: "generated-index",
title: "Authenticate Service Users",
slug: "/guides/integrate/serviceusers",
description:
"How to authenticate service users for machine-to-machine (M2M) communication between services. You also need to authenticate service users to access ZITADEL's APIs.",
},
collapsed: true,
items: [
"guides/integrate/private-key-jwt",
"guides/integrate/client-credentials",
"guides/integrate/pat",
],
},
{ {
type: "category", type: "category",
label: "Configure identity providers", label: "Configure identity providers",
@ -179,21 +196,9 @@ module.exports = {
collapsed: true, collapsed: true,
items: [ items: [
{ {
type: "category", type: 'link',
label: "Authenticate service users", label: 'Authenticate service users',
link: { href: '/guides/integrate/serviceusers',
type: "generated-index",
title: "Authenticate Service Users",
slug: "/guides/integrate/serviceusers",
description:
"How to authenticate service users",
},
collapsed: true,
items: [
"guides/integrate/private-key-jwt",
"guides/integrate/client-credentials",
"guides/integrate/pat",
],
}, },
"guides/integrate/access-zitadel-apis", "guides/integrate/access-zitadel-apis",
"guides/integrate/access-zitadel-system-api", "guides/integrate/access-zitadel-system-api",
@ -587,10 +592,17 @@ module.exports = {
type: "category", type: "category",
label: "Policies", label: "Policies",
collapsed: false, collapsed: false,
link: {
type: "generated-index",
title: "Policies",
slug: "/legal/policies",
description: "Policies and guidelines in addition to our terms of services.",
},
items: [ items: [
"legal/privacy-policy", "legal/privacy-policy",
"legal/acceptable-use-policy", "legal/acceptable-use-policy",
"legal/rate-limit-policy", "legal/rate-limit-policy",
"legal/policies/account-lockout-policy",
"legal/vulnerability-disclosure-policy", "legal/vulnerability-disclosure-policy",
], ],
}, },

View File

@ -72,7 +72,7 @@ const features = [
description="" description=""
/> />
<ListElement <ListElement
link="/docs/concepts/introduction" link="/docs/concepts"
type={ICONTYPE.TASKS} type={ICONTYPE.TASKS}
title="Concepts" title="Concepts"
description="" description=""

View File

@ -6,6 +6,7 @@ import (
"github.com/zitadel/zitadel/internal/api/authz" "github.com/zitadel/zitadel/internal/api/authz"
idp_grpc "github.com/zitadel/zitadel/internal/api/grpc/idp" idp_grpc "github.com/zitadel/zitadel/internal/api/grpc/idp"
object_pb "github.com/zitadel/zitadel/internal/api/grpc/object" object_pb "github.com/zitadel/zitadel/internal/api/grpc/object"
"github.com/zitadel/zitadel/internal/domain"
"github.com/zitadel/zitadel/internal/query" "github.com/zitadel/zitadel/internal/query"
admin_pb "github.com/zitadel/zitadel/pkg/grpc/admin" admin_pb "github.com/zitadel/zitadel/pkg/grpc/admin"
) )
@ -220,6 +221,22 @@ func (s *Server) UpdateGenericOIDCProvider(ctx context.Context, req *admin_pb.Up
}, nil }, nil
} }
func (s *Server) MigrateGenericOIDCProvider(ctx context.Context, req *admin_pb.MigrateGenericOIDCProviderRequest) (*admin_pb.MigrateGenericOIDCProviderResponse, error) {
var details *domain.ObjectDetails
var err error
if req.GetAzure() != nil {
details, err = s.command.MigrateInstanceGenericOIDCToAzureADProvider(ctx, req.GetId(), addAzureADProviderToCommand(req.GetAzure()))
} else if req.GetGoogle() != nil {
details, err = s.command.MigrateInstanceGenericOIDCToGoogleProvider(ctx, req.GetId(), addGoogleProviderToCommand(req.GetGoogle()))
}
if err != nil {
return nil, err
}
return &admin_pb.MigrateGenericOIDCProviderResponse{
Details: object_pb.DomainToAddDetailsPb(details),
}, nil
}
func (s *Server) AddJWTProvider(ctx context.Context, req *admin_pb.AddJWTProviderRequest) (*admin_pb.AddJWTProviderResponse, error) { func (s *Server) AddJWTProvider(ctx context.Context, req *admin_pb.AddJWTProviderRequest) (*admin_pb.AddJWTProviderResponse, error) {
id, details, err := s.command.AddInstanceJWTProvider(ctx, addJWTProviderToCommand(req)) id, details, err := s.command.AddInstanceJWTProvider(ctx, addJWTProviderToCommand(req))
if err != nil { if err != nil {

View File

@ -30,7 +30,6 @@ func TestMain(m *testing.M) {
} }
func TestServer_Healthz(t *testing.T) { func TestServer_Healthz(t *testing.T) {
client := admin.NewAdminServiceClient(Tester.GRPCClientConn) _, err := Tester.Client.Admin.Healthz(context.TODO(), &admin.HealthzRequest{})
_, err := client.Healthz(context.TODO(), &admin.HealthzRequest{})
require.NoError(t, err) require.NoError(t, err)
} }

View File

@ -30,7 +30,9 @@ func AllFieldsSet(t testing.TB, msg protoreflect.Message, ignoreTypes ...protore
} }
if fd.Kind() == protoreflect.MessageKind { if fd.Kind() == protoreflect.MessageKind {
AllFieldsSet(t, msg.Get(fd).Message(), ignoreTypes...) if m, ok := msg.Get(fd).Interface().(protoreflect.Message); ok {
AllFieldsSet(t, m, ignoreTypes...)
}
} }
} }
} }

View File

@ -6,6 +6,7 @@ import (
"github.com/zitadel/zitadel/internal/api/authz" "github.com/zitadel/zitadel/internal/api/authz"
idp_grpc "github.com/zitadel/zitadel/internal/api/grpc/idp" idp_grpc "github.com/zitadel/zitadel/internal/api/grpc/idp"
object_pb "github.com/zitadel/zitadel/internal/api/grpc/object" object_pb "github.com/zitadel/zitadel/internal/api/grpc/object"
"github.com/zitadel/zitadel/internal/domain"
"github.com/zitadel/zitadel/internal/query" "github.com/zitadel/zitadel/internal/query"
mgmt_pb "github.com/zitadel/zitadel/pkg/grpc/management" mgmt_pb "github.com/zitadel/zitadel/pkg/grpc/management"
) )
@ -212,6 +213,22 @@ func (s *Server) UpdateGenericOIDCProvider(ctx context.Context, req *mgmt_pb.Upd
}, nil }, nil
} }
func (s *Server) MigrateGenericOIDCProvider(ctx context.Context, req *mgmt_pb.MigrateGenericOIDCProviderRequest) (*mgmt_pb.MigrateGenericOIDCProviderResponse, error) {
var details *domain.ObjectDetails
var err error
if req.GetAzure() != nil {
details, err = s.command.MigrateOrgGenericOIDCToAzureADProvider(ctx, authz.GetCtxData(ctx).OrgID, req.GetId(), addAzureADProviderToCommand(req.GetAzure()))
} else if req.GetGoogle() != nil {
details, err = s.command.MigrateOrgGenericOIDCToGoogleProvider(ctx, authz.GetCtxData(ctx).OrgID, req.GetId(), addGoogleProviderToCommand(req.GetGoogle()))
}
if err != nil {
return nil, err
}
return &mgmt_pb.MigrateGenericOIDCProviderResponse{
Details: object_pb.DomainToAddDetailsPb(details),
}, nil
}
func (s *Server) AddJWTProvider(ctx context.Context, req *mgmt_pb.AddJWTProviderRequest) (*mgmt_pb.AddJWTProviderResponse, error) { func (s *Server) AddJWTProvider(ctx context.Context, req *mgmt_pb.AddJWTProviderRequest) (*mgmt_pb.AddJWTProviderResponse, error) {
id, details, err := s.command.AddOrgJWTProvider(ctx, authz.GetCtxData(ctx).OrgID, addJWTProviderToCommand(req)) id, details, err := s.command.AddOrgJWTProvider(ctx, authz.GetCtxData(ctx).OrgID, addJWTProviderToCommand(req))
if err != nil { if err != nil {

View File

@ -26,12 +26,20 @@ import (
mgmt_pb "github.com/zitadel/zitadel/pkg/grpc/management" mgmt_pb "github.com/zitadel/zitadel/pkg/grpc/management"
) )
func (s *Server) GetUserByID(ctx context.Context, req *mgmt_pb.GetUserByIDRequest) (*mgmt_pb.GetUserByIDResponse, error) { func (s *Server) getUserByID(ctx context.Context, id string) (*query.User, error) {
owner, err := query.NewUserResourceOwnerSearchQuery(authz.GetCtxData(ctx).OrgID, query.TextEquals) owner, err := query.NewUserResourceOwnerSearchQuery(authz.GetCtxData(ctx).OrgID, query.TextEquals)
if err != nil { if err != nil {
return nil, err return nil, err
} }
user, err := s.query.GetUserByID(ctx, true, req.Id, false, owner) user, err := s.query.GetUserByID(ctx, true, id, false, owner)
if err != nil {
return nil, err
}
return user, nil
}
func (s *Server) GetUserByID(ctx context.Context, req *mgmt_pb.GetUserByIDRequest) (*mgmt_pb.GetUserByIDResponse, error) {
user, err := s.getUserByID(ctx, req.GetId())
if err != nil { if err != nil {
return nil, err return nil, err
} }
@ -785,13 +793,18 @@ func (s *Server) GenerateMachineSecret(ctx context.Context, req *mgmt_pb.Generat
if err != nil { if err != nil {
return nil, err return nil, err
} }
user, err := s.getUserByID(ctx, req.GetUserId())
if err != nil {
return nil, err
}
set := new(command.GenerateMachineSecret) set := new(command.GenerateMachineSecret)
details, err := s.command.GenerateMachineSecret(ctx, req.UserId, authz.GetCtxData(ctx).OrgID, secretGenerator, set) details, err := s.command.GenerateMachineSecret(ctx, req.UserId, authz.GetCtxData(ctx).OrgID, secretGenerator, set)
if err != nil { if err != nil {
return nil, err return nil, err
} }
return &mgmt_pb.GenerateMachineSecretResponse{ return &mgmt_pb.GenerateMachineSecretResponse{
ClientId: set.ClientID, ClientId: user.PreferredLoginName,
ClientSecret: set.ClientSecret, ClientSecret: set.ClientSecret,
Details: obj_grpc.DomainToAddDetailsPb(details), Details: obj_grpc.DomainToAddDetailsPb(details),
}, nil }, nil

View File

@ -3,11 +3,13 @@ package session
import ( import (
"context" "context"
"google.golang.org/protobuf/types/known/structpb"
"google.golang.org/protobuf/types/known/timestamppb" "google.golang.org/protobuf/types/known/timestamppb"
"github.com/zitadel/zitadel/internal/api/authz" "github.com/zitadel/zitadel/internal/api/authz"
"github.com/zitadel/zitadel/internal/api/grpc/object/v2" "github.com/zitadel/zitadel/internal/api/grpc/object/v2"
"github.com/zitadel/zitadel/internal/command" "github.com/zitadel/zitadel/internal/command"
"github.com/zitadel/zitadel/internal/domain"
caos_errs "github.com/zitadel/zitadel/internal/errors" caos_errs "github.com/zitadel/zitadel/internal/errors"
"github.com/zitadel/zitadel/internal/query" "github.com/zitadel/zitadel/internal/query"
session "github.com/zitadel/zitadel/pkg/grpc/session/v2alpha" session "github.com/zitadel/zitadel/pkg/grpc/session/v2alpha"
@ -43,7 +45,9 @@ func (s *Server) CreateSession(ctx context.Context, req *session.CreateSessionRe
if err != nil { if err != nil {
return nil, err return nil, err
} }
set, err := s.command.CreateSession(ctx, checks, metadata) challengeResponse, cmds := s.challengesToCommand(req.GetChallenges(), checks)
set, err := s.command.CreateSession(ctx, cmds, metadata)
if err != nil { if err != nil {
return nil, err return nil, err
} }
@ -51,6 +55,7 @@ func (s *Server) CreateSession(ctx context.Context, req *session.CreateSessionRe
Details: object.DomainToDetailsPb(set.ObjectDetails), Details: object.DomainToDetailsPb(set.ObjectDetails),
SessionId: set.ID, SessionId: set.ID,
SessionToken: set.NewToken, SessionToken: set.NewToken,
Challenges: challengeResponse,
}, nil }, nil
} }
@ -59,7 +64,9 @@ func (s *Server) SetSession(ctx context.Context, req *session.SetSessionRequest)
if err != nil { if err != nil {
return nil, err return nil, err
} }
set, err := s.command.UpdateSession(ctx, req.GetSessionId(), req.GetSessionToken(), checks, req.GetMetadata()) challengeResponse, cmds := s.challengesToCommand(req.GetChallenges(), checks)
set, err := s.command.UpdateSession(ctx, req.GetSessionId(), req.GetSessionToken(), cmds, req.GetMetadata())
if err != nil { if err != nil {
return nil, err return nil, err
} }
@ -70,6 +77,7 @@ func (s *Server) SetSession(ctx context.Context, req *session.SetSessionRequest)
return &session.SetSessionResponse{ return &session.SetSessionResponse{
Details: object.DomainToDetailsPb(set.ObjectDetails), Details: object.DomainToDetailsPb(set.ObjectDetails),
SessionToken: set.NewToken, SessionToken: set.NewToken,
Challenges: challengeResponse,
}, nil }, nil
} }
@ -104,13 +112,13 @@ func sessionToPb(s *query.Session) *session.Session {
func factorsToPb(s *query.Session) *session.Factors { func factorsToPb(s *query.Session) *session.Factors {
user := userFactorToPb(s.UserFactor) user := userFactorToPb(s.UserFactor)
pw := passwordFactorToPb(s.PasswordFactor) if user == nil {
if user == nil && pw == nil {
return nil return nil
} }
return &session.Factors{ return &session.Factors{
User: user, User: user,
Password: pw, Password: passwordFactorToPb(s.PasswordFactor),
Passkey: passkeyFactorToPb(s.PasskeyFactor),
} }
} }
@ -123,6 +131,15 @@ func passwordFactorToPb(factor query.SessionPasswordFactor) *session.PasswordFac
} }
} }
func passkeyFactorToPb(factor query.SessionPasskeyFactor) *session.PasskeyFactor {
if factor.PasskeyCheckedAt.IsZero() {
return nil
}
return &session.PasskeyFactor{
VerifiedAt: timestamppb.New(factor.PasskeyCheckedAt),
}
}
func userFactorToPb(factor query.SessionUserFactor) *session.UserFactor { func userFactorToPb(factor query.SessionUserFactor) *session.UserFactor {
if factor.UserID == "" || factor.UserCheckedAt.IsZero() { if factor.UserID == "" || factor.UserCheckedAt.IsZero() {
return nil return nil
@ -180,7 +197,7 @@ func idsQueryToQuery(q *session.IDsQuery) (query.SearchQuery, error) {
return query.NewSessionIDsSearchQuery(q.Ids) return query.NewSessionIDsSearchQuery(q.Ids)
} }
func (s *Server) createSessionRequestToCommand(ctx context.Context, req *session.CreateSessionRequest) ([]command.SessionCheck, map[string][]byte, error) { func (s *Server) createSessionRequestToCommand(ctx context.Context, req *session.CreateSessionRequest) ([]command.SessionCommand, map[string][]byte, error) {
checks, err := s.checksToCommand(ctx, req.Checks) checks, err := s.checksToCommand(ctx, req.Checks)
if err != nil { if err != nil {
return nil, nil, err return nil, nil, err
@ -188,7 +205,7 @@ func (s *Server) createSessionRequestToCommand(ctx context.Context, req *session
return checks, req.GetMetadata(), nil return checks, req.GetMetadata(), nil
} }
func (s *Server) setSessionRequestToCommand(ctx context.Context, req *session.SetSessionRequest) ([]command.SessionCheck, error) { func (s *Server) setSessionRequestToCommand(ctx context.Context, req *session.SetSessionRequest) ([]command.SessionCommand, error) {
checks, err := s.checksToCommand(ctx, req.Checks) checks, err := s.checksToCommand(ctx, req.Checks)
if err != nil { if err != nil {
return nil, err return nil, err
@ -196,12 +213,12 @@ func (s *Server) setSessionRequestToCommand(ctx context.Context, req *session.Se
return checks, nil return checks, nil
} }
func (s *Server) checksToCommand(ctx context.Context, checks *session.Checks) ([]command.SessionCheck, error) { func (s *Server) checksToCommand(ctx context.Context, checks *session.Checks) ([]command.SessionCommand, error) {
checkUser, err := userCheck(checks.GetUser()) checkUser, err := userCheck(checks.GetUser())
if err != nil { if err != nil {
return nil, err return nil, err
} }
sessionChecks := make([]command.SessionCheck, 0, 2) sessionChecks := make([]command.SessionCommand, 0, 3)
if checkUser != nil { if checkUser != nil {
user, err := checkUser.search(ctx, s.query) user, err := checkUser.search(ctx, s.query)
if err != nil { if err != nil {
@ -212,9 +229,38 @@ func (s *Server) checksToCommand(ctx context.Context, checks *session.Checks) ([
if password := checks.GetPassword(); password != nil { if password := checks.GetPassword(); password != nil {
sessionChecks = append(sessionChecks, command.CheckPassword(password.GetPassword())) sessionChecks = append(sessionChecks, command.CheckPassword(password.GetPassword()))
} }
if passkey := checks.GetPasskey(); passkey != nil {
sessionChecks = append(sessionChecks, s.command.CheckPasskey(passkey.GetCredentialAssertionData()))
}
return sessionChecks, nil return sessionChecks, nil
} }
func (s *Server) challengesToCommand(challenges []session.ChallengeKind, cmds []command.SessionCommand) (*session.Challenges, []command.SessionCommand) {
if len(challenges) == 0 {
return nil, cmds
}
resp := new(session.Challenges)
for _, c := range challenges {
switch c {
case session.ChallengeKind_CHALLENGE_KIND_UNSPECIFIED:
continue
case session.ChallengeKind_CHALLENGE_KIND_PASSKEY:
passkeyChallenge, cmd := s.createPasskeyChallengeCommand()
resp.Passkey = passkeyChallenge
cmds = append(cmds, cmd)
}
}
return resp, cmds
}
func (s *Server) createPasskeyChallengeCommand() (*session.Challenges_Passkey, command.SessionCommand) {
challenge := &session.Challenges_Passkey{
PublicKeyCredentialRequestOptions: new(structpb.Struct),
}
return challenge, s.command.CreatePasskeyChallenge(domain.UserVerificationRequirementRequired, challenge.PublicKeyCredentialRequestOptions)
}
func userCheck(user *session.CheckUser) (userSearch, error) { func userCheck(user *session.CheckUser) (userSearch, error) {
if user == nil { if user == nil {
return nil, nil return nil, nil

View File

@ -0,0 +1,270 @@
//go:build integration
package session_test
import (
"context"
"os"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/zitadel/zitadel/internal/integration"
object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha"
session "github.com/zitadel/zitadel/pkg/grpc/session/v2alpha"
user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha"
"google.golang.org/grpc/codes"
"google.golang.org/grpc/status"
)
var (
CTX context.Context
Tester *integration.Tester
Client session.SessionServiceClient
User *user.AddHumanUserResponse
)
func TestMain(m *testing.M) {
os.Exit(func() int {
ctx, errCtx, cancel := integration.Contexts(time.Hour)
defer cancel()
Tester = integration.NewTester(ctx)
defer Tester.Done()
Client = Tester.Client.SessionV2
CTX, _ = Tester.WithSystemAuthorization(ctx, integration.OrgOwner), errCtx
User = Tester.CreateHumanUser(CTX)
Tester.RegisterUserPasskey(CTX, User.GetUserId())
return m.Run()
}())
}
func verifyCurrentSession(t testing.TB, id, token string, sequence uint64, window time.Duration, metadata map[string][]byte, factors ...wantFactor) (s *session.Session) {
require.NotEmpty(t, id)
require.NotEmpty(t, token)
retry:
for {
resp, err := Client.GetSession(CTX, &session.GetSessionRequest{
SessionId: id,
SessionToken: &token,
})
if err == nil {
s = resp.GetSession()
break retry
}
if status.Convert(err).Code() == codes.NotFound {
select {
case <-CTX.Done():
t.Fatal(CTX.Err(), err)
case <-time.After(time.Second):
t.Log("retrying GetSession")
continue
}
}
require.NoError(t, err)
}
assert.Equal(t, id, s.GetId())
assert.WithinRange(t, s.GetCreationDate().AsTime(), time.Now().Add(-window), time.Now().Add(window))
assert.WithinRange(t, s.GetChangeDate().AsTime(), time.Now().Add(-window), time.Now().Add(window))
assert.Equal(t, sequence, s.GetSequence())
assert.Equal(t, metadata, s.GetMetadata())
verifyFactors(t, s.GetFactors(), window, factors)
return s
}
type wantFactor int
const (
wantUserFactor wantFactor = iota
wantPasswordFactor
wantPasskeyFactor
)
func verifyFactors(t testing.TB, factors *session.Factors, window time.Duration, want []wantFactor) {
for _, w := range want {
switch w {
case wantUserFactor:
uf := factors.GetUser()
assert.NotNil(t, uf)
assert.WithinRange(t, uf.GetVerifiedAt().AsTime(), time.Now().Add(-window), time.Now().Add(window))
assert.Equal(t, User.GetUserId(), uf.GetId())
case wantPasswordFactor:
pf := factors.GetPassword()
assert.NotNil(t, pf)
assert.WithinRange(t, pf.GetVerifiedAt().AsTime(), time.Now().Add(-window), time.Now().Add(window))
case wantPasskeyFactor:
pf := factors.GetPasskey()
assert.NotNil(t, pf)
assert.WithinRange(t, pf.GetVerifiedAt().AsTime(), time.Now().Add(-window), time.Now().Add(window))
}
}
}
func TestServer_CreateSession(t *testing.T) {
tests := []struct {
name string
req *session.CreateSessionRequest
want *session.CreateSessionResponse
wantErr bool
wantFactors []wantFactor
}{
{
name: "empty session",
req: &session.CreateSessionRequest{
Metadata: map[string][]byte{"foo": []byte("bar")},
},
want: &session.CreateSessionResponse{
Details: &object.Details{
ResourceOwner: Tester.Organisation.ID,
},
},
},
{
name: "with user",
req: &session.CreateSessionRequest{
Checks: &session.Checks{
User: &session.CheckUser{
Search: &session.CheckUser_UserId{
UserId: User.GetUserId(),
},
},
},
Metadata: map[string][]byte{"foo": []byte("bar")},
},
want: &session.CreateSessionResponse{
Details: &object.Details{
ResourceOwner: Tester.Organisation.ID,
},
},
wantFactors: []wantFactor{wantUserFactor},
},
{
name: "password without user error",
req: &session.CreateSessionRequest{
Checks: &session.Checks{
Password: &session.CheckPassword{
Password: "Difficult",
},
},
},
wantErr: true,
},
{
name: "passkey without user error",
req: &session.CreateSessionRequest{
Challenges: []session.ChallengeKind{
session.ChallengeKind_CHALLENGE_KIND_PASSKEY,
},
},
wantErr: true,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := Client.CreateSession(CTX, tt.req)
if tt.wantErr {
require.Error(t, err)
return
}
require.NoError(t, err)
integration.AssertDetails(t, tt.want, got)
verifyCurrentSession(t, got.GetSessionId(), got.GetSessionToken(), got.GetDetails().GetSequence(), time.Minute, tt.req.GetMetadata(), tt.wantFactors...)
})
}
}
func TestServer_CreateSession_passkey(t *testing.T) {
// create new session with user and request the passkey challenge
createResp, err := Client.CreateSession(CTX, &session.CreateSessionRequest{
Checks: &session.Checks{
User: &session.CheckUser{
Search: &session.CheckUser_UserId{
UserId: User.GetUserId(),
},
},
},
Challenges: []session.ChallengeKind{
session.ChallengeKind_CHALLENGE_KIND_PASSKEY,
},
})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), createResp.GetSessionToken(), createResp.GetDetails().GetSequence(), time.Minute, nil)
assertionData, err := Tester.WebAuthN.CreateAssertionResponse(createResp.GetChallenges().GetPasskey().GetPublicKeyCredentialRequestOptions())
require.NoError(t, err)
// update the session with passkey assertion data
updateResp, err := Client.SetSession(CTX, &session.SetSessionRequest{
SessionId: createResp.GetSessionId(),
SessionToken: createResp.GetSessionToken(),
Checks: &session.Checks{
Passkey: &session.CheckPasskey{
CredentialAssertionData: assertionData,
},
},
})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), updateResp.GetSessionToken(), updateResp.GetDetails().GetSequence(), time.Minute, nil, wantUserFactor, wantPasskeyFactor)
}
func TestServer_SetSession_flow(t *testing.T) {
var wantFactors []wantFactor
// create new, empty session
createResp, err := Client.CreateSession(CTX, &session.CreateSessionRequest{})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), createResp.GetSessionToken(), createResp.GetDetails().GetSequence(), time.Minute, nil, wantFactors...)
sessionToken := createResp.GetSessionToken()
t.Run("check user", func(t *testing.T) {
wantFactors = append(wantFactors, wantUserFactor)
resp, err := Client.SetSession(CTX, &session.SetSessionRequest{
SessionId: createResp.GetSessionId(),
SessionToken: sessionToken,
Checks: &session.Checks{
User: &session.CheckUser{
Search: &session.CheckUser_UserId{
UserId: User.GetUserId(),
},
},
},
})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), resp.GetSessionToken(), resp.GetDetails().GetSequence(), time.Minute, nil, wantFactors...)
sessionToken = resp.GetSessionToken()
})
t.Run("check passkey", func(t *testing.T) {
resp, err := Client.SetSession(CTX, &session.SetSessionRequest{
SessionId: createResp.GetSessionId(),
SessionToken: sessionToken,
Challenges: []session.ChallengeKind{
session.ChallengeKind_CHALLENGE_KIND_PASSKEY,
},
})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), resp.GetSessionToken(), resp.GetDetails().GetSequence(), time.Minute, nil, wantFactors...)
sessionToken = resp.GetSessionToken()
wantFactors = append(wantFactors, wantPasskeyFactor)
assertionData, err := Tester.WebAuthN.CreateAssertionResponse(resp.GetChallenges().GetPasskey().GetPublicKeyCredentialRequestOptions())
require.NoError(t, err)
resp, err = Client.SetSession(CTX, &session.SetSessionRequest{
SessionId: createResp.GetSessionId(),
SessionToken: sessionToken,
Checks: &session.Checks{
Passkey: &session.CheckPasskey{
CredentialAssertionData: assertionData,
},
},
})
require.NoError(t, err)
verifyCurrentSession(t, createResp.GetSessionId(), resp.GetSessionToken(), resp.GetDetails().GetSequence(), time.Minute, nil, wantFactors...)
})
}

View File

@ -49,7 +49,7 @@ func Test_sessionsToPb(t *testing.T) {
}, },
Metadata: map[string][]byte{"hello": []byte("world")}, Metadata: map[string][]byte{"hello": []byte("world")},
}, },
{ // no factor { // password factor
ID: "999", ID: "999",
CreationDate: now, CreationDate: now,
ChangeDate: now, ChangeDate: now,
@ -57,11 +57,36 @@ func Test_sessionsToPb(t *testing.T) {
State: domain.SessionStateActive, State: domain.SessionStateActive,
ResourceOwner: "me", ResourceOwner: "me",
Creator: "he", Creator: "he",
UserFactor: query.SessionUserFactor{
UserID: "345",
UserCheckedAt: past,
LoginName: "donald",
DisplayName: "donald duck",
},
PasswordFactor: query.SessionPasswordFactor{ PasswordFactor: query.SessionPasswordFactor{
PasswordCheckedAt: past, PasswordCheckedAt: past,
}, },
Metadata: map[string][]byte{"hello": []byte("world")}, Metadata: map[string][]byte{"hello": []byte("world")},
}, },
{ // passkey factor
ID: "999",
CreationDate: now,
ChangeDate: now,
Sequence: 123,
State: domain.SessionStateActive,
ResourceOwner: "me",
Creator: "he",
UserFactor: query.SessionUserFactor{
UserID: "345",
UserCheckedAt: past,
LoginName: "donald",
DisplayName: "donald duck",
},
PasskeyFactor: query.SessionPasskeyFactor{
PasskeyCheckedAt: past,
},
Metadata: map[string][]byte{"hello": []byte("world")},
},
} }
want := []*session.Session{ want := []*session.Session{
@ -94,12 +119,36 @@ func Test_sessionsToPb(t *testing.T) {
ChangeDate: timestamppb.New(now), ChangeDate: timestamppb.New(now),
Sequence: 123, Sequence: 123,
Factors: &session.Factors{ Factors: &session.Factors{
User: &session.UserFactor{
VerifiedAt: timestamppb.New(past),
Id: "345",
LoginName: "donald",
DisplayName: "donald duck",
},
Password: &session.PasswordFactor{ Password: &session.PasswordFactor{
VerifiedAt: timestamppb.New(past), VerifiedAt: timestamppb.New(past),
}, },
}, },
Metadata: map[string][]byte{"hello": []byte("world")}, Metadata: map[string][]byte{"hello": []byte("world")},
}, },
{ // passkey factor
Id: "999",
CreationDate: timestamppb.New(now),
ChangeDate: timestamppb.New(now),
Sequence: 123,
Factors: &session.Factors{
User: &session.UserFactor{
VerifiedAt: timestamppb.New(past),
Id: "345",
LoginName: "donald",
DisplayName: "donald duck",
},
Passkey: &session.PasskeyFactor{
VerifiedAt: timestamppb.New(past),
},
},
Metadata: map[string][]byte{"hello": []byte("world")},
},
} }
out := sessionsToPb(sessions) out := sessionsToPb(sessions)
@ -107,7 +156,7 @@ func Test_sessionsToPb(t *testing.T) {
for i, got := range out { for i, got := range out {
if !proto.Equal(got, want[i]) { if !proto.Equal(got, want[i]) {
t.Errorf("session %d got:\n%v\nwant:\n%v", i, got, want) t.Errorf("session %d got:\n%v\nwant:\n%v", i, got, want[i])
} }
} }
} }

View File

@ -3,9 +3,7 @@
package user_test package user_test
import ( import (
"fmt"
"testing" "testing"
"time"
"github.com/muhlemmer/gu" "github.com/muhlemmer/gu"
"github.com/stretchr/testify/assert" "github.com/stretchr/testify/assert"
@ -16,31 +14,8 @@ import (
"google.golang.org/protobuf/types/known/timestamppb" "google.golang.org/protobuf/types/known/timestamppb"
) )
func createHumanUser(t *testing.T) *user.AddHumanUserResponse {
resp, err := Client.AddHumanUser(CTX, &user.AddHumanUserRequest{
Organisation: &object.Organisation{
Org: &object.Organisation_OrgId{
OrgId: Tester.Organisation.ID,
},
},
Profile: &user.SetHumanProfile{
FirstName: "Mickey",
LastName: "Mouse",
},
Email: &user.SetHumanEmail{
Email: fmt.Sprintf("%d@mouse.com", time.Now().UnixNano()),
Verification: &user.SetHumanEmail_ReturnCode{
ReturnCode: &user.ReturnEmailVerificationCode{},
},
},
})
require.NoError(t, err)
require.NotEmpty(t, resp.GetUserId())
return resp
}
func TestServer_SetEmail(t *testing.T) { func TestServer_SetEmail(t *testing.T) {
userID := createHumanUser(t).GetUserId() userID := Tester.CreateHumanUser(CTX).GetUserId()
tests := []struct { tests := []struct {
name string name string
@ -158,7 +133,7 @@ func TestServer_SetEmail(t *testing.T) {
} }
func TestServer_VerifyEmail(t *testing.T) { func TestServer_VerifyEmail(t *testing.T) {
userResp := createHumanUser(t) userResp := Tester.CreateHumanUser(CTX)
tests := []struct { tests := []struct {
name string name string
req *user.VerifyEmailRequest req *user.VerifyEmailRequest

View File

@ -3,6 +3,9 @@ package user
import ( import (
"context" "context"
"google.golang.org/protobuf/encoding/protojson"
"google.golang.org/protobuf/types/known/structpb"
"github.com/zitadel/zitadel/internal/api/authz" "github.com/zitadel/zitadel/internal/api/authz"
"github.com/zitadel/zitadel/internal/api/grpc/object/v2" "github.com/zitadel/zitadel/internal/api/grpc/object/v2"
"github.com/zitadel/zitadel/internal/domain" "github.com/zitadel/zitadel/internal/domain"
@ -42,16 +45,24 @@ func passkeyRegistrationDetailsToPb(details *domain.PasskeyRegistrationDetails,
if err != nil { if err != nil {
return nil, err return nil, err
} }
options := new(structpb.Struct)
if err := protojson.Unmarshal(details.PublicKeyCredentialCreationOptions, options); err != nil {
return nil, caos_errs.ThrowInternal(err, "USERv2-Dohr6", "Errors.Internal")
}
return &user.RegisterPasskeyResponse{ return &user.RegisterPasskeyResponse{
Details: object.DomainToDetailsPb(details.ObjectDetails), Details: object.DomainToDetailsPb(details.ObjectDetails),
PasskeyId: details.PasskeyID, PasskeyId: details.PasskeyID,
PublicKeyCredentialCreationOptions: details.PublicKeyCredentialCreationOptions, PublicKeyCredentialCreationOptions: options,
}, nil }, nil
} }
func (s *Server) VerifyPasskeyRegistration(ctx context.Context, req *user.VerifyPasskeyRegistrationRequest) (*user.VerifyPasskeyRegistrationResponse, error) { func (s *Server) VerifyPasskeyRegistration(ctx context.Context, req *user.VerifyPasskeyRegistrationRequest) (*user.VerifyPasskeyRegistrationResponse, error) {
resourceOwner := authz.GetCtxData(ctx).ResourceOwner resourceOwner := authz.GetCtxData(ctx).ResourceOwner
objectDetails, err := s.command.HumanHumanPasswordlessSetup(ctx, req.GetUserId(), resourceOwner, req.GetPasskeyName(), "", req.GetPublicKeyCredential()) pkc, err := protojson.Marshal(req.GetPublicKeyCredential())
if err != nil {
return nil, caos_errs.ThrowInternal(err, "USERv2-Pha2o", "Errors.Internal")
}
objectDetails, err := s.command.HumanHumanPasswordlessSetup(ctx, req.GetUserId(), resourceOwner, req.GetPasskeyName(), "", pkc)
if err != nil { if err != nil {
return nil, err return nil, err
} }

View File

@ -10,19 +10,18 @@ import (
"github.com/stretchr/testify/assert" "github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
"github.com/zitadel/zitadel/internal/integration" "github.com/zitadel/zitadel/internal/integration"
"github.com/zitadel/zitadel/internal/webauthn"
object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha" object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha"
user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha" user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha"
"google.golang.org/protobuf/types/known/structpb"
) )
func TestServer_RegisterPasskey(t *testing.T) { func TestServer_RegisterPasskey(t *testing.T) {
userID := createHumanUser(t).GetUserId() userID := Tester.CreateHumanUser(CTX).GetUserId()
reg, err := Client.CreatePasskeyRegistrationLink(CTX, &user.CreatePasskeyRegistrationLinkRequest{ reg, err := Client.CreatePasskeyRegistrationLink(CTX, &user.CreatePasskeyRegistrationLinkRequest{
UserId: userID, UserId: userID,
Medium: &user.CreatePasskeyRegistrationLinkRequest_ReturnCode{}, Medium: &user.CreatePasskeyRegistrationLinkRequest_ReturnCode{},
}) })
require.NoError(t, err) require.NoError(t, err)
client := webauthn.NewClient(Tester.Config.WebAuthNName, Tester.Config.ExternalDomain, "https://"+Tester.Host())
type args struct { type args struct {
ctx context.Context ctx context.Context
@ -125,7 +124,7 @@ func TestServer_RegisterPasskey(t *testing.T) {
if tt.want != nil { if tt.want != nil {
assert.NotEmpty(t, got.GetPasskeyId()) assert.NotEmpty(t, got.GetPasskeyId())
assert.NotEmpty(t, got.GetPublicKeyCredentialCreationOptions()) assert.NotEmpty(t, got.GetPublicKeyCredentialCreationOptions())
_, err := client.CreateAttestationResponse(got.GetPublicKeyCredentialCreationOptions()) _, err = Tester.WebAuthN.CreateAttestationResponse(got.GetPublicKeyCredentialCreationOptions())
require.NoError(t, err) require.NoError(t, err)
} }
}) })
@ -133,7 +132,7 @@ func TestServer_RegisterPasskey(t *testing.T) {
} }
func TestServer_VerifyPasskeyRegistration(t *testing.T) { func TestServer_VerifyPasskeyRegistration(t *testing.T) {
userID := createHumanUser(t).GetUserId() userID := Tester.CreateHumanUser(CTX).GetUserId()
reg, err := Client.CreatePasskeyRegistrationLink(CTX, &user.CreatePasskeyRegistrationLinkRequest{ reg, err := Client.CreatePasskeyRegistrationLink(CTX, &user.CreatePasskeyRegistrationLinkRequest{
UserId: userID, UserId: userID,
Medium: &user.CreatePasskeyRegistrationLinkRequest_ReturnCode{}, Medium: &user.CreatePasskeyRegistrationLinkRequest_ReturnCode{},
@ -147,8 +146,7 @@ func TestServer_VerifyPasskeyRegistration(t *testing.T) {
require.NotEmpty(t, pkr.GetPasskeyId()) require.NotEmpty(t, pkr.GetPasskeyId())
require.NotEmpty(t, pkr.GetPublicKeyCredentialCreationOptions()) require.NotEmpty(t, pkr.GetPublicKeyCredentialCreationOptions())
client := webauthn.NewClient(Tester.Config.WebAuthNName, Tester.Config.ExternalDomain, "https://"+Tester.Host()) attestationResponse, err := Tester.WebAuthN.CreateAttestationResponse(pkr.GetPublicKeyCredentialCreationOptions())
attestationResponse, err := client.CreateAttestationResponse(pkr.GetPublicKeyCredentialCreationOptions())
require.NoError(t, err) require.NoError(t, err)
type args struct { type args struct {
@ -167,7 +165,7 @@ func TestServer_VerifyPasskeyRegistration(t *testing.T) {
ctx: CTX, ctx: CTX,
req: &user.VerifyPasskeyRegistrationRequest{ req: &user.VerifyPasskeyRegistrationRequest{
PasskeyId: pkr.GetPasskeyId(), PasskeyId: pkr.GetPasskeyId(),
PublicKeyCredential: []byte(attestationResponse), PublicKeyCredential: attestationResponse,
PasskeyName: "nice name", PasskeyName: "nice name",
}, },
}, },
@ -195,10 +193,12 @@ func TestServer_VerifyPasskeyRegistration(t *testing.T) {
args: args{ args: args{
ctx: CTX, ctx: CTX,
req: &user.VerifyPasskeyRegistrationRequest{ req: &user.VerifyPasskeyRegistrationRequest{
UserId: userID, UserId: userID,
PasskeyId: pkr.GetPasskeyId(), PasskeyId: pkr.GetPasskeyId(),
PublicKeyCredential: []byte("attestationResponseattestationResponseattestationResponse"), PublicKeyCredential: &structpb.Struct{
PasskeyName: "nice name", Fields: map[string]*structpb.Value{"foo": {Kind: &structpb.Value_StringValue{StringValue: "bar"}}},
},
PasskeyName: "nice name",
}, },
}, },
wantErr: true, wantErr: true,
@ -219,7 +219,7 @@ func TestServer_VerifyPasskeyRegistration(t *testing.T) {
} }
func TestServer_CreatePasskeyRegistrationLink(t *testing.T) { func TestServer_CreatePasskeyRegistrationLink(t *testing.T) {
userID := createHumanUser(t).GetUserId() userID := Tester.CreateHumanUser(CTX).GetUserId()
type args struct { type args struct {
ctx context.Context ctx context.Context

View File

@ -7,10 +7,13 @@ import (
"github.com/stretchr/testify/assert" "github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
"google.golang.org/protobuf/proto"
"google.golang.org/protobuf/types/known/structpb"
"google.golang.org/protobuf/types/known/timestamppb" "google.golang.org/protobuf/types/known/timestamppb"
"github.com/zitadel/zitadel/internal/api/grpc" "github.com/zitadel/zitadel/internal/api/grpc"
"github.com/zitadel/zitadel/internal/domain" "github.com/zitadel/zitadel/internal/domain"
caos_errs "github.com/zitadel/zitadel/internal/errors"
object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha" object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha"
user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha" user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha"
) )
@ -51,9 +54,10 @@ func Test_passkeyRegistrationDetailsToPb(t *testing.T) {
err error err error
} }
tests := []struct { tests := []struct {
name string name string
args args args args
want *user.RegisterPasskeyResponse want *user.RegisterPasskeyResponse
wantErr error
}{ }{
{ {
name: "an error", name: "an error",
@ -61,6 +65,23 @@ func Test_passkeyRegistrationDetailsToPb(t *testing.T) {
details: nil, details: nil,
err: io.ErrClosedPipe, err: io.ErrClosedPipe,
}, },
wantErr: io.ErrClosedPipe,
},
{
name: "unmarshall error",
args: args{
details: &domain.PasskeyRegistrationDetails{
ObjectDetails: &domain.ObjectDetails{
Sequence: 22,
EventDate: time.Unix(3000, 22),
ResourceOwner: "me",
},
PasskeyID: "123",
PublicKeyCredentialCreationOptions: []byte(`\\`),
},
err: nil,
},
wantErr: caos_errs.ThrowInternal(nil, "USERv2-Dohr6", "Errors.Internal"),
}, },
{ {
name: "ok", name: "ok",
@ -72,7 +93,7 @@ func Test_passkeyRegistrationDetailsToPb(t *testing.T) {
ResourceOwner: "me", ResourceOwner: "me",
}, },
PasskeyID: "123", PasskeyID: "123",
PublicKeyCredentialCreationOptions: []byte{1, 2, 3}, PublicKeyCredentialCreationOptions: []byte(`{"foo": "bar"}`),
}, },
err: nil, err: nil,
}, },
@ -85,16 +106,20 @@ func Test_passkeyRegistrationDetailsToPb(t *testing.T) {
}, },
ResourceOwner: "me", ResourceOwner: "me",
}, },
PasskeyId: "123", PasskeyId: "123",
PublicKeyCredentialCreationOptions: []byte{1, 2, 3}, PublicKeyCredentialCreationOptions: &structpb.Struct{
Fields: map[string]*structpb.Value{"foo": {Kind: &structpb.Value_StringValue{StringValue: "bar"}}},
},
}, },
}, },
} }
for _, tt := range tests { for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) { t.Run(tt.name, func(t *testing.T) {
got, err := passkeyRegistrationDetailsToPb(tt.args.details, tt.args.err) got, err := passkeyRegistrationDetailsToPb(tt.args.details, tt.args.err)
require.ErrorIs(t, err, tt.args.err) require.ErrorIs(t, err, tt.wantErr)
assert.Equal(t, tt.want, got) if !proto.Equal(tt.want, got) {
t.Errorf("Not equal:\nExpected\n%s\nActual:%s", tt.want, got)
}
if tt.want != nil { if tt.want != nil {
grpc.AllFieldsSet(t, got.ProtoReflect()) grpc.AllFieldsSet(t, got.ProtoReflect())
} }

View File

@ -42,7 +42,7 @@ func TestMain(m *testing.M) {
defer Tester.Done() defer Tester.Done()
CTX, ErrCTX = Tester.WithSystemAuthorization(ctx, integration.OrgOwner), errCtx CTX, ErrCTX = Tester.WithSystemAuthorization(ctx, integration.OrgOwner), errCtx
Client = user.NewUserServiceClient(Tester.GRPCClientConn) Client = Tester.Client.UserV2
return m.Run() return m.Run()
}()) }())
} }

View File

@ -286,17 +286,19 @@ func (l *Login) handleExternalUserAuthenticated(
callback func(w http.ResponseWriter, r *http.Request, authReq *domain.AuthRequest), callback func(w http.ResponseWriter, r *http.Request, authReq *domain.AuthRequest),
) { ) {
externalUser := mapIDPUserToExternalUser(user, provider.ID) externalUser := mapIDPUserToExternalUser(user, provider.ID)
externalUser, externalUserChange, err := l.runPostExternalAuthenticationActions(externalUser, tokens(session), authReq, r, user, nil) // check and fill in local linked user
externalErr := l.authRepo.CheckExternalUserLogin(setContext(r.Context(), ""), authReq.ID, authReq.AgentID, externalUser, domain.BrowserInfoFromRequest(r))
if !errors.IsNotFound(externalErr) {
l.renderError(w, r, authReq, externalErr)
return
}
externalUser, externalUserChange, err := l.runPostExternalAuthenticationActions(externalUser, tokens(session), authReq, r, user, externalErr)
if err != nil { if err != nil {
l.renderError(w, r, authReq, err) l.renderError(w, r, authReq, err)
return return
} }
err = l.authRepo.CheckExternalUserLogin(setContext(r.Context(), ""), authReq.ID, authReq.AgentID, externalUser, domain.BrowserInfoFromRequest(r)) // if action is done and no user linked then link or register
if err != nil { if errors.IsNotFound(externalErr) {
if !errors.IsNotFound(err) {
l.renderError(w, r, authReq, err)
return
}
l.externalUserNotExisting(w, r, authReq, provider, externalUser) l.externalUserNotExisting(w, r, authReq, provider, externalUser)
return return
} }

View File

@ -1,68 +0,0 @@
/*
* modified version of:
*
* base64-arraybuffer
* https://github.com/niklasvh/base64-arraybuffer
*
* Copyright (c) 2012 Niklas von Hertzen
* Licensed under the MIT license.
*/
"use strict";
let chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
// Use a lookup table to find the index.
let lookup = new Uint8Array(256);
for (var i = 0; i < chars.length; i++) {
lookup[chars.charCodeAt(i)] = i;
}
function encode(arraybuffer) {
let bytes = new Uint8Array(arraybuffer),
i, len = bytes.length, base64 = "";
for (i = 0; i < len; i += 3) {
base64 += chars[bytes[i] >> 2];
base64 += chars[((bytes[i] & 3) << 4) | (bytes[i + 1] >> 4)];
base64 += chars[((bytes[i + 1] & 15) << 2) | (bytes[i + 2] >> 6)];
base64 += chars[bytes[i + 2] & 63];
}
if ((len % 3) === 2) {
base64 = base64.substring(0, base64.length - 1) + "=";
} else if (len % 3 === 1) {
base64 = base64.substring(0, base64.length - 2) + "==";
}
return base64;
}
function decode(base64) {
let bufferLength = base64.length * 0.75,
len = base64.length, i, p = 0,
encoded1, encoded2, encoded3, encoded4;
if (base64[base64.length - 1] === "=") {
bufferLength--;
if (base64[base64.length - 2] === "=") {
bufferLength--;
}
}
let arraybuffer = new ArrayBuffer(bufferLength),
bytes = new Uint8Array(arraybuffer);
for (i = 0; i < len; i += 4) {
encoded1 = lookup[base64.charCodeAt(i)];
encoded2 = lookup[base64.charCodeAt(i + 1)];
encoded3 = lookup[base64.charCodeAt(i + 2)];
encoded4 = lookup[base64.charCodeAt(i + 3)];
bytes[p++] = (encoded1 << 2) | (encoded2 >> 4);
bytes[p++] = ((encoded2 & 15) << 4) | (encoded3 >> 2);
bytes[p++] = ((encoded3 & 3) << 6) | (encoded4 & 63);
}
return arraybuffer;
}

View File

@ -0,0 +1,63 @@
function coerceToBase64Url(thing, name) {
// Array or ArrayBuffer to Uint8Array
if (Array.isArray(thing)) {
thing = Uint8Array.from(thing);
}
if (thing instanceof ArrayBuffer) {
thing = new Uint8Array(thing);
}
// Uint8Array to base64
if (thing instanceof Uint8Array) {
var str = "";
var len = thing.byteLength;
for (var i = 0; i < len; i++) {
str += String.fromCharCode(thing[i]);
}
thing = window.btoa(str);
}
if (typeof thing !== "string") {
throw new Error("could not coerce '" + name + "' to string");
}
// base64 to base64url
// NOTE: "=" at the end of challenge is optional, strip it off here
thing = thing.replace(/\+/g, "-").replace(/\//g, "_").replace(/=*$/g, "");
return thing;
}
function coerceToArrayBuffer(thing, name) {
if (typeof thing === "string") {
// base64url to base64
thing = thing.replace(/-/g, "+").replace(/_/g, "/");
// base64 to Uint8Array
var str = window.atob(thing);
var bytes = new Uint8Array(str.length);
for (var i = 0; i < str.length; i++) {
bytes[i] = str.charCodeAt(i);
}
thing = bytes;
}
// Array to Uint8Array
if (Array.isArray(thing)) {
thing = new Uint8Array(thing);
}
// Uint8Array to ArrayBuffer
if (thing instanceof Uint8Array) {
thing = thing.buffer;
}
// error if none of the above worked
if (!(thing instanceof ArrayBuffer)) {
throw new TypeError("could not coerce '" + name + "' to ArrayBuffer");
}
return thing;
}

View File

@ -1,31 +1,28 @@
function checkWebauthnSupported(button, func) { function checkWebauthnSupported(button, func) {
let support = document.getElementsByClassName("wa-support"); let support = document.getElementsByClassName("wa-support");
let noSupport = document.getElementsByClassName("wa-no-support"); let noSupport = document.getElementsByClassName("wa-no-support");
if (!window.PublicKeyCredential) { if (!window.PublicKeyCredential) {
for (let item of noSupport) { for (let item of noSupport) {
item.classList.remove('hidden'); item.classList.remove("hidden");
}
for (let item of support) {
item.classList.add('hidden');
}
return;
} }
document.getElementById(button).addEventListener('click', func); for (let item of support) {
item.classList.add("hidden");
}
return;
}
document.getElementById(button).addEventListener("click", func);
} }
function webauthnError(error) { function webauthnError(error) {
let err = document.getElementById('wa-error'); let err = document.getElementById("wa-error");
err.getElementsByClassName('cause')[0].innerText = error.message; err.getElementsByClassName("cause")[0].innerText = error.message;
err.classList.remove('hidden'); err.classList.remove("hidden");
} }
function bufferDecode(value) { function bufferDecode(value, name) {
return decode(value); return coerceToArrayBuffer(value, name);
} }
function bufferEncode(value) { function bufferEncode(value, name) {
return encode(value) return coerceToBase64Url(value, name);
.replace(/\+/g, "-")
.replace(/\//g, "_")
.replace(/=/g, "");
} }

View File

@ -1,41 +1,54 @@
document.addEventListener('DOMContentLoaded', checkWebauthnSupported('btn-login', login)); document.addEventListener(
"DOMContentLoaded",
checkWebauthnSupported("btn-login", login)
);
function login() { function login() {
document.getElementById('wa-error').classList.add('hidden'); document.getElementById("wa-error").classList.add("hidden");
let makeAssertionOptions = JSON.parse(atob(document.getElementsByName('credentialAssertionData')[0].value)); let makeAssertionOptions = JSON.parse(
makeAssertionOptions.publicKey.challenge = bufferDecode(makeAssertionOptions.publicKey.challenge); atob(document.getElementsByName("credentialAssertionData")[0].value)
makeAssertionOptions.publicKey.allowCredentials.forEach(function (listItem) { );
listItem.id = bufferDecode(listItem.id) makeAssertionOptions.publicKey.challenge = bufferDecode(
}); makeAssertionOptions.publicKey.challenge,
navigator.credentials.get({ "publicKey.challenge"
publicKey: makeAssertionOptions.publicKey );
}).then(function (credential) { makeAssertionOptions.publicKey.allowCredentials.forEach(function (listItem) {
verifyAssertion(credential); listItem.id = bufferDecode(listItem.id, "publicKey.allowCredentials.id");
}).catch(function (err) { });
webauthnError(err); navigator.credentials
.get({
publicKey: makeAssertionOptions.publicKey,
})
.then(function (credential) {
verifyAssertion(credential);
})
.catch(function (err) {
webauthnError(err);
}); });
} }
function verifyAssertion(assertedCredential) { function verifyAssertion(assertedCredential) {
let authData = new Uint8Array(assertedCredential.response.authenticatorData); let authData = new Uint8Array(assertedCredential.response.authenticatorData);
let clientDataJSON = new Uint8Array(assertedCredential.response.clientDataJSON); let clientDataJSON = new Uint8Array(
let rawId = new Uint8Array(assertedCredential.rawId); assertedCredential.response.clientDataJSON
let sig = new Uint8Array(assertedCredential.response.signature); );
let userHandle = new Uint8Array(assertedCredential.response.userHandle); let rawId = new Uint8Array(assertedCredential.rawId);
let sig = new Uint8Array(assertedCredential.response.signature);
let userHandle = new Uint8Array(assertedCredential.response.userHandle);
let data = JSON.stringify({ let data = JSON.stringify({
id: assertedCredential.id, id: assertedCredential.id,
rawId: bufferEncode(rawId), rawId: bufferEncode(rawId),
type: assertedCredential.type, type: assertedCredential.type,
response: { response: {
authenticatorData: bufferEncode(authData), authenticatorData: bufferEncode(authData),
clientDataJSON: bufferEncode(clientDataJSON), clientDataJSON: bufferEncode(clientDataJSON),
signature: bufferEncode(sig), signature: bufferEncode(sig),
userHandle: bufferEncode(userHandle), userHandle: bufferEncode(userHandle),
}, },
}) });
document.getElementsByName('credentialData')[0].value = btoa(data); document.getElementsByName("credentialData")[0].value = btoa(data);
document.getElementsByTagName('form')[0].submit(); document.getElementsByTagName("form")[0].submit();
} }

View File

@ -1,42 +1,61 @@
document.addEventListener('DOMContentLoaded', checkWebauthnSupported('btn-register', registerCredential)); document.addEventListener(
"DOMContentLoaded",
checkWebauthnSupported("btn-register", registerCredential)
);
function registerCredential() { function registerCredential() {
document.getElementById('wa-error').classList.add('hidden'); document.getElementById("wa-error").classList.add("hidden");
let opt = JSON.parse(atob(document.getElementsByName('credentialCreationData')[0].value)); let opt = JSON.parse(
opt.publicKey.challenge = bufferDecode(opt.publicKey.challenge); atob(document.getElementsByName("credentialCreationData")[0].value)
opt.publicKey.user.id = bufferDecode(opt.publicKey.user.id); );
if (opt.publicKey.excludeCredentials) { opt.publicKey.challenge = bufferDecode(
for (let i = 0; i < opt.publicKey.excludeCredentials.length; i++) { opt.publicKey.challenge,
if (opt.publicKey.excludeCredentials[i].id !== null) { "publicKey.challenge"
opt.publicKey.excludeCredentials[i].id = bufferDecode(opt.publicKey.excludeCredentials[i].id); );
} opt.publicKey.user.id = bufferDecode(
} opt.publicKey.user.id,
"publicKey.user.id"
);
if (opt.publicKey.excludeCredentials) {
for (let i = 0; i < opt.publicKey.excludeCredentials.length; i++) {
if (opt.publicKey.excludeCredentials[i].id !== null) {
opt.publicKey.excludeCredentials[i].id = bufferDecode(
opt.publicKey.excludeCredentials[i].id,
"publicKey.excludeCredentials"
);
}
} }
navigator.credentials.create({ }
publicKey: opt.publicKey navigator.credentials
}).then(function (credential) { .create({
createCredential(credential); publicKey: opt.publicKey,
}).catch(function (err) { })
webauthnError(err); .then(function (credential) {
createCredential(credential);
})
.catch(function (err) {
webauthnError(err);
}); });
} }
function createCredential(newCredential) { function createCredential(newCredential) {
let attestationObject = new Uint8Array(newCredential.response.attestationObject); let attestationObject = new Uint8Array(
let clientDataJSON = new Uint8Array(newCredential.response.clientDataJSON); newCredential.response.attestationObject
let rawId = new Uint8Array(newCredential.rawId); );
let clientDataJSON = new Uint8Array(newCredential.response.clientDataJSON);
let rawId = new Uint8Array(newCredential.rawId);
let data = JSON.stringify({ let data = JSON.stringify({
id: newCredential.id, id: newCredential.id,
rawId: bufferEncode(rawId), rawId: bufferEncode(rawId),
type: newCredential.type, type: newCredential.type,
response: { response: {
attestationObject: bufferEncode(attestationObject), attestationObject: bufferEncode(attestationObject),
clientDataJSON: bufferEncode(clientDataJSON), clientDataJSON: bufferEncode(clientDataJSON),
}, },
}); });
document.getElementsByName('credentialData')[0].value = btoa(data); document.getElementsByName("credentialData")[0].value = btoa(data);
document.getElementsByTagName('form')[0].submit(); document.getElementsByTagName("form")[0].submit();
} }
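
The register and login scripts above build a JSON payload whose binary WebAuthn fields are encoded with bufferEncode (now served from scripts/utils.js instead of scripts/base64.js) and submit the whole envelope base64-encoded via btoa. Below is a minimal, hypothetical Go sketch of decoding such a payload on the server side; the field names mirror the JSON built in createCredential(), while the encoding choices (standard base64 for the btoa envelope, unpadded base64url for bufferEncode) and the types are assumptions for illustration, not ZITADEL's actual handler.

```go
// Illustrative only: decoding the credentialData form value produced by the
// script above. Encoding assumptions: btoa => standard base64 envelope,
// bufferEncode => unpadded base64url for the binary fields.
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"log"
)

type attestationResponse struct {
	AttestationObject string `json:"attestationObject"`
	ClientDataJSON    string `json:"clientDataJSON"`
}

type credentialData struct {
	ID       string              `json:"id"`
	RawID    string              `json:"rawId"`
	Type     string              `json:"type"`
	Response attestationResponse `json:"response"`
}

func decodeCredentialData(formValue string) (*credentialData, []byte, error) {
	// btoa in the browser emits standard base64.
	raw, err := base64.StdEncoding.DecodeString(formValue)
	if err != nil {
		return nil, nil, err
	}
	var data credentialData
	if err := json.Unmarshal(raw, &data); err != nil {
		return nil, nil, err
	}
	// bufferEncode is assumed to emit unpadded base64url.
	attObj, err := base64.RawURLEncoding.DecodeString(data.Response.AttestationObject)
	if err != nil {
		return nil, nil, err
	}
	return &data, attObj, nil
}

func main() {
	payload := base64.StdEncoding.EncodeToString([]byte(
		`{"id":"abc","rawId":"YWJj","type":"public-key","response":{"attestationObject":"AQID","clientDataJSON":"BAUG"}}`))
	data, attObj, err := decodeCredentialData(payload)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(data.Type, len(attObj)) // public-key 3
}
```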

View File

@ -218,7 +218,7 @@ body.waiting * {
footer { footer {
width: 100%; width: 100%;
box-sizing: border-box; box-sizing: border-box;
background: rgba(0, 0, 0, 0.1254901961); background: #00000020;
min-height: 50px; min-height: 50px;
display: flex; display: flex;
align-items: center; align-items: center;
@ -759,7 +759,7 @@ i {
letter-spacing: 0.05em; letter-spacing: 0.05em;
font-size: 12px; font-size: 12px;
white-space: nowrap; white-space: nowrap;
box-shadow: 0 0 3px rgba(0, 0, 0, 0.1019607843); box-shadow: 0 0 3px #0000001a;
width: fit-content; width: fit-content;
line-height: 1rem; line-height: 1rem;
} }
@ -1211,7 +1211,7 @@ i {
footer { footer {
width: 100%; width: 100%;
box-sizing: border-box; box-sizing: border-box;
background: rgba(0, 0, 0, 0.1254901961); background: #00000020;
min-height: 50px; min-height: 50px;
display: flex; display: flex;
align-items: center; align-items: center;
@ -1752,7 +1752,7 @@ i {
letter-spacing: 0.05em; letter-spacing: 0.05em;
font-size: 12px; font-size: 12px;
white-space: nowrap; white-space: nowrap;
box-shadow: 0 0 3px rgba(0, 0, 0, 0.1019607843); box-shadow: 0 0 3px #0000001a;
width: fit-content; width: fit-content;
line-height: 1rem; line-height: 1rem;
} }
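
The compiled CSS above only changes notation: the rgba() alpha fractions are replaced by 8-digit hex colors, where the trailing byte is the alpha channel. A quick sanity check that the values are equivalent (done in Go here purely for illustration):

```go
// Sanity check of the color change above: the trailing byte of an 8-digit hex
// color is the alpha channel, so #00000020 and #0000001a carry the same alpha
// as the former rgba() fractions.
package main

import "fmt"

func main() {
	fmt.Println(float64(0x20) / 255) // 0.12549019607843137 -> rgba(0, 0, 0, 0.1254901961)
	fmt.Println(float64(0x1a) / 255) // 0.10196078431372549 -> rgba(0, 0, 0, 0.1019607843)
}
```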

File diff suppressed because one or more lines are too long

View File

@ -41,7 +41,7 @@
</div> </div>
</form> </form>
<script src="{{ resourceUrl "scripts/base64.js" }}"></script> <script src="{{ resourceUrl "scripts/utils.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn_register.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn_register.js" }}"></script>

View File

@ -41,7 +41,7 @@
{{ end }} {{ end }}
</form> </form>
<script src="{{ resourceUrl "scripts/base64.js" }}"></script> <script src="{{ resourceUrl "scripts/utils.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn_login.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn_login.js" }}"></script>

View File

@ -37,7 +37,7 @@
</div> </div>
</form> </form>
<script src="{{ resourceUrl "scripts/base64.js" }}"></script> <script src="{{ resourceUrl "scripts/utils.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn_login.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn_login.js" }}"></script>

View File

@ -45,7 +45,7 @@
</div> </div>
</form> </form>
<script src="{{ resourceUrl "scripts/base64.js" }}"></script> <script src="{{ resourceUrl "scripts/utils.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn.js" }}"></script>
<script src="{{ resourceUrl "scripts/webauthn_register.js" }}"></script> <script src="{{ resourceUrl "scripts/webauthn_register.js" }}"></script>

View File

@ -211,6 +211,12 @@ func (wm *OIDCIDPWriteModel) Reduce() error {
wm.reduceAddedEvent(e) wm.reduceAddedEvent(e)
case *idp.OIDCIDPChangedEvent: case *idp.OIDCIDPChangedEvent:
wm.reduceChangedEvent(e) wm.reduceChangedEvent(e)
case *idp.OIDCIDPMigratedAzureADEvent:
wm.State = domain.IDPStateMigrated
case *idp.OIDCIDPMigratedGoogleEvent:
wm.State = domain.IDPStateMigrated
case *idp.RemovedEvent:
wm.State = domain.IDPStateRemoved
case *idpconfig.IDPConfigAddedEvent: case *idpconfig.IDPConfigAddedEvent:
wm.reduceIDPConfigAddedEvent(e) wm.reduceIDPConfigAddedEvent(e)
case *idpconfig.IDPConfigChangedEvent: case *idpconfig.IDPConfigChangedEvent:
@ -397,6 +403,8 @@ func (wm *JWTIDPWriteModel) Reduce() error {
wm.reduceAddedEvent(e) wm.reduceAddedEvent(e)
case *idp.JWTIDPChangedEvent: case *idp.JWTIDPChangedEvent:
wm.reduceChangedEvent(e) wm.reduceChangedEvent(e)
case *idp.RemovedEvent:
wm.State = domain.IDPStateRemoved
case *idpconfig.IDPConfigAddedEvent: case *idpconfig.IDPConfigAddedEvent:
wm.reduceIDPConfigAddedEvent(e) wm.reduceIDPConfigAddedEvent(e)
case *idpconfig.IDPConfigChangedEvent: case *idpconfig.IDPConfigChangedEvent:
@ -558,6 +566,8 @@ func (wm *AzureADIDPWriteModel) Reduce() error {
switch e := event.(type) { switch e := event.(type) {
case *idp.AzureADIDPAddedEvent: case *idp.AzureADIDPAddedEvent:
wm.reduceAddedEvent(e) wm.reduceAddedEvent(e)
case *idp.OIDCIDPMigratedAzureADEvent:
wm.reduceAddedEvent(&e.AzureADIDPAddedEvent)
case *idp.AzureADIDPChangedEvent: case *idp.AzureADIDPChangedEvent:
wm.reduceChangedEvent(e) wm.reduceChangedEvent(e)
case *idp.RemovedEvent: case *idp.RemovedEvent:
@ -1195,6 +1205,8 @@ func (wm *GoogleIDPWriteModel) Reduce() error {
wm.reduceAddedEvent(e) wm.reduceAddedEvent(e)
case *idp.GoogleIDPChangedEvent: case *idp.GoogleIDPChangedEvent:
wm.reduceChangedEvent(e) wm.reduceChangedEvent(e)
case *idp.OIDCIDPMigratedGoogleEvent:
wm.reduceAddedEvent(&e.GoogleIDPAddedEvent)
case *idp.RemovedEvent: case *idp.RemovedEvent:
wm.State = domain.IDPStateRemoved wm.State = domain.IDPStateRemoved
} }

View File

@ -97,6 +97,40 @@ func (c *Commands) UpdateInstanceGenericOIDCProvider(ctx context.Context, id str
return pushedEventsToObjectDetails(pushedEvents), nil return pushedEventsToObjectDetails(pushedEvents), nil
} }
func (c *Commands) MigrateInstanceGenericOIDCToAzureADProvider(ctx context.Context, id string, provider AzureADProvider) (*domain.ObjectDetails, error) {
return c.migrateInstanceGenericOIDC(ctx, id, provider)
}
func (c *Commands) MigrateInstanceGenericOIDCToGoogleProvider(ctx context.Context, id string, provider GoogleProvider) (*domain.ObjectDetails, error) {
return c.migrateInstanceGenericOIDC(ctx, id, provider)
}
func (c *Commands) migrateInstanceGenericOIDC(ctx context.Context, id string, provider interface{}) (*domain.ObjectDetails, error) {
instanceID := authz.GetInstance(ctx).InstanceID()
instanceAgg := instance.NewAggregate(instanceID)
writeModel := NewOIDCInstanceIDPWriteModel(instanceID, id)
var validation preparation.Validation
switch p := provider.(type) {
case AzureADProvider:
validation = c.prepareMigrateInstanceOIDCToAzureADProvider(instanceAgg, writeModel, p)
case GoogleProvider:
validation = c.prepareMigrateInstanceOIDCToGoogleProvider(instanceAgg, writeModel, p)
default:
return nil, caos_errs.ThrowInvalidArgument(nil, "COMMAND-s9219", "Errors.IDPConfig.NotExisting")
}
cmds, err := preparation.PrepareCommands(ctx, c.eventstore.Filter, validation)
if err != nil {
return nil, err
}
pushedEvents, err := c.eventstore.Push(ctx, cmds...)
if err != nil {
return nil, err
}
return pushedEventsToObjectDetails(pushedEvents), nil
}
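
The two exported wrappers dispatch on the concrete provider type, so a caller migrates a generic OIDC IDP by handing over the full target-provider configuration in one call. A hypothetical caller sketch follows; the import paths follow the zitadel module layout seen elsewhere in this diff, and the provider field values are placeholders, not part of this change.

```go
// Hypothetical caller (not part of this change): invoking the new
// instance-level migration command from code with access to *command.Commands.
package example

import (
	"context"

	"github.com/zitadel/zitadel/internal/command"
	"github.com/zitadel/zitadel/internal/domain"
	"github.com/zitadel/zitadel/internal/repository/idp"
)

func migrateInstanceIDPToGoogle(ctx context.Context, cmds *command.Commands, idpID string) (*domain.ObjectDetails, error) {
	return cmds.MigrateInstanceGenericOIDCToGoogleProvider(ctx, idpID, command.GoogleProvider{
		ClientID:     "google-client-id",
		ClientSecret: "google-client-secret",
		Scopes:       []string{"openid"},
		IDPOptions:   idp.Options{IsLinkingAllowed: true},
	})
}
```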
func (c *Commands) AddInstanceJWTProvider(ctx context.Context, provider JWTProvider) (string, *domain.ObjectDetails, error) { func (c *Commands) AddInstanceJWTProvider(ctx context.Context, provider JWTProvider) (string, *domain.ObjectDetails, error) {
instanceID := authz.GetInstance(ctx).InstanceID() instanceID := authz.GetInstance(ctx).InstanceID()
instanceAgg := instance.NewAggregate(instanceID) instanceAgg := instance.NewAggregate(instanceID)
@ -552,7 +586,7 @@ func (c *Commands) prepareUpdateInstanceOAuthProvider(a *instance.Aggregate, wri
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-D3r1s", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-D3r1s", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -646,7 +680,7 @@ func (c *Commands) prepareUpdateInstanceOIDCProvider(a *instance.Aggregate, writ
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Dg331", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-Dg331", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -669,6 +703,91 @@ func (c *Commands) prepareUpdateInstanceOIDCProvider(a *instance.Aggregate, writ
} }
} }
func (c *Commands) prepareMigrateInstanceOIDCToAzureADProvider(a *instance.Aggregate, writeModel *InstanceOIDCIDPWriteModel, provider AzureADProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) {
if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "INST-sdf3g", "Errors.Invalid.Argument")
}
if provider.ClientID = strings.TrimSpace(provider.ClientID); provider.ClientID == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "INST-Fhbr2", "Errors.Invalid.Argument")
}
if provider.ClientSecret = strings.TrimSpace(provider.ClientSecret); provider.ClientSecret == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "INST-Dzh3g", "Errors.Invalid.Argument")
}
return func(ctx context.Context, filter preparation.FilterToQueryReducer) ([]eventstore.Command, error) {
events, err := filter(ctx, writeModel.Query())
if err != nil {
return nil, err
}
writeModel.AppendEvents(events...)
if err = writeModel.Reduce(); err != nil {
return nil, err
}
if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Dg29201", "Errors.IDPConfig.NotExisting")
}
secret, err := crypto.Encrypt([]byte(provider.ClientSecret), c.idpConfigEncryption)
if err != nil {
return nil, err
}
return []eventstore.Command{
instance.NewOIDCIDPMigratedAzureADEvent(
ctx,
&a.Aggregate,
writeModel.ID,
provider.Name,
provider.ClientID,
secret,
provider.Scopes,
provider.Tenant,
provider.EmailVerified,
provider.IDPOptions,
),
}, nil
}, nil
}
}
func (c *Commands) prepareMigrateInstanceOIDCToGoogleProvider(a *instance.Aggregate, writeModel *InstanceOIDCIDPWriteModel, provider GoogleProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) {
if provider.ClientID = strings.TrimSpace(provider.ClientID); provider.ClientID == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "INST-D3fvs", "Errors.Invalid.Argument")
}
if provider.ClientSecret = strings.TrimSpace(provider.ClientSecret); provider.ClientSecret == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "INST-W2vqs", "Errors.Invalid.Argument")
}
return func(ctx context.Context, filter preparation.FilterToQueryReducer) ([]eventstore.Command, error) {
events, err := filter(ctx, writeModel.Query())
if err != nil {
return nil, err
}
writeModel.AppendEvents(events...)
if err = writeModel.Reduce(); err != nil {
return nil, err
}
if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Dg29202", "Errors.IDPConfig.NotExisting")
}
secret, err := crypto.Encrypt([]byte(provider.ClientSecret), c.idpConfigEncryption)
if err != nil {
return nil, err
}
return []eventstore.Command{
instance.NewOIDCIDPMigratedGoogleEvent(
ctx,
&a.Aggregate,
writeModel.ID,
provider.Name,
provider.ClientID,
secret,
provider.Scopes,
provider.IDPOptions,
),
}, nil
}, nil
}
}
func (c *Commands) prepareAddInstanceJWTProvider(a *instance.Aggregate, writeModel *InstanceJWTIDPWriteModel, provider JWTProvider) preparation.Validation { func (c *Commands) prepareAddInstanceJWTProvider(a *instance.Aggregate, writeModel *InstanceJWTIDPWriteModel, provider JWTProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) { return func() (preparation.CreateCommands, error) {
if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" { if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" {
@ -742,7 +861,7 @@ func (c *Commands) prepareUpdateInstanceJWTProvider(a *instance.Aggregate, write
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Bhju5", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-Bhju5", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -826,7 +945,7 @@ func (c *Commands) prepareUpdateInstanceAzureADProvider(a *instance.Aggregate, w
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-BHz3q", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-BHz3q", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -904,7 +1023,7 @@ func (c *Commands) prepareUpdateInstanceGitHubProvider(a *instance.Aggregate, wr
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Dr1gs", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-Dr1gs", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1007,7 +1126,7 @@ func (c *Commands) prepareUpdateInstanceGitHubEnterpriseProvider(a *instance.Agg
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-GBr42", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-GBr42", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1086,7 +1205,7 @@ func (c *Commands) prepareUpdateInstanceGitLabProvider(a *instance.Aggregate, wr
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-HBReq", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-HBReq", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1175,7 +1294,7 @@ func (c *Commands) prepareUpdateInstanceGitLabSelfHostedProvider(a *instance.Agg
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-D2tg1", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-D2tg1", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1252,7 +1371,7 @@ func (c *Commands) prepareUpdateInstanceGoogleProvider(a *instance.Aggregate, wr
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-D3r1s", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-D3r1s", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1371,7 +1490,7 @@ func (c *Commands) prepareUpdateInstanceLDAPProvider(a *instance.Aggregate, writ
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-ASF3F", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-ASF3F", "Errors.IDPConfig.NotExisting")
} }
event, err := writeModel.NewChangedEvent( event, err := writeModel.NewChangedEvent(
ctx, ctx,
@ -1412,7 +1531,7 @@ func (c *Commands) prepareDeleteInstanceProvider(a *instance.Aggregate, id strin
return nil, err return nil, err
} }
if !writeModel.State.Exists() { if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Se3tg", "Errors.Instance.IDPConfig.NotExisting") return nil, caos_errs.ThrowNotFound(nil, "INST-Se3tg", "Errors.IDPConfig.NotExisting")
} }
return []eventstore.Command{instance.NewIDPRemovedEvent(ctx, &a.Aggregate, id)}, nil return []eventstore.Command{instance.NewIDPRemovedEvent(ctx, &a.Aggregate, id)}, nil
}, nil }, nil

View File

@ -113,6 +113,10 @@ func (wm *InstanceOIDCIDPWriteModel) AppendEvents(events ...eventstore.Event) {
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPChangedEvent) wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPChangedEvent)
case *instance.IDPRemovedEvent: case *instance.IDPRemovedEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.OIDCIDPWriteModel.AppendEvents(&e.RemovedEvent)
case *instance.OIDCIDPMigratedAzureADEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedAzureADEvent)
case *instance.OIDCIDPMigratedGoogleEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedGoogleEvent)
// old events // old events
case *instance.IDPConfigAddedEvent: case *instance.IDPConfigAddedEvent:
@ -141,6 +145,8 @@ func (wm *InstanceOIDCIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
instance.OIDCIDPAddedEventType, instance.OIDCIDPAddedEventType,
instance.OIDCIDPChangedEventType, instance.OIDCIDPChangedEventType,
instance.IDPRemovedEventType, instance.IDPRemovedEventType,
instance.OIDCIDPMigratedAzureADEventType,
instance.OIDCIDPMigratedGoogleEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).
Or(). // old events Or(). // old events
@ -305,6 +311,8 @@ func (wm *InstanceAzureADIDPWriteModel) AppendEvents(events ...eventstore.Event)
wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPAddedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPAddedEvent)
case *instance.AzureADIDPChangedEvent: case *instance.AzureADIDPChangedEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPChangedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPChangedEvent)
case *instance.OIDCIDPMigratedAzureADEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedAzureADEvent)
case *instance.IDPRemovedEvent: case *instance.IDPRemovedEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.RemovedEvent)
default: default:
@ -322,6 +330,7 @@ func (wm *InstanceAzureADIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
EventTypes( EventTypes(
instance.AzureADIDPAddedEventType, instance.AzureADIDPAddedEventType,
instance.AzureADIDPChangedEventType, instance.AzureADIDPChangedEventType,
instance.OIDCIDPMigratedAzureADEventType,
instance.IDPRemovedEventType, instance.IDPRemovedEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).
@ -655,6 +664,8 @@ func (wm *InstanceGoogleIDPWriteModel) AppendEvents(events ...eventstore.Event)
wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPAddedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPAddedEvent)
case *instance.GoogleIDPChangedEvent: case *instance.GoogleIDPChangedEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPChangedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPChangedEvent)
case *instance.OIDCIDPMigratedGoogleEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedGoogleEvent)
case *instance.IDPRemovedEvent: case *instance.IDPRemovedEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.RemovedEvent)
} }
@ -670,6 +681,7 @@ func (wm *InstanceGoogleIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
EventTypes( EventTypes(
instance.GoogleIDPAddedEventType, instance.GoogleIDPAddedEventType,
instance.GoogleIDPChangedEventType, instance.GoogleIDPChangedEventType,
instance.OIDCIDPMigratedGoogleEventType,
instance.IDPRemovedEventType, instance.IDPRemovedEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).

View File

@ -1102,6 +1102,474 @@ func TestCommandSide_UpdateInstanceGenericOIDCIDP(t *testing.T) {
} }
} }
func TestCommandSide_MigrateInstanceGenericOIDCToAzureADProvider(t *testing.T) {
type fields struct {
eventstore *eventstore.Eventstore
secretCrypto crypto.EncryptionAlgorithm
}
type args struct {
ctx context.Context
id string
provider AzureADProvider
}
type res struct {
want *domain.ObjectDetails
err func(error) bool
}
tests := []struct {
name string
fields fields
args args
res res
}{
{
"invalid name",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
provider: AzureADProvider{},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "INST-sdf3g", ""))
},
},
},
{
"invalid client id",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
provider: AzureADProvider{
Name: "name",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "INST-Fhbr2", ""))
},
},
},
{
"invalid client secret",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "INST-Dzh3g", ""))
},
},
},
{
name: "not found",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(),
),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
err: caos_errors.IsNotFound,
},
},
{
name: "migrate ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
instance.NewOIDCIDPAddedEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
[]*repository.Event{
eventFromEventPusherWithInstanceID(
"instance1",
func() eventstore.Command {
event := instance.NewOIDCIDPMigratedAzureADEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
"",
false,
idp.Options{},
)
return event
}(),
),
},
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "instance1"},
},
},
{
name: "migrate ok full",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
instance.NewOIDCIDPAddedEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
[]*repository.Event{
eventFromEventPusherWithInstanceID(
"instance1",
func() eventstore.Command {
event := instance.NewOIDCIDPMigratedAzureADEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
[]string{"openid"},
"tenant",
true,
idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
)
return event
}(),
),
},
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
Scopes: []string{"openid"},
Tenant: "tenant",
EmailVerified: true,
IDPOptions: idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "instance1"},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
c := &Commands{
eventstore: tt.fields.eventstore,
idpConfigEncryption: tt.fields.secretCrypto,
}
got, err := c.MigrateInstanceGenericOIDCToAzureADProvider(tt.args.ctx, tt.args.id, tt.args.provider)
if tt.res.err == nil {
assert.NoError(t, err)
}
if tt.res.err != nil && !tt.res.err(err) {
t.Errorf("got wrong err: %v ", err)
}
if tt.res.err == nil {
assert.Equal(t, tt.res.want, got)
}
})
}
}
func TestCommandSide_MigrateInstanceOIDCToGoogleIDP(t *testing.T) {
type fields struct {
eventstore *eventstore.Eventstore
secretCrypto crypto.EncryptionAlgorithm
}
type args struct {
ctx context.Context
id string
provider GoogleProvider
}
type res struct {
want *domain.ObjectDetails
err func(error) bool
}
tests := []struct {
name string
fields fields
args args
res res
}{
{
"invalid clientID",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
provider: GoogleProvider{},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "INST-D3fvs", ""))
},
},
},
{
"invalid clientSecret",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
provider: GoogleProvider{
ClientID: "clientID",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "INST-W2vqs", ""))
},
},
},
{
name: "not found",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(),
),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
err: caos_errors.IsNotFound,
},
},
{
name: "migrate ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
instance.NewOIDCIDPAddedEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
[]*repository.Event{
eventFromEventPusherWithInstanceID(
"instance1",
instance.NewOIDCIDPMigratedGoogleEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
idp.Options{},
)),
},
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "instance1"},
},
},
{
name: "migrate ok full",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
instance.NewOIDCIDPAddedEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
[]*repository.Event{
eventFromEventPusherWithInstanceID(
"instance1",
instance.NewOIDCIDPMigratedGoogleEvent(context.Background(), &instance.NewAggregate("instance1").Aggregate,
"id1",
"",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
[]string{"openid"},
idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
)),
},
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: authz.WithInstanceID(context.Background(), "instance1"),
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
Scopes: []string{"openid"},
IDPOptions: idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "instance1"},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
c := &Commands{
eventstore: tt.fields.eventstore,
idpConfigEncryption: tt.fields.secretCrypto,
}
got, err := c.MigrateInstanceGenericOIDCToGoogleProvider(tt.args.ctx, tt.args.id, tt.args.provider)
if tt.res.err == nil {
assert.NoError(t, err)
}
if tt.res.err != nil && !tt.res.err(err) {
t.Errorf("got wrong err: %v ", err)
}
if tt.res.err == nil {
assert.Equal(t, tt.res.want, got)
}
})
}
}
func TestCommandSide_AddInstanceAzureADIDP(t *testing.T) { func TestCommandSide_AddInstanceAzureADIDP(t *testing.T) {
type fields struct { type fields struct {
eventstore *eventstore.Eventstore eventstore *eventstore.Eventstore

View File

@ -92,6 +92,39 @@ func (c *Commands) UpdateOrgGenericOIDCProvider(ctx context.Context, resourceOwn
return pushedEventsToObjectDetails(pushedEvents), nil return pushedEventsToObjectDetails(pushedEvents), nil
} }
func (c *Commands) MigrateOrgGenericOIDCToAzureADProvider(ctx context.Context, resourceOwner, id string, provider AzureADProvider) (*domain.ObjectDetails, error) {
return c.migrateOrgGenericOIDC(ctx, resourceOwner, id, provider)
}
func (c *Commands) MigrateOrgGenericOIDCToGoogleProvider(ctx context.Context, resourceOwner, id string, provider GoogleProvider) (*domain.ObjectDetails, error) {
return c.migrateOrgGenericOIDC(ctx, resourceOwner, id, provider)
}
func (c *Commands) migrateOrgGenericOIDC(ctx context.Context, resourceOwner, id string, provider interface{}) (*domain.ObjectDetails, error) {
orgAgg := org.NewAggregate(resourceOwner)
writeModel := NewOIDCOrgIDPWriteModel(resourceOwner, id)
var validation preparation.Validation
switch p := provider.(type) {
case AzureADProvider:
validation = c.prepareMigrateOrgOIDCToAzureADProvider(orgAgg, writeModel, p)
case GoogleProvider:
validation = c.prepareMigrateOrgOIDCToGoogleProvider(orgAgg, writeModel, p)
default:
return nil, caos_errs.ThrowInvalidArgument(nil, "COMMAND-s9s2919", "Errors.IDPConfig.NotExisting")
}
cmds, err := preparation.PrepareCommands(ctx, c.eventstore.Filter, validation)
if err != nil {
return nil, err
}
pushedEvents, err := c.eventstore.Push(ctx, cmds...)
if err != nil {
return nil, err
}
return pushedEventsToObjectDetails(pushedEvents), nil
}
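
The org-level counterparts take the resource owner in addition to the IDP id. A hypothetical sketch for the Azure AD direction, under the same assumptions as the instance-level example above (placeholder field values, assumed import paths):

```go
// Hypothetical caller (not part of this change): migrating an org's generic
// OIDC IDP to Azure AD.
package example

import (
	"context"

	"github.com/zitadel/zitadel/internal/command"
	"github.com/zitadel/zitadel/internal/domain"
	"github.com/zitadel/zitadel/internal/repository/idp"
)

func migrateOrgIDPToAzureAD(ctx context.Context, cmds *command.Commands, orgID, idpID string) (*domain.ObjectDetails, error) {
	return cmds.MigrateOrgGenericOIDCToAzureADProvider(ctx, orgID, idpID, command.AzureADProvider{
		Name:          "Azure AD",
		ClientID:      "azure-client-id",
		ClientSecret:  "azure-client-secret",
		Tenant:        "common",
		EmailVerified: true,
		IDPOptions:    idp.Options{IsLinkingAllowed: true},
	})
}
```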
func (c *Commands) AddOrgJWTProvider(ctx context.Context, resourceOwner string, provider JWTProvider) (string, *domain.ObjectDetails, error) { func (c *Commands) AddOrgJWTProvider(ctx context.Context, resourceOwner string, provider JWTProvider) (string, *domain.ObjectDetails, error) {
orgAgg := org.NewAggregate(resourceOwner) orgAgg := org.NewAggregate(resourceOwner)
id, err := c.idGenerator.Next() id, err := c.idGenerator.Next()
@ -647,6 +680,91 @@ func (c *Commands) prepareUpdateOrgOIDCProvider(a *org.Aggregate, writeModel *Or
} }
} }
func (c *Commands) prepareMigrateOrgOIDCToAzureADProvider(a *org.Aggregate, writeModel *OrgOIDCIDPWriteModel, provider AzureADProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) {
if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "ORG-sdf3g", "Errors.Invalid.Argument")
}
if provider.ClientID = strings.TrimSpace(provider.ClientID); provider.ClientID == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "ORG-Fhbr2", "Errors.Invalid.Argument")
}
if provider.ClientSecret = strings.TrimSpace(provider.ClientSecret); provider.ClientSecret == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "ORG-Dzh3g", "Errors.Invalid.Argument")
}
return func(ctx context.Context, filter preparation.FilterToQueryReducer) ([]eventstore.Command, error) {
events, err := filter(ctx, writeModel.Query())
if err != nil {
return nil, err
}
writeModel.AppendEvents(events...)
if err = writeModel.Reduce(); err != nil {
return nil, err
}
if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-Dg239201", "Errors.Instance.IDPConfig.NotExisting")
}
secret, err := crypto.Encrypt([]byte(provider.ClientSecret), c.idpConfigEncryption)
if err != nil {
return nil, err
}
return []eventstore.Command{
org.NewOIDCIDPMigratedAzureADEvent(
ctx,
&a.Aggregate,
writeModel.ID,
provider.Name,
provider.ClientID,
secret,
provider.Scopes,
provider.Tenant,
provider.EmailVerified,
provider.IDPOptions,
),
}, nil
}, nil
}
}
func (c *Commands) prepareMigrateOrgOIDCToGoogleProvider(a *org.Aggregate, writeModel *OrgOIDCIDPWriteModel, provider GoogleProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) {
if provider.ClientID = strings.TrimSpace(provider.ClientID); provider.ClientID == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "ORG-D3fvs", "Errors.Invalid.Argument")
}
if provider.ClientSecret = strings.TrimSpace(provider.ClientSecret); provider.ClientSecret == "" {
return nil, caos_errs.ThrowInvalidArgument(nil, "ORG-W2vqs", "Errors.Invalid.Argument")
}
return func(ctx context.Context, filter preparation.FilterToQueryReducer) ([]eventstore.Command, error) {
events, err := filter(ctx, writeModel.Query())
if err != nil {
return nil, err
}
writeModel.AppendEvents(events...)
if err = writeModel.Reduce(); err != nil {
return nil, err
}
if !writeModel.State.Exists() {
return nil, caos_errs.ThrowNotFound(nil, "INST-x09981", "Errors.Instance.IDPConfig.NotExisting")
}
secret, err := crypto.Encrypt([]byte(provider.ClientSecret), c.idpConfigEncryption)
if err != nil {
return nil, err
}
return []eventstore.Command{
org.NewOIDCIDPMigratedGoogleEvent(
ctx,
&a.Aggregate,
writeModel.ID,
provider.Name,
provider.ClientID,
secret,
provider.Scopes,
provider.IDPOptions,
),
}, nil
}, nil
}
}
func (c *Commands) prepareAddOrgJWTProvider(a *org.Aggregate, writeModel *OrgJWTIDPWriteModel, provider JWTProvider) preparation.Validation { func (c *Commands) prepareAddOrgJWTProvider(a *org.Aggregate, writeModel *OrgJWTIDPWriteModel, provider JWTProvider) preparation.Validation {
return func() (preparation.CreateCommands, error) { return func() (preparation.CreateCommands, error) {
if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" { if provider.Name = strings.TrimSpace(provider.Name); provider.Name == "" {

View File

@ -113,6 +113,10 @@ func (wm *OrgOIDCIDPWriteModel) AppendEvents(events ...eventstore.Event) {
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPAddedEvent) wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPAddedEvent)
case *org.OIDCIDPChangedEvent: case *org.OIDCIDPChangedEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPChangedEvent) wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPChangedEvent)
case *org.OIDCIDPMigratedAzureADEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedAzureADEvent)
case *org.OIDCIDPMigratedGoogleEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedGoogleEvent)
case *org.IDPRemovedEvent: case *org.IDPRemovedEvent:
wm.OIDCIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.OIDCIDPWriteModel.AppendEvents(&e.RemovedEvent)
@ -142,6 +146,8 @@ func (wm *OrgOIDCIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
EventTypes( EventTypes(
org.OIDCIDPAddedEventType, org.OIDCIDPAddedEventType,
org.OIDCIDPChangedEventType, org.OIDCIDPChangedEventType,
org.OIDCIDPMigratedAzureADEventType,
org.OIDCIDPMigratedGoogleEventType,
org.IDPRemovedEventType, org.IDPRemovedEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).
@ -311,6 +317,8 @@ func (wm *OrgAzureADIDPWriteModel) AppendEvents(events ...eventstore.Event) {
wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPAddedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPAddedEvent)
case *org.AzureADIDPChangedEvent: case *org.AzureADIDPChangedEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPChangedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.AzureADIDPChangedEvent)
case *org.OIDCIDPMigratedAzureADEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedAzureADEvent)
case *org.IDPRemovedEvent: case *org.IDPRemovedEvent:
wm.AzureADIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.AzureADIDPWriteModel.AppendEvents(&e.RemovedEvent)
default: default:
@ -328,6 +336,7 @@ func (wm *OrgAzureADIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
EventTypes( EventTypes(
org.AzureADIDPAddedEventType, org.AzureADIDPAddedEventType,
org.AzureADIDPChangedEventType, org.AzureADIDPChangedEventType,
org.OIDCIDPMigratedAzureADEventType,
org.IDPRemovedEventType, org.IDPRemovedEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).
@ -663,6 +672,8 @@ func (wm *OrgGoogleIDPWriteModel) AppendEvents(events ...eventstore.Event) {
wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPAddedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPAddedEvent)
case *org.GoogleIDPChangedEvent: case *org.GoogleIDPChangedEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPChangedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.GoogleIDPChangedEvent)
case *org.OIDCIDPMigratedGoogleEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.OIDCIDPMigratedGoogleEvent)
case *org.IDPRemovedEvent: case *org.IDPRemovedEvent:
wm.GoogleIDPWriteModel.AppendEvents(&e.RemovedEvent) wm.GoogleIDPWriteModel.AppendEvents(&e.RemovedEvent)
default: default:
@ -680,6 +691,7 @@ func (wm *OrgGoogleIDPWriteModel) Query() *eventstore.SearchQueryBuilder {
EventTypes( EventTypes(
org.GoogleIDPAddedEventType, org.GoogleIDPAddedEventType,
org.GoogleIDPChangedEventType, org.GoogleIDPChangedEventType,
org.OIDCIDPMigratedGoogleEventType,
org.IDPRemovedEventType, org.IDPRemovedEventType,
). ).
EventData(map[string]interface{}{"id": wm.ID}). EventData(map[string]interface{}{"id": wm.ID}).

View File

@ -1119,6 +1119,474 @@ func TestCommandSide_UpdateOrgGenericOIDCIDP(t *testing.T) {
} }
} }
func TestCommandSide_MigrateOrgGenericOIDCToAzureADProvider(t *testing.T) {
type fields struct {
eventstore *eventstore.Eventstore
secretCrypto crypto.EncryptionAlgorithm
}
type args struct {
ctx context.Context
resourceOwner string
id string
provider AzureADProvider
}
type res struct {
want *domain.ObjectDetails
err func(error) bool
}
tests := []struct {
name string
fields fields
args args
res res
}{
{
"invalid name",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
provider: AzureADProvider{},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "ORG-sdf3g", ""))
},
},
},
{
"invalid client id",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
provider: AzureADProvider{
Name: "name",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "ORG-Fhbr2", ""))
},
},
},
{
"invalid client secret",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "ORG-Dzh3g", ""))
},
},
},
{
name: "not found",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(),
),
},
args: args{
ctx: context.Background(),
resourceOwner: "ro",
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
err: caos_errors.IsNotFound,
},
},
{
name: "migrate ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
org.NewOIDCIDPAddedEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
[]*repository.Event{
eventFromEventPusher(
func() eventstore.Command {
event := org.NewOIDCIDPMigratedAzureADEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
"",
false,
idp.Options{},
)
return event
}(),
),
},
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "org1"},
},
},
{
name: "migrate full ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
org.NewOIDCIDPAddedEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
eventPusherToEvents(
org.NewOIDCIDPMigratedAzureADEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
[]string{"openid"},
"tenant",
true,
idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
)),
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: AzureADProvider{
Name: "name",
ClientID: "clientID",
ClientSecret: "clientSecret",
Scopes: []string{"openid"},
Tenant: "tenant",
EmailVerified: true,
IDPOptions: idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "org1"},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
c := &Commands{
eventstore: tt.fields.eventstore,
idpConfigEncryption: tt.fields.secretCrypto,
}
got, err := c.MigrateOrgGenericOIDCToAzureADProvider(tt.args.ctx, tt.args.resourceOwner, tt.args.id, tt.args.provider)
if tt.res.err == nil {
assert.NoError(t, err)
}
if tt.res.err != nil && !tt.res.err(err) {
t.Errorf("got wrong err: %v ", err)
}
if tt.res.err == nil {
assert.Equal(t, tt.res.want, got)
}
})
}
}
func TestCommandSide_MigrateOrgOIDCToGoogleIDP(t *testing.T) {
type fields struct {
eventstore *eventstore.Eventstore
secretCrypto crypto.EncryptionAlgorithm
}
type args struct {
ctx context.Context
resourceOwner string
id string
provider GoogleProvider
}
type res struct {
want *domain.ObjectDetails
err func(error) bool
}
tests := []struct {
name string
fields fields
args args
res res
}{
{
"invalid clientID",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: GoogleProvider{},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "ORG-D3fvs", ""))
},
},
},
{
"invalid clientSecret",
fields{
eventstore: eventstoreExpect(t),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
},
},
res{
err: func(err error) bool {
return errors.Is(err, caos_errors.ThrowInvalidArgument(nil, "ORG-W2vqs", ""))
},
},
},
{
"not found",
fields{
eventstore: eventstoreExpect(t,
expectFilter(),
),
},
args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res{
err: caos_errors.IsNotFound,
},
},
{
name: "migrate ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
org.NewOIDCIDPAddedEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
eventPusherToEvents(
org.NewOIDCIDPMigratedGoogleEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
idp.Options{},
)),
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "org1"},
},
},
{
name: "migrate full ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
org.NewOIDCIDPAddedEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"name",
"issuer",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
nil,
false,
idp.Options{},
)),
),
expectPush(
eventPusherToEvents(
org.NewOIDCIDPMigratedGoogleEvent(context.Background(), &org.NewAggregate("org1").Aggregate,
"id1",
"",
"clientID",
&crypto.CryptoValue{
CryptoType: crypto.TypeEncryption,
Algorithm: "enc",
KeyID: "id",
Crypted: []byte("clientSecret"),
},
[]string{"openid"},
idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
)),
),
),
secretCrypto: crypto.CreateMockEncryptionAlg(gomock.NewController(t)),
},
args: args{
ctx: context.Background(),
resourceOwner: "org1",
id: "id1",
provider: GoogleProvider{
ClientID: "clientID",
ClientSecret: "clientSecret",
Scopes: []string{"openid"},
IDPOptions: idp.Options{
IsCreationAllowed: true,
IsLinkingAllowed: true,
IsAutoCreation: true,
IsAutoUpdate: true,
},
},
},
res: res{
want: &domain.ObjectDetails{ResourceOwner: "org1"},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
c := &Commands{
eventstore: tt.fields.eventstore,
idpConfigEncryption: tt.fields.secretCrypto,
}
got, err := c.MigrateOrgGenericOIDCToGoogleProvider(tt.args.ctx, tt.args.resourceOwner, tt.args.id, tt.args.provider)
if tt.res.err == nil {
assert.NoError(t, err)
}
if tt.res.err != nil && !tt.res.err(err) {
t.Errorf("got wrong err: %v ", err)
}
if tt.res.err == nil {
assert.Equal(t, tt.res.want, got)
}
})
}
}
func TestCommandSide_AddOrgAzureADIDP(t *testing.T) { func TestCommandSide_AddOrgAzureADIDP(t *testing.T) {
type fields struct { type fields struct {
eventstore *eventstore.Eventstore eventstore *eventstore.Eventstore

View File

@ -16,10 +16,10 @@ import (
"github.com/zitadel/zitadel/internal/telemetry/tracing" "github.com/zitadel/zitadel/internal/telemetry/tracing"
) )
type SessionCheck func(ctx context.Context, cmd *SessionChecks) error type SessionCommand func(ctx context.Context, cmd *SessionCommands) error
type SessionChecks struct { type SessionCommands struct {
checks []SessionCheck cmds []SessionCommand
sessionWriteModel *SessionWriteModel sessionWriteModel *SessionWriteModel
passwordWriteModel *HumanPasswordWriteModel passwordWriteModel *HumanPasswordWriteModel
@ -29,9 +29,9 @@ type SessionChecks struct {
now func() time.Time now func() time.Time
} }
func (c *Commands) NewSessionChecks(checks []SessionCheck, session *SessionWriteModel) *SessionChecks { func (c *Commands) NewSessionCommands(cmds []SessionCommand, session *SessionWriteModel) *SessionCommands {
return &SessionChecks{ return &SessionCommands{
checks: checks, cmds: cmds,
sessionWriteModel: session, sessionWriteModel: session,
eventstore: c.eventstore, eventstore: c.eventstore,
userPasswordAlg: c.userPasswordAlg, userPasswordAlg: c.userPasswordAlg,
@ -41,8 +41,8 @@ func (c *Commands) NewSessionChecks(checks []SessionCheck, session *SessionWrite
} }
// CheckUser defines a user check to be executed for a session update // CheckUser defines a user check to be executed for a session update
func CheckUser(id string) SessionCheck { func CheckUser(id string) SessionCommand {
return func(ctx context.Context, cmd *SessionChecks) error { return func(ctx context.Context, cmd *SessionCommands) error {
if cmd.sessionWriteModel.UserID != "" && id != "" && cmd.sessionWriteModel.UserID != id { if cmd.sessionWriteModel.UserID != "" && id != "" && cmd.sessionWriteModel.UserID != id {
return caos_errs.ThrowInvalidArgument(nil, "", "user change not possible") return caos_errs.ThrowInvalidArgument(nil, "", "user change not possible")
} }
@ -51,8 +51,8 @@ func CheckUser(id string) SessionCheck {
} }
// CheckPassword defines a password check to be executed for a session update // CheckPassword defines a password check to be executed for a session update
func CheckPassword(password string) SessionCheck { func CheckPassword(password string) SessionCommand {
return func(ctx context.Context, cmd *SessionChecks) error { return func(ctx context.Context, cmd *SessionCommands) error {
if cmd.sessionWriteModel.UserID == "" { if cmd.sessionWriteModel.UserID == "" {
return caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Sfw3f", "Errors.User.UserIDMissing") return caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Sfw3f", "Errors.User.UserIDMissing")
} }
@ -80,17 +80,32 @@ func CheckPassword(password string) SessionCheck {
} }
} }
// Check will execute the checks specified and return an error on the first occurrence // Exec will execute the commands specified and return an error on the first occurrence
func (s *SessionChecks) Check(ctx context.Context) error { func (s *SessionCommands) Exec(ctx context.Context) error {
for _, check := range s.checks { for _, cmd := range s.cmds {
if err := check(ctx, s); err != nil { if err := cmd(ctx, s); err != nil {
return err return err
} }
} }
return nil return nil
} }
func (s *SessionChecks) commands(ctx context.Context) (string, []eventstore.Command, error) { func (s *SessionCommands) gethumanWriteModel(ctx context.Context) (*HumanWriteModel, error) {
if s.sessionWriteModel.UserID == "" {
return nil, caos_errs.ThrowPreconditionFailed(nil, "COMMAND-eeR2e", "Errors.User.UserIDMissing")
}
humanWriteModel := NewHumanWriteModel(s.sessionWriteModel.UserID, s.sessionWriteModel.ResourceOwner)
err := s.eventstore.FilterToQueryReducer(ctx, humanWriteModel)
if err != nil {
return nil, err
}
if humanWriteModel.UserState != domain.UserStateActive {
return nil, caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Df4b3", "Errors.ie4Ai.NotFound")
}
return humanWriteModel, nil
}
func (s *SessionCommands) commands(ctx context.Context) (string, []eventstore.Command, error) {
if len(s.sessionWriteModel.commands) == 0 { if len(s.sessionWriteModel.commands) == 0 {
return "", nil, nil return "", nil, nil
} }
@ -103,7 +118,7 @@ func (s *SessionChecks) commands(ctx context.Context) (string, []eventstore.Comm
return token, s.sessionWriteModel.commands, nil return token, s.sessionWriteModel.commands, nil
} }
func (c *Commands) CreateSession(ctx context.Context, checks []SessionCheck, metadata map[string][]byte) (set *SessionChanged, err error) { func (c *Commands) CreateSession(ctx context.Context, cmds []SessionCommand, metadata map[string][]byte) (set *SessionChanged, err error) {
sessionID, err := c.idGenerator.Next() sessionID, err := c.idGenerator.Next()
if err != nil { if err != nil {
return nil, err return nil, err
@ -113,12 +128,12 @@ func (c *Commands) CreateSession(ctx context.Context, checks []SessionCheck, met
if err != nil { if err != nil {
return nil, err return nil, err
} }
cmd := c.NewSessionChecks(checks, sessionWriteModel) cmd := c.NewSessionCommands(cmds, sessionWriteModel)
cmd.sessionWriteModel.Start(ctx) cmd.sessionWriteModel.Start(ctx)
return c.updateSession(ctx, cmd, metadata) return c.updateSession(ctx, cmd, metadata)
} }
func (c *Commands) UpdateSession(ctx context.Context, sessionID, sessionToken string, checks []SessionCheck, metadata map[string][]byte) (set *SessionChanged, err error) { func (c *Commands) UpdateSession(ctx context.Context, sessionID, sessionToken string, cmds []SessionCommand, metadata map[string][]byte) (set *SessionChanged, err error) {
sessionWriteModel := NewSessionWriteModel(sessionID, authz.GetCtxData(ctx).OrgID) sessionWriteModel := NewSessionWriteModel(sessionID, authz.GetCtxData(ctx).OrgID)
err = c.eventstore.FilterToQueryReducer(ctx, sessionWriteModel) err = c.eventstore.FilterToQueryReducer(ctx, sessionWriteModel)
if err != nil { if err != nil {
@ -127,7 +142,7 @@ func (c *Commands) UpdateSession(ctx context.Context, sessionID, sessionToken st
if err := c.sessionPermission(ctx, sessionWriteModel, sessionToken, domain.PermissionSessionWrite); err != nil { if err := c.sessionPermission(ctx, sessionWriteModel, sessionToken, domain.PermissionSessionWrite); err != nil {
return nil, err return nil, err
} }
cmd := c.NewSessionChecks(checks, sessionWriteModel) cmd := c.NewSessionCommands(cmds, sessionWriteModel)
return c.updateSession(ctx, cmd, metadata) return c.updateSession(ctx, cmd, metadata)
} }
@ -154,12 +169,12 @@ func (c *Commands) TerminateSession(ctx context.Context, sessionID, sessionToken
return writeModelToObjectDetails(&sessionWriteModel.WriteModel), nil return writeModelToObjectDetails(&sessionWriteModel.WriteModel), nil
} }
// updateSession execute the [SessionChecks] where new events will be created and as well as for metadata (changes) // updateSession executes the [SessionCommands], creating new events for the commands as well as for metadata (changes)
func (c *Commands) updateSession(ctx context.Context, checks *SessionChecks, metadata map[string][]byte) (set *SessionChanged, err error) { func (c *Commands) updateSession(ctx context.Context, checks *SessionCommands, metadata map[string][]byte) (set *SessionChanged, err error) {
if checks.sessionWriteModel.State == domain.SessionStateTerminated { if checks.sessionWriteModel.State == domain.SessionStateTerminated {
return nil, caos_errs.ThrowPreconditionFailed(nil, "COMAND-SAjeh", "Errors.Session.Terminated") return nil, caos_errs.ThrowPreconditionFailed(nil, "COMAND-SAjeh", "Errors.Session.Terminated")
} }
if err := checks.Check(ctx); err != nil { if err := checks.Exec(ctx); err != nil {
// TODO: how to handle failed checks (e.g. pw wrong) https://github.com/zitadel/zitadel/issues/5807 // TODO: how to handle failed checks (e.g. pw wrong) https://github.com/zitadel/zitadel/issues/5807
return nil, err return nil, err
} }
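
With the rename from SessionCheck/SessionChecks to SessionCommand/SessionCommands, callers pass a slice of commands that Exec runs in order, failing on the first error. A hypothetical sketch of creating a password-checked session follows; the import path and metadata values are assumptions for illustration.

```go
// Hypothetical caller (not part of this change): the renamed SessionCommand API.
package example

import (
	"context"

	"github.com/zitadel/zitadel/internal/command"
)

func createPasswordSession(ctx context.Context, cmds *command.Commands, userID, password string) error {
	_, err := cmds.CreateSession(ctx,
		[]command.SessionCommand{
			command.CheckUser(userID),       // binds the session to the user
			command.CheckPassword(password), // verified against the user's password write model
		},
		map[string][]byte{"user agent": []byte("example")},
	)
	return err
}
```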

View File

@ -6,10 +6,31 @@ import (
"time" "time"
"github.com/zitadel/zitadel/internal/domain" "github.com/zitadel/zitadel/internal/domain"
caos_errs "github.com/zitadel/zitadel/internal/errors"
"github.com/zitadel/zitadel/internal/eventstore" "github.com/zitadel/zitadel/internal/eventstore"
"github.com/zitadel/zitadel/internal/repository/session" "github.com/zitadel/zitadel/internal/repository/session"
usr_repo "github.com/zitadel/zitadel/internal/repository/user"
) )
type PasskeyChallengeModel struct {
Challenge string
AllowedCrentialIDs [][]byte
UserVerification domain.UserVerificationRequirement
}
func (p *PasskeyChallengeModel) WebAuthNLogin(human *domain.Human, credentialAssertionData []byte) (*domain.WebAuthNLogin, error) {
if p == nil {
return nil, caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Ioqu5", "Errors.Session.Passkey.NoChallenge")
}
return &domain.WebAuthNLogin{
ObjectRoot: human.ObjectRoot,
CredentialAssertionData: credentialAssertionData,
Challenge: p.Challenge,
AllowedCredentialIDs: p.AllowedCrentialIDs,
UserVerification: p.UserVerification,
}, nil
}
type SessionWriteModel struct { type SessionWriteModel struct {
eventstore.WriteModel eventstore.WriteModel
@ -17,9 +38,12 @@ type SessionWriteModel struct {
UserID string UserID string
UserCheckedAt time.Time UserCheckedAt time.Time
PasswordCheckedAt time.Time PasswordCheckedAt time.Time
PasskeyCheckedAt time.Time
Metadata map[string][]byte Metadata map[string][]byte
State domain.SessionState State domain.SessionState
PasskeyChallenge *PasskeyChallengeModel
commands []eventstore.Command commands []eventstore.Command
aggregate *eventstore.Aggregate aggregate *eventstore.Aggregate
} }
@@ -44,6 +68,10 @@ func (wm *SessionWriteModel) Reduce() error {
wm.reduceUserChecked(e)
case *session.PasswordCheckedEvent:
wm.reducePasswordChecked(e)
case *session.PasskeyChallengedEvent:
wm.reducePasskeyChallenged(e)
case *session.PasskeyCheckedEvent:
wm.reducePasskeyChecked(e)
case *session.TokenSetEvent:
wm.reduceTokenSet(e)
case *session.TerminateEvent:
@@ -62,6 +90,8 @@ func (wm *SessionWriteModel) Query() *eventstore.SearchQueryBuilder {
session.AddedType,
session.UserCheckedType,
session.PasswordCheckedType,
session.PasskeyChallengedType,
session.PasskeyCheckedType,
session.TokenSetType,
session.MetadataSetType,
session.TerminateType,
@@ -87,6 +117,19 @@ func (wm *SessionWriteModel) reducePasswordChecked(e *session.PasswordCheckedEve
wm.PasswordCheckedAt = e.CheckedAt
}
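// reducePasskeyChallenged stores the issued WebAuthN challenge, the allowed credential IDs and the required user verification on the write model, so a later passkey check can be validated against it.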
func (wm *SessionWriteModel) reducePasskeyChallenged(e *session.PasskeyChallengedEvent) {
wm.PasskeyChallenge = &PasskeyChallengeModel{
Challenge: e.Challenge,
AllowedCrentialIDs: e.AllowedCrentialIDs,
UserVerification: e.UserVerification,
}
}
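// reducePasskeyChecked clears the pending challenge and records the time of the successful passkey check.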
func (wm *SessionWriteModel) reducePasskeyChecked(e *session.PasskeyCheckedEvent) {
wm.PasskeyChallenge = nil
wm.PasskeyCheckedAt = e.CheckedAt
}
func (wm *SessionWriteModel) reduceTokenSet(e *session.TokenSetEvent) {
wm.TokenID = e.TokenID
}
@@ -110,6 +153,17 @@ func (wm *SessionWriteModel) PasswordChecked(ctx context.Context, checkedAt time
wm.commands = append(wm.commands, session.NewPasswordCheckedEvent(ctx, wm.aggregate, checkedAt))
}
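// PasskeyChallenged appends the event persisting the WebAuthN challenge on the session aggregate.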
func (wm *SessionWriteModel) PasskeyChallenged(ctx context.Context, challenge string, allowedCrentialIDs [][]byte, userVerification domain.UserVerificationRequirement) {
wm.commands = append(wm.commands, session.NewPasskeyChallengedEvent(ctx, wm.aggregate, challenge, allowedCrentialIDs, userVerification))
}
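// PasskeyChecked appends the passkey checked event together with a passwordless sign count changed event for the used token.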
func (wm *SessionWriteModel) PasskeyChecked(ctx context.Context, checkedAt time.Time, tokenID string, signCount uint32) {
wm.commands = append(wm.commands,
session.NewPasskeyCheckedEvent(ctx, wm.aggregate, checkedAt),
usr_repo.NewHumanPasswordlessSignCountChangedEvent(ctx, wm.aggregate, tokenID, signCount),
)
}
func (wm *SessionWriteModel) SetToken(ctx context.Context, tokenID string) {
wm.commands = append(wm.commands, session.NewTokenSetEvent(ctx, wm.aggregate, tokenID))
}

View File

@@ -0,0 +1,84 @@
package command
import (
"context"
"encoding/json"
"github.com/zitadel/zitadel/internal/domain"
caos_errs "github.com/zitadel/zitadel/internal/errors"
)
type humanPasskeys struct {
human *domain.Human
tokens []*domain.WebAuthNToken
}
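// getHumanPasskeys loads the session's user and its passwordless (passkey) tokens from the eventstore.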
func (s *SessionCommands) getHumanPasskeys(ctx context.Context) (*humanPasskeys, error) {
humanWritemodel, err := s.gethumanWriteModel(ctx)
if err != nil {
return nil, err
}
tokenReadModel, err := s.getHumanPasswordlessTokenReadModel(ctx)
if err != nil {
return nil, err
}
return &humanPasskeys{
human: writeModelToHuman(humanWritemodel),
tokens: readModelToPasswordlessTokens(tokenReadModel),
}, nil
}
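// getHumanPasswordlessTokenReadModel queries the eventstore for the passwordless token read model of the session's user.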
func (s *SessionCommands) getHumanPasswordlessTokenReadModel(ctx context.Context) (*HumanPasswordlessTokensReadModel, error) {
tokenReadModel := NewHumanPasswordlessTokensReadModel(s.sessionWriteModel.UserID, s.sessionWriteModel.ResourceOwner)
err := s.eventstore.FilterToQueryReducer(ctx, tokenReadModel)
if err != nil {
return nil, err
}
return tokenReadModel, nil
}
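// CreatePasskeyChallenge returns a SessionCommand that begins a WebAuthN login for the session's user, writes the credential assertion options into dst and stores the challenge on the session write model.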
func (c *Commands) CreatePasskeyChallenge(userVerification domain.UserVerificationRequirement, dst json.Unmarshaler) SessionCommand {
return func(ctx context.Context, cmd *SessionCommands) error {
humanPasskeys, err := cmd.getHumanPasskeys(ctx)
if err != nil {
return err
}
webAuthNLogin, err := c.webauthnConfig.BeginLogin(ctx, humanPasskeys.human, userVerification, humanPasskeys.tokens...)
if err != nil {
return err
}
if err = json.Unmarshal(webAuthNLogin.CredentialAssertionData, dst); err != nil {
return caos_errs.ThrowInternal(err, "COMMAND-Yah6A", "Errors.Internal")
}
cmd.sessionWriteModel.PasskeyChallenged(ctx, webAuthNLogin.Challenge, webAuthNLogin.AllowedCredentialIDs, webAuthNLogin.UserVerification)
return nil
}
}
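// CheckPasskey returns a SessionCommand that validates the client's credential assertion against the stored challenge, finishes the WebAuthN login and marks the passkey as checked with the token's new sign count.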
func (c *Commands) CheckPasskey(credentialAssertionData json.Marshaler) SessionCommand {
return func(ctx context.Context, cmd *SessionCommands) error {
credentialAssertionData, err := json.Marshal(credentialAssertionData)
if err != nil {
return caos_errs.ThrowInvalidArgument(err, "COMMAND-ohG2o", "todo")
}
humanPasskeys, err := cmd.getHumanPasskeys(ctx)
if err != nil {
return err
}
webAuthN, err := cmd.sessionWriteModel.PasskeyChallenge.WebAuthNLogin(humanPasskeys.human, credentialAssertionData)
if err != nil {
return err
}
keyID, signCount, err := c.webauthnConfig.FinishLogin(ctx, humanPasskeys.human, webAuthN, credentialAssertionData, humanPasskeys.tokens...)
if err != nil && keyID == nil {
return err
}
_, token := domain.GetTokenByKeyID(humanPasskeys.tokens, keyID)
if token == nil {
return caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Aej7i", "Errors.User.WebAuthN.NotFound")
}
cmd.sessionWriteModel.PasskeyChecked(ctx, cmd.now(), token.WebAuthNTokenID, signCount)
return nil
}
}
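// Illustrative sketch (assumption, not taken verbatim from this change): assuming an
// UpdateSession entry point that executes SessionCommands, a passkey verification on an
// existing session could be invoked roughly like this:
//
// _, err := c.UpdateSession(ctx, sessionID, sessionToken,
// []SessionCommand{c.CheckPasskey(credentialAssertionData)},
// nil,
// )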

View File

@@ -0,0 +1,130 @@
package command
import (
"context"
"io"
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"golang.org/x/text/language"
"github.com/zitadel/zitadel/internal/domain"
caos_errs "github.com/zitadel/zitadel/internal/errors"
"github.com/zitadel/zitadel/internal/eventstore"
"github.com/zitadel/zitadel/internal/eventstore/v1/models"
"github.com/zitadel/zitadel/internal/repository/org"
"github.com/zitadel/zitadel/internal/repository/user"
)
func TestSessionCommands_getHumanPasskeys(t *testing.T) {
userAggr := &user.NewAggregate("user1", "org1").Aggregate
type fields struct {
eventstore *eventstore.Eventstore
sessionWriteModel *SessionWriteModel
}
type res struct {
want *humanPasskeys
err error
}
tests := []struct {
name string
fields fields
res res
}{
{
name: "missing UID",
fields: fields{
eventstore: &eventstore.Eventstore{},
sessionWriteModel: &SessionWriteModel{},
},
res: res{
want: nil,
err: caos_errs.ThrowPreconditionFailed(nil, "COMMAND-eeR2e", "Errors.User.UserIDMissing"),
},
},
{
name: "passwordless filter error",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
user.NewHumanAddedEvent(context.Background(),
userAggr,
"", "", "", "", "", language.Georgian,
domain.GenderDiverse, "", true,
),
),
),
expectFilterError(io.ErrClosedPipe),
),
sessionWriteModel: &SessionWriteModel{
UserID: "user1",
},
},
res: res{
want: nil,
err: io.ErrClosedPipe,
},
},
{
name: "ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
user.NewHumanAddedEvent(context.Background(),
userAggr,
"", "", "", "", "", language.Georgian,
domain.GenderDiverse, "", true,
),
),
),
expectFilter(eventFromEventPusher(
user.NewHumanWebAuthNAddedEvent(eventstore.NewBaseEventForPush(
context.Background(), &org.NewAggregate("org1").Aggregate, user.HumanPasswordlessTokenAddedType,
), "111", "challenge"),
)),
),
sessionWriteModel: &SessionWriteModel{
UserID: "user1",
},
},
res: res{
want: &humanPasskeys{
human: &domain.Human{
ObjectRoot: models.ObjectRoot{
AggregateID: "user1",
ResourceOwner: "org1",
},
State: domain.UserStateActive,
Profile: &domain.Profile{
PreferredLanguage: language.Georgian,
Gender: domain.GenderDiverse,
},
Email: &domain.Email{},
},
tokens: []*domain.WebAuthNToken{{
ObjectRoot: models.ObjectRoot{
AggregateID: "org1",
},
WebAuthNTokenID: "111",
State: domain.MFAStateNotReady,
Challenge: "challenge",
}},
},
err: nil,
},
},
}
for _, tt := range tests {
s := &SessionCommands{
eventstore: tt.fields.eventstore,
sessionWriteModel: tt.fields.sessionWriteModel,
}
got, err := s.getHumanPasskeys(context.Background())
require.ErrorIs(t, err, tt.res.err)
assert.Equal(t, tt.res.want, got)
}
}

View File

@@ -2,6 +2,7 @@ package command
import (
"context"
"io"
"testing"
"time"
@@ -21,6 +22,121 @@ import (
"github.com/zitadel/zitadel/internal/repository/user"
)
func TestSessionCommands_getHumanWriteModel(t *testing.T) {
userAggr := &user.NewAggregate("user1", "org1").Aggregate
type fields struct {
eventstore *eventstore.Eventstore
sessionWriteModel *SessionWriteModel
}
type res struct {
want *HumanWriteModel
err error
}
tests := []struct {
name string
fields fields
res res
}{
{
name: "missing UID",
fields: fields{
eventstore: &eventstore.Eventstore{},
sessionWriteModel: &SessionWriteModel{},
},
res: res{
want: nil,
err: caos_errs.ThrowPreconditionFailed(nil, "COMMAND-eeR2e", "Errors.User.UserIDMissing"),
},
},
{
name: "filter error",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilterError(io.ErrClosedPipe),
),
sessionWriteModel: &SessionWriteModel{
UserID: "user1",
},
},
res: res{
want: nil,
err: io.ErrClosedPipe,
},
},
{
name: "removed user",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
user.NewHumanAddedEvent(context.Background(),
userAggr,
"", "", "", "", "", language.Georgian,
domain.GenderDiverse, "", true,
),
),
eventFromEventPusher(
user.NewUserRemovedEvent(context.Background(),
userAggr,
"", nil, true,
),
),
),
),
sessionWriteModel: &SessionWriteModel{
UserID: "user1",
},
},
res: res{
want: nil,
err: caos_errs.ThrowPreconditionFailed(nil, "COMMAND-Df4b3", "Errors.ie4Ai.NotFound"),
},
},
{
name: "ok",
fields: fields{
eventstore: eventstoreExpect(t,
expectFilter(
eventFromEventPusher(
user.NewHumanAddedEvent(context.Background(),
userAggr,
"", "", "", "", "", language.Georgian,
domain.GenderDiverse, "", true,
),
),
),
),
sessionWriteModel: &SessionWriteModel{
UserID: "user1",
},
},
res: res{
want: &HumanWriteModel{
WriteModel: eventstore.WriteModel{
AggregateID: "user1",
ResourceOwner: "org1",
Events: []eventstore.Event{},
},
PreferredLanguage: language.Georgian,
Gender: domain.GenderDiverse,
UserState: domain.UserStateActive,
},
err: nil,
},
},
}
for _, tt := range tests {
s := &SessionCommands{
eventstore: tt.fields.eventstore,
sessionWriteModel: tt.fields.sessionWriteModel,
}
got, err := s.gethumanWriteModel(context.Background())
require.ErrorIs(t, err, tt.res.err)
assert.Equal(t, tt.res.want, got)
}
}
func TestCommands_CreateSession(t *testing.T) {
type fields struct {
eventstore *eventstore.Eventstore
@@ -29,7 +145,7 @@ func TestCommands_CreateSession(t *testing.T) {
}
type args struct {
ctx context.Context
checks []SessionCommand
metadata map[string][]byte
}
type res struct {
@@ -126,7 +242,7 @@ func TestCommands_UpdateSession(t *testing.T) {
ctx context.Context
sessionID string
sessionToken string
checks []SessionCommand
metadata map[string][]byte
}
type res struct {
@@ -231,7 +347,7 @@ func TestCommands_updateSession(t *testing.T) {
}
type args struct {
ctx context.Context
checks *SessionCommands
metadata map[string][]byte
}
type res struct {
@@ -251,7 +367,7 @@
},
args{
ctx: context.Background(),
checks: &SessionCommands{
sessionWriteModel: &SessionWriteModel{State: domain.SessionStateTerminated},
},
},
@@ -266,10 +382,10 @@
},
args{
ctx: context.Background(),
checks: &SessionCommands{
sessionWriteModel: NewSessionWriteModel("sessionID", "org1"),
cmds: []SessionCommand{
func(ctx context.Context, cmd *SessionCommands) error {
return caos_errs.ThrowInternal(nil, "id", "check failed")
},
},
@@ -286,9 +402,9 @@
},
args{
ctx: context.Background(),
checks: &SessionCommands{
sessionWriteModel: NewSessionWriteModel("sessionID", "org1"),
cmds: []SessionCommand{},
},
},
res{
@@ -321,9 +437,9 @@
},
args{
ctx: context.Background(),
checks: &SessionCommands{
sessionWriteModel: NewSessionWriteModel("sessionID", "org1"),
cmds: []SessionCommand{
CheckUser("userID"),
CheckPassword("password"),
},

View File

@@ -14,7 +14,6 @@ import (
)
type GenerateMachineSecret struct {
ClientID string
ClientSecret string
}
@@ -53,7 +52,6 @@ func prepareGenerateMachineSecret(a *user.Aggregate, generator crypto.Generator,
if !isUserStateExists(writeModel.UserState) {
return nil, caos_errs.ThrowPreconditionFailed(nil, "COMMAND-x8910n", "Errors.User.NotExisting")
}
set.ClientID = writeModel.UserName
clientSecret, secretString, err := domain.NewMachineClientSecret(generator)
if err != nil {

View File

@@ -137,7 +137,6 @@ func TestCommandSide_GenerateMachineSecret(t *testing.T) {
ResourceOwner: "org1",
},
secret: &GenerateMachineSecret{
ClientID: "user1",
ClientSecret: "a",
},
},
@@ -157,7 +156,6 @@
}
if tt.res.err == nil {
assert.Equal(t, tt.res.want, got)
assert.Equal(t, tt.args.set.ClientID, tt.res.secret.ClientID)
assert.Equal(t, tt.args.set.ClientSecret, tt.res.secret.ClientSecret)
}
})

View File

@@ -9,6 +9,7 @@ const (
IDPStateActive
IDPStateInactive
IDPStateRemoved
IDPStateMigrated
idpStateCount
)
@@ -18,7 +19,7 @@ func (s IDPState) Valid() bool {
}
func (s IDPState) Exists() bool {
return s != IDPStateUnspecified && s != IDPStateRemoved && s != IDPStateMigrated
}
type IDPType int32

View File

@@ -140,7 +140,7 @@ func (db *CRDB) Push(ctx context.Context, events []*repository.Event, uniqueCons
"aggregateType", event.AggregateType,
"eventType", event.Type,
"instanceID", event.InstanceID,
).WithError(err).Debug("query failed")
return caos_errs.ThrowInternal(err, "SQL-SBP37", "unable to create event")
}
}

View File

@@ -35,7 +35,7 @@ func AssertDetails[D DetailsMsg](t testing.TB, exptected, actual D) {
gotCD := gotDetails.GetChangeDate().AsTime()
now := time.Now()
assert.WithinRange(t, gotCD, now.Add(-time.Minute), now.Add(time.Minute))
assert.Equal(t, wantDetails.GetResourceOwner(), gotDetails.GetResourceOwner())
}

View File

@@ -0,0 +1,77 @@
package integration
import (
"context"
"fmt"
"time"
"github.com/zitadel/logging"
"google.golang.org/grpc"
"github.com/zitadel/zitadel/pkg/grpc/admin"
object "github.com/zitadel/zitadel/pkg/grpc/object/v2alpha"
session "github.com/zitadel/zitadel/pkg/grpc/session/v2alpha"
user "github.com/zitadel/zitadel/pkg/grpc/user/v2alpha"
)
type Client struct {
CC *grpc.ClientConn
Admin admin.AdminServiceClient
UserV2 user.UserServiceClient
SessionV2 session.SessionServiceClient
}
func newClient(cc *grpc.ClientConn) Client {
return Client{
CC: cc,
Admin: admin.NewAdminServiceClient(cc),
UserV2: user.NewUserServiceClient(cc),
SessionV2: session.NewSessionServiceClient(cc),
}
}
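// CreateHumanUser adds a human user with a time-based unique email to the tester's organisation and returns the response including the email verification return code.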
func (s *Tester) CreateHumanUser(ctx context.Context) *user.AddHumanUserResponse {
resp, err := s.Client.UserV2.AddHumanUser(ctx, &user.AddHumanUserRequest{
Organisation: &object.Organisation{
Org: &object.Organisation_OrgId{
OrgId: s.Organisation.ID,
},
},
Profile: &user.SetHumanProfile{
FirstName: "Mickey",
LastName: "Mouse",
},
Email: &user.SetHumanEmail{
Email: fmt.Sprintf("%d@mouse.com", time.Now().UnixNano()),
Verification: &user.SetHumanEmail_ReturnCode{
ReturnCode: &user.ReturnEmailVerificationCode{},
},
},
})
logging.OnError(err).Fatal("create human user")
return resp
}
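// RegisterUserPasskey runs the complete passkey registration flow for a user: it requests a registration link with return code, starts the registration, builds a WebAuthN attestation response and verifies it.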
func (s *Tester) RegisterUserPasskey(ctx context.Context, userID string) {
reg, err := s.Client.UserV2.CreatePasskeyRegistrationLink(ctx, &user.CreatePasskeyRegistrationLinkRequest{
UserId: userID,
Medium: &user.CreatePasskeyRegistrationLinkRequest_ReturnCode{},
})
logging.OnError(err).Fatal("create user passkey")
pkr, err := s.Client.UserV2.RegisterPasskey(ctx, &user.RegisterPasskeyRequest{
UserId: userID,
Code: reg.GetCode(),
})
logging.OnError(err).Fatal("create user passkey")
attestationResponse, err := s.WebAuthN.CreateAttestationResponse(pkr.GetPublicKeyCredentialCreationOptions())
logging.OnError(err).Fatal("create user passkey")
_, err = s.Client.UserV2.VerifyPasskeyRegistration(ctx, &user.VerifyPasskeyRegistrationRequest{
UserId: userID,
PasskeyId: pkr.GetPasskeyId(),
PublicKeyCredential: attestationResponse,
PasskeyName: "nice name",
})
logging.OnError(err).Fatal("create user passkey")
}

View File

@@ -29,6 +29,7 @@ import (
caos_errs "github.com/zitadel/zitadel/internal/errors"
"github.com/zitadel/zitadel/internal/eventstore/v1/models"
"github.com/zitadel/zitadel/internal/query"
"github.com/zitadel/zitadel/internal/webauthn"
"github.com/zitadel/zitadel/pkg/grpc/admin"
)
@@ -68,8 +69,9 @@ type Tester struct {
Organisation *query.Org
Users map[UserType]User
Client Client
WebAuthN *webauthn.Client
wg sync.WaitGroup // used for shutdown
}
const commandLine = `start --masterkeyFromEnv`
@@ -90,7 +92,7 @@ func (s *Tester) createClientConn(ctx context.Context) {
logging.OnError(err).Fatal("integration tester client dial")
logging.New().WithField("target", target).Info("finished dialing grpc client conn")
s.Client = newClient(cc)
err = s.pollHealth(ctx)
logging.OnError(err).Fatal("integration tester health")
}
@@ -99,14 +101,12 @@ func (s *Tester) createClientConn(ctx context.Context) {
// TODO: remove when we make the setup blocking on all
// projections completed.
func (s *Tester) pollHealth(ctx context.Context) (err error) {
client := admin.NewAdminServiceClient(s.GRPCClientConn)
for {
err = func(ctx context.Context) error {
ctx, cancel := context.WithTimeout(ctx, 5*time.Second)
defer cancel()
_, err := s.Client.Admin.Healthz(ctx, &admin.HealthzRequest{})
return err
}(ctx)
if err == nil {
@@ -182,7 +182,7 @@ func (s *Tester) WithSystemAuthorization(ctx context.Context, u UserType) contex
// Done send an interrupt signal to cleanly shutdown the server.
func (s *Tester) Done() {
err := s.Client.CC.Close()
logging.OnError(err).Error("integration tester client close")
s.Shutdown <- os.Interrupt
@@ -238,6 +238,7 @@ func NewTester(ctx context.Context) *Tester {
}
tester.createClientConn(ctx)
tester.createSystemUser(ctx)
tester.WebAuthN = webauthn.NewClient(tester.Config.WebAuthNName, tester.Config.ExternalDomain, "https://"+tester.Host())
return tester
}

View File

@@ -347,6 +347,10 @@ func (p *idpTemplateProjection) reducers() []handler.AggregateReducer {
Event: instance.OIDCIDPChangedEventType,
Reduce: p.reduceOIDCIDPChanged,
},
{
Event: instance.OIDCIDPMigratedAzureADEventType,
Reduce: p.reduceOIDCIDPMigratedAzureAD,
},
{
Event: instance.JWTIDPAddedEventType,
Reduce: p.reduceJWTIDPAdded,
@@ -755,6 +759,106 @@ func (p *idpTemplateProjection) reduceOIDCIDPChanged(event eventstore.Event) (*h
), nil
}
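// reduceOIDCIDPMigratedAzureAD updates the idp template row to the Azure AD type, removes the OIDC subtable entry and creates the Azure AD specific entry.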
func (p *idpTemplateProjection) reduceOIDCIDPMigratedAzureAD(event eventstore.Event) (*handler.Statement, error) {
var idpEvent idp.OIDCIDPMigratedAzureADEvent
switch e := event.(type) {
case *org.OIDCIDPMigratedAzureADEvent:
idpEvent = e.OIDCIDPMigratedAzureADEvent
case *instance.OIDCIDPMigratedAzureADEvent:
idpEvent = e.OIDCIDPMigratedAzureADEvent
default:
return nil, errors.ThrowInvalidArgumentf(nil, "HANDL-p1582ks", "reduce.wrong.event.type %v", []eventstore.EventType{org.OIDCIDPMigratedAzureADEventType, instance.OIDCIDPMigratedAzureADEventType})
}
return crdb.NewMultiStatement(
&idpEvent,
crdb.AddUpdateStatement(
[]handler.Column{
handler.NewCol(IDPTemplateChangeDateCol, idpEvent.CreationDate()),
handler.NewCol(IDPTemplateSequenceCol, idpEvent.Sequence()),
handler.NewCol(IDPTemplateNameCol, idpEvent.Name),
handler.NewCol(IDPTemplateTypeCol, domain.IDPTypeAzureAD),
handler.NewCol(IDPTemplateIsCreationAllowedCol, idpEvent.IsCreationAllowed),
handler.NewCol(IDPTemplateIsLinkingAllowedCol, idpEvent.IsLinkingAllowed),
handler.NewCol(IDPTemplateIsAutoCreationCol, idpEvent.IsAutoCreation),
handler.NewCol(IDPTemplateIsAutoUpdateCol, idpEvent.IsAutoUpdate),
},
[]handler.Condition{
handler.NewCond(IDPTemplateIDCol, idpEvent.ID),
handler.NewCond(IDPTemplateInstanceIDCol, idpEvent.Aggregate().InstanceID),
},
),
crdb.AddDeleteStatement(
[]handler.Condition{
handler.NewCond(OIDCIDCol, idpEvent.ID),
handler.NewCond(OIDCInstanceIDCol, idpEvent.Aggregate().InstanceID),
},
crdb.WithTableSuffix(IDPTemplateOIDCSuffix),
),
crdb.AddCreateStatement(
[]handler.Column{
handler.NewCol(AzureADIDCol, idpEvent.ID),
handler.NewCol(AzureADInstanceIDCol, idpEvent.Aggregate().InstanceID),
handler.NewCol(AzureADClientIDCol, idpEvent.ClientID),
handler.NewCol(AzureADClientSecretCol, idpEvent.ClientSecret),
handler.NewCol(AzureADScopesCol, database.StringArray(idpEvent.Scopes)),
handler.NewCol(AzureADTenantCol, idpEvent.Tenant),
handler.NewCol(AzureADIsEmailVerified, idpEvent.IsEmailVerified),
},
crdb.WithTableSuffix(IDPTemplateAzureADSuffix),
),
), nil
}
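// reduceOIDCIDPMigratedGoogle updates the idp template row to the Google type, removes the OIDC subtable entry and creates the Google specific entry.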
func (p *idpTemplateProjection) reduceOIDCIDPMigratedGoogle(event eventstore.Event) (*handler.Statement, error) {
var idpEvent idp.OIDCIDPMigratedGoogleEvent
switch e := event.(type) {
case *org.OIDCIDPMigratedGoogleEvent:
idpEvent = e.OIDCIDPMigratedGoogleEvent
case *instance.OIDCIDPMigratedGoogleEvent:
idpEvent = e.OIDCIDPMigratedGoogleEvent
default:
return nil, errors.ThrowInvalidArgumentf(nil, "HANDL-p1582ks", "reduce.wrong.event.type %v", []eventstore.EventType{org.OIDCIDPMigratedGoogleEventType, instance.OIDCIDPMigratedGoogleEventType})
}
return crdb.NewMultiStatement(
&idpEvent,
crdb.AddUpdateStatement(
[]handler.Column{
handler.NewCol(IDPTemplateChangeDateCol, idpEvent.CreationDate()),
handler.NewCol(IDPTemplateSequenceCol, idpEvent.Sequence()),
handler.NewCol(IDPTemplateNameCol, idpEvent.Name),
handler.NewCol(IDPTemplateTypeCol, domain.IDPTypeGoogle),
handler.NewCol(IDPTemplateIsCreationAllowedCol, idpEvent.IsCreationAllowed),
handler.NewCol(IDPTemplateIsLinkingAllowedCol, idpEvent.IsLinkingAllowed),
handler.NewCol(IDPTemplateIsAutoCreationCol, idpEvent.IsAutoCreation),
handler.NewCol(IDPTemplateIsAutoUpdateCol, idpEvent.IsAutoUpdate),
},
[]handler.Condition{
handler.NewCond(IDPTemplateIDCol, idpEvent.ID),
handler.NewCond(IDPTemplateInstanceIDCol, idpEvent.Aggregate().InstanceID),
},
),
crdb.AddDeleteStatement(
[]handler.Condition{
handler.NewCond(OIDCIDCol, idpEvent.ID),
handler.NewCond(OIDCInstanceIDCol, idpEvent.Aggregate().InstanceID),
},
crdb.WithTableSuffix(IDPTemplateOIDCSuffix),
),
crdb.AddCreateStatement(
[]handler.Column{
handler.NewCol(GoogleIDCol, idpEvent.ID),
handler.NewCol(GoogleInstanceIDCol, idpEvent.Aggregate().InstanceID),
handler.NewCol(GoogleClientIDCol, idpEvent.ClientID),
handler.NewCol(GoogleClientSecretCol, idpEvent.ClientSecret),
handler.NewCol(GoogleScopesCol, database.StringArray(idpEvent.Scopes)),
},
crdb.WithTableSuffix(IDPTemplateGoogleSuffix),
),
), nil
}
func (p *idpTemplateProjection) reduceJWTIDPAdded(event eventstore.Event) (*handler.Statement, error) {
var idpEvent idp.JWTIDPAddedEvent
var idpOwnerType domain.IdentityProviderType

View File

@@ -2686,6 +2686,278 @@ func TestIDPTemplateProjection_reducesOIDC(t *testing.T) {
},
},
},
{
name: "instance reduceOIDCIDPMigratedAzureAD",
args: args{
event: getEvent(testEvent(
repository.EventType(instance.OIDCIDPMigratedAzureADEventType),
instance.AggregateType,
[]byte(`{
"id": "idp-id",
"name": "name",
"client_id": "client_id",
"client_secret": {
"cryptoType": 0,
"algorithm": "RSA-265",
"keyId": "key-id"
},
"tenant": "tenant",
"isEmailVerified": true,
"scopes": ["profile"],
"isCreationAllowed": true,
"isLinkingAllowed": true,
"isAutoCreation": true,
"isAutoUpdate": true
}`),
), instance.OIDCIDPMigratedAzureADEventMapper),
},
reduce: (&idpTemplateProjection{}).reduceOIDCIDPMigratedAzureAD,
want: wantReduce{
aggregateType: eventstore.AggregateType("instance"),
sequence: 15,
previousSequence: 10,
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.idp_templates5 SET (change_date, sequence, name, type, is_creation_allowed, is_linking_allowed, is_auto_creation, is_auto_update) = ($1, $2, $3, $4, $5, $6, $7, $8) WHERE (id = $9) AND (instance_id = $10)",
expectedArgs: []interface{}{
anyArg{},
uint64(15),
"name",
domain.IDPTypeAzureAD,
true,
true,
true,
true,
"idp-id",
"instance-id",
},
},
{
expectedStmt: "DELETE FROM projections.idp_templates5_oidc WHERE (idp_id = $1) AND (instance_id = $2)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
},
},
{
expectedStmt: "INSERT INTO projections.idp_templates5_azure (idp_id, instance_id, client_id, client_secret, scopes, tenant, is_email_verified) VALUES ($1, $2, $3, $4, $5, $6, $7)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
"client_id",
anyArg{},
database.StringArray{"profile"},
"tenant",
true,
},
},
},
},
},
},
{
name: "org reduceOIDCIDPMigratedAzureAD",
args: args{
event: getEvent(testEvent(
repository.EventType(org.OIDCIDPMigratedAzureADEventType),
org.AggregateType,
[]byte(`{
"id": "idp-id",
"name": "name",
"client_id": "client_id",
"client_secret": {
"cryptoType": 0,
"algorithm": "RSA-265",
"keyId": "key-id"
},
"tenant": "tenant",
"isEmailVerified": true,
"scopes": ["profile"],
"isCreationAllowed": true,
"isLinkingAllowed": true,
"isAutoCreation": true,
"isAutoUpdate": true
}`),
), org.OIDCIDPMigratedAzureADEventMapper),
},
reduce: (&idpTemplateProjection{}).reduceOIDCIDPMigratedAzureAD,
want: wantReduce{
aggregateType: eventstore.AggregateType("org"),
sequence: 15,
previousSequence: 10,
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.idp_templates5 SET (change_date, sequence, name, type, is_creation_allowed, is_linking_allowed, is_auto_creation, is_auto_update) = ($1, $2, $3, $4, $5, $6, $7, $8) WHERE (id = $9) AND (instance_id = $10)",
expectedArgs: []interface{}{
anyArg{},
uint64(15),
"name",
domain.IDPTypeAzureAD,
true,
true,
true,
true,
"idp-id",
"instance-id",
},
},
{
expectedStmt: "DELETE FROM projections.idp_templates5_oidc WHERE (idp_id = $1) AND (instance_id = $2)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
},
},
{
expectedStmt: "INSERT INTO projections.idp_templates5_azure (idp_id, instance_id, client_id, client_secret, scopes, tenant, is_email_verified) VALUES ($1, $2, $3, $4, $5, $6, $7)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
"client_id",
anyArg{},
database.StringArray{"profile"},
"tenant",
true,
},
},
},
},
},
},
{
name: "instance reduceOIDCIDPMigratedGoogle",
args: args{
event: getEvent(testEvent(
repository.EventType(instance.OIDCIDPMigratedGoogleEventType),
instance.AggregateType,
[]byte(`{
"id": "idp-id",
"name": "name",
"clientId": "client_id",
"clientSecret": {
"cryptoType": 0,
"algorithm": "RSA-265",
"keyId": "key-id"
},
"scopes": ["profile"],
"isCreationAllowed": true,
"isLinkingAllowed": true,
"isAutoCreation": true,
"isAutoUpdate": true
}`),
), instance.OIDCIDPMigratedGoogleEventMapper),
},
reduce: (&idpTemplateProjection{}).reduceOIDCIDPMigratedGoogle,
want: wantReduce{
aggregateType: eventstore.AggregateType("instance"),
sequence: 15,
previousSequence: 10,
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.idp_templates5 SET (change_date, sequence, name, type, is_creation_allowed, is_linking_allowed, is_auto_creation, is_auto_update) = ($1, $2, $3, $4, $5, $6, $7, $8) WHERE (id = $9) AND (instance_id = $10)",
expectedArgs: []interface{}{
anyArg{},
uint64(15),
"name",
domain.IDPTypeGoogle,
true,
true,
true,
true,
"idp-id",
"instance-id",
},
},
{
expectedStmt: "DELETE FROM projections.idp_templates5_oidc WHERE (idp_id = $1) AND (instance_id = $2)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
},
},
{
expectedStmt: "INSERT INTO projections.idp_templates5_google (idp_id, instance_id, client_id, client_secret, scopes) VALUES ($1, $2, $3, $4, $5)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
"client_id",
anyArg{},
database.StringArray{"profile"},
},
},
},
},
},
},
{
name: "org reduceOIDCIDPMigratedGoogle",
args: args{
event: getEvent(testEvent(
repository.EventType(org.OIDCIDPMigratedGoogleEventType),
org.AggregateType,
[]byte(`{
"id": "idp-id",
"name": "name",
"clientId": "client_id",
"clientSecret": {
"cryptoType": 0,
"algorithm": "RSA-265",
"keyId": "key-id"
},
"scopes": ["profile"],
"isCreationAllowed": true,
"isLinkingAllowed": true,
"isAutoCreation": true,
"isAutoUpdate": true
}`),
), org.OIDCIDPMigratedGoogleEventMapper),
},
reduce: (&idpTemplateProjection{}).reduceOIDCIDPMigratedGoogle,
want: wantReduce{
aggregateType: eventstore.AggregateType("org"),
sequence: 15,
previousSequence: 10,
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.idp_templates5 SET (change_date, sequence, name, type, is_creation_allowed, is_linking_allowed, is_auto_creation, is_auto_update) = ($1, $2, $3, $4, $5, $6, $7, $8) WHERE (id = $9) AND (instance_id = $10)",
expectedArgs: []interface{}{
anyArg{},
uint64(15),
"name",
domain.IDPTypeGoogle,
true,
true,
true,
true,
"idp-id",
"instance-id",
},
},
{
expectedStmt: "DELETE FROM projections.idp_templates5_oidc WHERE (idp_id = $1) AND (instance_id = $2)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
},
},
{
expectedStmt: "INSERT INTO projections.idp_templates5_google (idp_id, instance_id, client_id, client_secret, scopes) VALUES ($1, $2, $3, $4, $5)",
expectedArgs: []interface{}{
"idp-id",
"instance-id",
"client_id",
anyArg{},
database.StringArray{"profile"},
},
},
},
},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {

View File

@@ -13,7 +13,7 @@ import (
)
const (
SessionsProjectionTable = "projections.sessions1"
SessionColumnID = "id"
SessionColumnCreationDate = "creation_date"
@@ -26,6 +26,7 @@ const (
SessionColumnUserID = "user_id"
SessionColumnUserCheckedAt = "user_checked_at"
SessionColumnPasswordCheckedAt = "password_checked_at"
SessionColumnPasskeyCheckedAt = "passkey_checked_at"
SessionColumnMetadata = "metadata"
SessionColumnTokenID = "token_id"
)
@@ -51,6 +52,7 @@ func newSessionProjection(ctx context.Context, config crdb.StatementHandlerConfi
crdb.NewColumn(SessionColumnUserID, crdb.ColumnTypeText, crdb.Nullable()),
crdb.NewColumn(SessionColumnUserCheckedAt, crdb.ColumnTypeTimestamp, crdb.Nullable()),
crdb.NewColumn(SessionColumnPasswordCheckedAt, crdb.ColumnTypeTimestamp, crdb.Nullable()),
crdb.NewColumn(SessionColumnPasskeyCheckedAt, crdb.ColumnTypeTimestamp, crdb.Nullable()),
crdb.NewColumn(SessionColumnMetadata, crdb.ColumnTypeJSONB, crdb.Nullable()),
crdb.NewColumn(SessionColumnTokenID, crdb.ColumnTypeText, crdb.Nullable()),
},
@@ -78,6 +80,10 @@ func (p *sessionProjection) reducers() []handler.AggregateReducer {
Event: session.PasswordCheckedType,
Reduce: p.reducePasswordChecked,
},
{
Event: session.PasskeyCheckedType,
Reduce: p.reducePasskeyChecked,
},
{
Event: session.TokenSetType,
Reduce: p.reduceTokenSet,
@@ -165,6 +171,26 @@ func (p *sessionProjection) reducePasswordChecked(event eventstore.Event) (*hand
), nil
}
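// reducePasskeyChecked sets the passkey_checked_at column of the session row to the time of the successful check.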
func (p *sessionProjection) reducePasskeyChecked(event eventstore.Event) (*handler.Statement, error) {
e, ok := event.(*session.PasskeyCheckedEvent)
if !ok {
return nil, errors.ThrowInvalidArgumentf(nil, "HANDL-WieM4", "reduce.wrong.event.type %s", session.PasskeyCheckedType)
}
return crdb.NewUpdateStatement(
e,
[]handler.Column{
handler.NewCol(SessionColumnChangeDate, e.CreationDate()),
handler.NewCol(SessionColumnSequence, e.Sequence()),
handler.NewCol(SessionColumnPasskeyCheckedAt, e.CheckedAt),
},
[]handler.Condition{
handler.NewCond(SessionColumnID, e.Aggregate().ID),
handler.NewCond(SessionColumnInstanceID, e.Aggregate().InstanceID),
},
), nil
}
func (p *sessionProjection) reduceTokenSet(event eventstore.Event) (*handler.Statement, error) {
e, ok := event.(*session.TokenSetEvent)
if !ok {

View File

@@ -40,7 +40,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "INSERT INTO projections.sessions1 (id, instance_id, creation_date, change_date, resource_owner, state, sequence, creator) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)",
expectedArgs: []interface{}{
"agg-id",
"instance-id",
@ -76,7 +76,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.sessions1 SET (change_date, sequence, user_id, user_checked_at) = ($1, $2, $3, $4) WHERE (id = $5) AND (instance_id = $6)",
expectedArgs: []interface{}{
anyArg{},
anyArg{},
@ -109,7 +109,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.sessions1 SET (change_date, sequence, password_checked_at) = ($1, $2, $3) WHERE (id = $4) AND (instance_id = $5)",
expectedArgs: []interface{}{
anyArg{},
anyArg{},
@ -141,7 +141,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.sessions1 SET (change_date, sequence, token_id) = ($1, $2, $3) WHERE (id = $4) AND (instance_id = $5)",
expectedArgs: []interface{}{
anyArg{},
anyArg{},
@ -175,7 +175,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "UPDATE projections.sessions1 SET (change_date, sequence, metadata) = ($1, $2, $3) WHERE (id = $4) AND (instance_id = $5)",
expectedArgs: []interface{}{
anyArg{},
anyArg{},
@ -207,7 +207,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "DELETE FROM projections.sessions1 WHERE (id = $1) AND (instance_id = $2)",
expectedArgs: []interface{}{
"agg-id",
"instance-id",
@ -234,7 +234,7 @@ func TestSessionProjection_reduces(t *testing.T) {
executer: &testExecuter{
executions: []execution{
{
expectedStmt: "DELETE FROM projections.sessions1 WHERE (instance_id = $1)",
expectedArgs: []interface{}{
"agg-id",
},

View File

@@ -32,6 +32,7 @@ type Session struct {
Creator string
UserFactor SessionUserFactor
PasswordFactor SessionPasswordFactor
PasskeyFactor SessionPasskeyFactor
Metadata map[string][]byte
}
@@ -46,6 +47,10 @@ type SessionPasswordFactor struct {
PasswordCheckedAt time.Time
}
type SessionPasskeyFactor struct {
PasskeyCheckedAt time.Time
}
type SessionsSearchQueries struct {
SearchRequest
Queries []SearchQuery
@ -108,6 +113,10 @@ var (
name: projection.SessionColumnPasswordCheckedAt,
table: sessionsTable,
}
SessionColumnPasskeyCheckedAt = Column{
name: projection.SessionColumnPasskeyCheckedAt,
table: sessionsTable,
}
SessionColumnMetadata = Column{
name: projection.SessionColumnMetadata,
table: sessionsTable,
@@ -198,6 +207,7 @@ func prepareSessionQuery(ctx context.Context, db prepareDatabase) (sq.SelectBuil
LoginNameNameCol.identifier(),
HumanDisplayNameCol.identifier(),
SessionColumnPasswordCheckedAt.identifier(),
SessionColumnPasskeyCheckedAt.identifier(),
SessionColumnMetadata.identifier(),
SessionColumnToken.identifier(),
).From(sessionsTable.identifier()).
@@ -212,6 +222,7 @@ func prepareSessionQuery(ctx context.Context, db prepareDatabase) (sq.SelectBuil
loginName sql.NullString
displayName sql.NullString
passwordCheckedAt sql.NullTime
passkeyCheckedAt sql.NullTime
metadata database.Map[[]byte]
token sql.NullString
)
@@ -229,6 +240,7 @@ func prepareSessionQuery(ctx context.Context, db prepareDatabase) (sq.SelectBuil
&loginName,
&displayName,
&passwordCheckedAt,
&passkeyCheckedAt,
&metadata,
&token,
)
@@ -245,6 +257,7 @@ func prepareSessionQuery(ctx context.Context, db prepareDatabase) (sq.SelectBuil
session.UserFactor.LoginName = loginName.String
session.UserFactor.DisplayName = displayName.String
session.PasswordFactor.PasswordCheckedAt = passwordCheckedAt.Time
session.PasskeyFactor.PasskeyCheckedAt = passkeyCheckedAt.Time
session.Metadata = metadata
return session, token.String, nil
@@ -265,6 +278,7 @@ func prepareSessionsQuery(ctx context.Context, db prepareDatabase) (sq.SelectBui
LoginNameNameCol.identifier(),
HumanDisplayNameCol.identifier(),
SessionColumnPasswordCheckedAt.identifier(),
SessionColumnPasskeyCheckedAt.identifier(),
SessionColumnMetadata.identifier(),
countColumn.identifier(),
).From(sessionsTable.identifier()).
@@ -282,6 +296,7 @@ func prepareSessionsQuery(ctx context.Context, db prepareDatabase) (sq.SelectBui
loginName sql.NullString
displayName sql.NullString
passwordCheckedAt sql.NullTime
passkeyCheckedAt sql.NullTime
metadata database.Map[[]byte]
)
@@ -298,6 +313,7 @@ func prepareSessionsQuery(ctx context.Context, db prepareDatabase) (sq.SelectBui
&loginName,
&displayName,
&passwordCheckedAt,
&passkeyCheckedAt,
&metadata,
&sessions.Count,
)
@@ -310,6 +326,7 @@ func prepareSessionsQuery(ctx context.Context, db prepareDatabase) (sq.SelectBui
session.UserFactor.LoginName = loginName.String
session.UserFactor.DisplayName = displayName.String
session.PasswordFactor.PasswordCheckedAt = passwordCheckedAt.Time
session.PasskeyFactor.PasskeyCheckedAt = passkeyCheckedAt.Time
session.Metadata = metadata
sessions.Sessions = append(sessions.Sessions, session)

View File

@@ -17,41 +17,43 @@ import (
)
var (
expectedSessionQuery = regexp.QuoteMeta(`SELECT projections.sessions1.id,` +
` projections.sessions1.creation_date,` +
` projections.sessions1.change_date,` +
` projections.sessions1.sequence,` +
` projections.sessions1.state,` +
` projections.sessions1.resource_owner,` +
` projections.sessions1.creator,` +
` projections.sessions1.user_id,` +
` projections.sessions1.user_checked_at,` +
` projections.login_names2.login_name,` +
` projections.users8_humans.display_name,` +
` projections.sessions1.password_checked_at,` +
` projections.sessions1.passkey_checked_at,` +
` projections.sessions1.metadata,` +
` projections.sessions1.token_id` +
` FROM projections.sessions1` +
` LEFT JOIN projections.login_names2 ON projections.sessions1.user_id = projections.login_names2.user_id AND projections.sessions1.instance_id = projections.login_names2.instance_id` +
` LEFT JOIN projections.users8_humans ON projections.sessions1.user_id = projections.users8_humans.user_id AND projections.sessions1.instance_id = projections.users8_humans.instance_id` +
` AS OF SYSTEM TIME '-1 ms'`)
expectedSessionsQuery = regexp.QuoteMeta(`SELECT projections.sessions1.id,` +
` projections.sessions1.creation_date,` +
` projections.sessions1.change_date,` +
` projections.sessions1.sequence,` +
` projections.sessions1.state,` +
` projections.sessions1.resource_owner,` +
` projections.sessions1.creator,` +
` projections.sessions1.user_id,` +
` projections.sessions1.user_checked_at,` +
` projections.login_names2.login_name,` +
` projections.users8_humans.display_name,` +
` projections.sessions1.password_checked_at,` +
` projections.sessions1.passkey_checked_at,` +
` projections.sessions1.metadata,` +
` COUNT(*) OVER ()` +
` FROM projections.sessions1` +
` LEFT JOIN projections.login_names2 ON projections.sessions1.user_id = projections.login_names2.user_id AND projections.sessions1.instance_id = projections.login_names2.instance_id` +
` LEFT JOIN projections.users8_humans ON projections.sessions1.user_id = projections.users8_humans.user_id AND projections.sessions1.instance_id = projections.users8_humans.instance_id` +
` AS OF SYSTEM TIME '-1 ms'`)
sessionCols = []string{
@@ -67,6 +69,7 @@ var (
"login_name",
"display_name",
"password_checked_at",
"passkey_checked_at",
"metadata",
"token",
}
@@ -84,6 +87,7 @@ var (
"login_name",
"display_name",
"password_checked_at",
"passkey_checked_at",
"metadata",
"count",
}
@@ -133,6 +137,7 @@ func Test_SessionsPrepare(t *testing.T) {
"login-name",
"display-name",
testNow,
testNow,
[]byte(`{"key": "dmFsdWU="}`),
},
},
@ -160,6 +165,9 @@ func Test_SessionsPrepare(t *testing.T) {
PasswordFactor: SessionPasswordFactor{
PasswordCheckedAt: testNow,
},
PasskeyFactor: SessionPasskeyFactor{
PasskeyCheckedAt: testNow,
},
Metadata: map[string][]byte{
"key": []byte("value"),
},
@ -188,6 +196,7 @@ func Test_SessionsPrepare(t *testing.T) {
"login-name", "login-name",
"display-name", "display-name",
testNow, testNow,
testNow,
[]byte(`{"key": "dmFsdWU="}`), []byte(`{"key": "dmFsdWU="}`),
}, },
{ {
@ -203,6 +212,7 @@ func Test_SessionsPrepare(t *testing.T) {
"login-name2", "login-name2",
"display-name2", "display-name2",
testNow, testNow,
testNow,
[]byte(`{"key": "dmFsdWU="}`), []byte(`{"key": "dmFsdWU="}`),
}, },
}, },
@ -230,6 +240,9 @@ func Test_SessionsPrepare(t *testing.T) {
PasswordFactor: SessionPasswordFactor{
PasswordCheckedAt: testNow,
},
PasskeyFactor: SessionPasskeyFactor{
PasskeyCheckedAt: testNow,
},
Metadata: map[string][]byte{
"key": []byte("value"),
},
@ -251,6 +264,9 @@ func Test_SessionsPrepare(t *testing.T) {
PasswordFactor: SessionPasswordFactor{
PasswordCheckedAt: testNow,
},
PasskeyFactor: SessionPasskeyFactor{
PasskeyCheckedAt: testNow,
},
Metadata: map[string][]byte{
"key": []byte("value"),
},
@@ -332,6 +348,7 @@ func Test_SessionPrepare(t *testing.T) {
"login-name",
"display-name",
testNow,
testNow,
[]byte(`{"key": "dmFsdWU="}`),
"tokenID",
},
@ -354,6 +371,9 @@ func Test_SessionPrepare(t *testing.T) {
PasswordFactor: SessionPasswordFactor{
PasswordCheckedAt: testNow,
},
PasskeyFactor: SessionPasskeyFactor{
PasskeyCheckedAt: testNow,
},
Metadata: map[string][]byte{
"key": []byte("value"),
},

View File

@@ -162,3 +162,93 @@ func OIDCIDPChangedEventMapper(event *repository.Event) (eventstore.Event, error
return e, nil
}
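// OIDCIDPMigratedAzureADEvent signals that an OIDC IDP was migrated to an Azure AD template; it reuses the payload of AzureADIDPAddedEvent.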
type OIDCIDPMigratedAzureADEvent struct {
AzureADIDPAddedEvent
}
func NewOIDCIDPMigratedAzureADEvent(
base *eventstore.BaseEvent,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
tenant string,
isEmailVerified bool,
options Options,
) *OIDCIDPMigratedAzureADEvent {
return &OIDCIDPMigratedAzureADEvent{
AzureADIDPAddedEvent: AzureADIDPAddedEvent{
BaseEvent: *base,
ID: id,
Name: name,
ClientID: clientID,
ClientSecret: clientSecret,
Scopes: scopes,
Tenant: tenant,
IsEmailVerified: isEmailVerified,
Options: options,
},
}
}
func (e *OIDCIDPMigratedAzureADEvent) Data() interface{} {
return e
}
func (e *OIDCIDPMigratedAzureADEvent) UniqueConstraints() []*eventstore.EventUniqueConstraint {
return nil
}
func OIDCIDPMigratedAzureADEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := AzureADIDPAddedEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedAzureADEvent{AzureADIDPAddedEvent: *e.(*AzureADIDPAddedEvent)}, nil
}
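// OIDCIDPMigratedGoogleEvent signals that an OIDC IDP was migrated to a Google template; it reuses the payload of GoogleIDPAddedEvent.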
type OIDCIDPMigratedGoogleEvent struct {
GoogleIDPAddedEvent
}
func NewOIDCIDPMigratedGoogleEvent(
base *eventstore.BaseEvent,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
options Options,
) *OIDCIDPMigratedGoogleEvent {
return &OIDCIDPMigratedGoogleEvent{
GoogleIDPAddedEvent: GoogleIDPAddedEvent{
BaseEvent: *base,
ID: id,
Name: name,
ClientID: clientID,
ClientSecret: clientSecret,
Scopes: scopes,
Options: options,
},
}
}
func (e *OIDCIDPMigratedGoogleEvent) Data() interface{} {
return e
}
func (e *OIDCIDPMigratedGoogleEvent) UniqueConstraints() []*eventstore.EventUniqueConstraint {
return nil
}
func OIDCIDPMigratedGoogleEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := GoogleIDPAddedEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedGoogleEvent{GoogleIDPAddedEvent: *e.(*GoogleIDPAddedEvent)}, nil
}

View File

@@ -74,6 +74,8 @@ func RegisterEventMappers(es *eventstore.Eventstore) {
RegisterFilterEventMapper(AggregateType, OAuthIDPChangedEventType, OAuthIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPAddedEventType, OIDCIDPAddedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPChangedEventType, OIDCIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPMigratedAzureADEventType, OIDCIDPMigratedAzureADEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPMigratedGoogleEventType, OIDCIDPMigratedGoogleEventMapper).
RegisterFilterEventMapper(AggregateType, JWTIDPAddedEventType, JWTIDPAddedEventMapper).
RegisterFilterEventMapper(AggregateType, JWTIDPChangedEventType, JWTIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, AzureADIDPAddedEventType, AzureADIDPAddedEventMapper).

View File

@@ -15,6 +15,8 @@ const (
OAuthIDPChangedEventType eventstore.EventType = "instance.idp.oauth.changed"
OIDCIDPAddedEventType eventstore.EventType = "instance.idp.oidc.added"
OIDCIDPChangedEventType eventstore.EventType = "instance.idp.oidc.changed"
OIDCIDPMigratedAzureADEventType eventstore.EventType = "instance.idp.oidc.migrated.azure"
OIDCIDPMigratedGoogleEventType eventstore.EventType = "instance.idp.oidc.migrated.google"
JWTIDPAddedEventType eventstore.EventType = "instance.idp.jwt.added"
JWTIDPChangedEventType eventstore.EventType = "instance.idp.jwt.changed"
AzureADIDPAddedEventType eventstore.EventType = "instance.idp.azure.added"
@@ -198,6 +200,90 @@ func OIDCIDPChangedEventMapper(event *repository.Event) (eventstore.Event, error
return &OIDCIDPChangedEvent{OIDCIDPChangedEvent: *e.(*idp.OIDCIDPChangedEvent)}, nil
}
type OIDCIDPMigratedAzureADEvent struct {
idp.OIDCIDPMigratedAzureADEvent
}
func NewOIDCIDPMigratedAzureADEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
tenant string,
isEmailVerified bool,
options idp.Options,
) *OIDCIDPMigratedAzureADEvent {
return &OIDCIDPMigratedAzureADEvent{
OIDCIDPMigratedAzureADEvent: *idp.NewOIDCIDPMigratedAzureADEvent(
eventstore.NewBaseEventForPush(
ctx,
aggregate,
OIDCIDPMigratedAzureADEventType,
),
id,
name,
clientID,
clientSecret,
scopes,
tenant,
isEmailVerified,
options,
),
}
}
func OIDCIDPMigratedAzureADEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := idp.OIDCIDPMigratedAzureADEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedAzureADEvent{OIDCIDPMigratedAzureADEvent: *e.(*idp.OIDCIDPMigratedAzureADEvent)}, nil
}
type OIDCIDPMigratedGoogleEvent struct {
idp.OIDCIDPMigratedGoogleEvent
}
func NewOIDCIDPMigratedGoogleEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
options idp.Options,
) *OIDCIDPMigratedGoogleEvent {
return &OIDCIDPMigratedGoogleEvent{
OIDCIDPMigratedGoogleEvent: *idp.NewOIDCIDPMigratedGoogleEvent(
eventstore.NewBaseEventForPush(
ctx,
aggregate,
OIDCIDPMigratedGoogleEventType,
),
id,
name,
clientID,
clientSecret,
scopes,
options,
),
}
}
func OIDCIDPMigratedGoogleEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := idp.OIDCIDPMigratedGoogleEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedGoogleEvent{OIDCIDPMigratedGoogleEvent: *e.(*idp.OIDCIDPMigratedGoogleEvent)}, nil
}
type JWTIDPAddedEvent struct { type JWTIDPAddedEvent struct {
idp.JWTIDPAddedEvent idp.JWTIDPAddedEvent
} }

View File

@ -83,6 +83,8 @@ func RegisterEventMappers(es *eventstore.Eventstore) {
RegisterFilterEventMapper(AggregateType, OAuthIDPChangedEventType, OAuthIDPChangedEventMapper). RegisterFilterEventMapper(AggregateType, OAuthIDPChangedEventType, OAuthIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPAddedEventType, OIDCIDPAddedEventMapper). RegisterFilterEventMapper(AggregateType, OIDCIDPAddedEventType, OIDCIDPAddedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPChangedEventType, OIDCIDPChangedEventMapper). RegisterFilterEventMapper(AggregateType, OIDCIDPChangedEventType, OIDCIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPMigratedAzureADEventType, OIDCIDPMigratedAzureADEventMapper).
RegisterFilterEventMapper(AggregateType, OIDCIDPMigratedGoogleEventType, OIDCIDPMigratedGoogleEventMapper).
RegisterFilterEventMapper(AggregateType, JWTIDPAddedEventType, JWTIDPAddedEventMapper). RegisterFilterEventMapper(AggregateType, JWTIDPAddedEventType, JWTIDPAddedEventMapper).
RegisterFilterEventMapper(AggregateType, JWTIDPChangedEventType, JWTIDPChangedEventMapper). RegisterFilterEventMapper(AggregateType, JWTIDPChangedEventType, JWTIDPChangedEventMapper).
RegisterFilterEventMapper(AggregateType, AzureADIDPAddedEventType, AzureADIDPAddedEventMapper). RegisterFilterEventMapper(AggregateType, AzureADIDPAddedEventType, AzureADIDPAddedEventMapper).

View File

@ -15,6 +15,8 @@ const (
OAuthIDPChangedEventType eventstore.EventType = "org.idp.oauth.changed" OAuthIDPChangedEventType eventstore.EventType = "org.idp.oauth.changed"
OIDCIDPAddedEventType eventstore.EventType = "org.idp.oidc.added" OIDCIDPAddedEventType eventstore.EventType = "org.idp.oidc.added"
OIDCIDPChangedEventType eventstore.EventType = "org.idp.oidc.changed" OIDCIDPChangedEventType eventstore.EventType = "org.idp.oidc.changed"
OIDCIDPMigratedAzureADEventType eventstore.EventType = "org.idp.oidc.migrated.azure"
OIDCIDPMigratedGoogleEventType eventstore.EventType = "org.idp.oidc.migrated.google"
JWTIDPAddedEventType eventstore.EventType = "org.idp.jwt.added" JWTIDPAddedEventType eventstore.EventType = "org.idp.jwt.added"
JWTIDPChangedEventType eventstore.EventType = "org.idp.jwt.changed" JWTIDPChangedEventType eventstore.EventType = "org.idp.jwt.changed"
AzureADIDPAddedEventType eventstore.EventType = "org.idp.azure.added" AzureADIDPAddedEventType eventstore.EventType = "org.idp.azure.added"
@ -198,6 +200,90 @@ func OIDCIDPChangedEventMapper(event *repository.Event) (eventstore.Event, error
return &OIDCIDPChangedEvent{OIDCIDPChangedEvent: *e.(*idp.OIDCIDPChangedEvent)}, nil return &OIDCIDPChangedEvent{OIDCIDPChangedEvent: *e.(*idp.OIDCIDPChangedEvent)}, nil
} }
type OIDCIDPMigratedAzureADEvent struct {
idp.OIDCIDPMigratedAzureADEvent
}
func NewOIDCIDPMigratedAzureADEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
tenant string,
isEmailVerified bool,
options idp.Options,
) *OIDCIDPMigratedAzureADEvent {
return &OIDCIDPMigratedAzureADEvent{
OIDCIDPMigratedAzureADEvent: *idp.NewOIDCIDPMigratedAzureADEvent(
eventstore.NewBaseEventForPush(
ctx,
aggregate,
OIDCIDPMigratedAzureADEventType,
),
id,
name,
clientID,
clientSecret,
scopes,
tenant,
isEmailVerified,
options,
),
}
}
func OIDCIDPMigratedAzureADEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := idp.OIDCIDPMigratedAzureADEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedAzureADEvent{OIDCIDPMigratedAzureADEvent: *e.(*idp.OIDCIDPMigratedAzureADEvent)}, nil
}
type OIDCIDPMigratedGoogleEvent struct {
idp.OIDCIDPMigratedGoogleEvent
}
func NewOIDCIDPMigratedGoogleEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
id,
name,
clientID string,
clientSecret *crypto.CryptoValue,
scopes []string,
options idp.Options,
) *OIDCIDPMigratedGoogleEvent {
return &OIDCIDPMigratedGoogleEvent{
OIDCIDPMigratedGoogleEvent: *idp.NewOIDCIDPMigratedGoogleEvent(
eventstore.NewBaseEventForPush(
ctx,
aggregate,
OIDCIDPMigratedGoogleEventType,
),
id,
name,
clientID,
clientSecret,
scopes,
options,
),
}
}
func OIDCIDPMigratedGoogleEventMapper(event *repository.Event) (eventstore.Event, error) {
e, err := idp.OIDCIDPMigratedGoogleEventMapper(event)
if err != nil {
return nil, err
}
return &OIDCIDPMigratedGoogleEvent{OIDCIDPMigratedGoogleEvent: *e.(*idp.OIDCIDPMigratedGoogleEvent)}, nil
}
type JWTIDPAddedEvent struct { type JWTIDPAddedEvent struct {
idp.JWTIDPAddedEvent idp.JWTIDPAddedEvent
} }

View File

@ -6,6 +6,8 @@ func RegisterEventMappers(es *eventstore.Eventstore) {
es.RegisterFilterEventMapper(AggregateType, AddedType, AddedEventMapper). es.RegisterFilterEventMapper(AggregateType, AddedType, AddedEventMapper).
RegisterFilterEventMapper(AggregateType, UserCheckedType, UserCheckedEventMapper). RegisterFilterEventMapper(AggregateType, UserCheckedType, UserCheckedEventMapper).
RegisterFilterEventMapper(AggregateType, PasswordCheckedType, PasswordCheckedEventMapper). RegisterFilterEventMapper(AggregateType, PasswordCheckedType, PasswordCheckedEventMapper).
RegisterFilterEventMapper(AggregateType, PasskeyChallengedType, eventstore.GenericEventMapper[PasskeyChallengedEvent]).
RegisterFilterEventMapper(AggregateType, PasskeyCheckedType, eventstore.GenericEventMapper[PasskeyCheckedEvent]).
RegisterFilterEventMapper(AggregateType, TokenSetType, TokenSetEventMapper). RegisterFilterEventMapper(AggregateType, TokenSetType, TokenSetEventMapper).
RegisterFilterEventMapper(AggregateType, MetadataSetType, MetadataSetEventMapper). RegisterFilterEventMapper(AggregateType, MetadataSetType, MetadataSetEventMapper).
RegisterFilterEventMapper(AggregateType, TerminateType, TerminateEventMapper) RegisterFilterEventMapper(AggregateType, TerminateType, TerminateEventMapper)
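
Note that the two passkey events are registered with eventstore.GenericEventMapper instead of hand-written mapper functions. As a rough, self-contained illustration of the idea (a simplified stand-in under assumed types, not the actual implementation of eventstore.GenericEventMapper), such a mapper allocates a fresh typed event, unmarshals the stored payload into it and attaches the base event, so a new event type only needs a SetBaseEvent method:

package sketch // illustrative only

import "encoding/json"

// Simplified stand-in types; the real eventstore types are richer.
type baseEvent struct{ typ string }

type settableEvent interface{ SetBaseEvent(*baseEvent) }

// genericMapper decodes the payload into a new T and sets its base event.
func genericMapper[T any, PT interface {
	*T
	settableEvent
}](base *baseEvent, payload []byte) (*T, error) {
	e := new(T)
	if len(payload) > 0 {
		if err := json.Unmarshal(payload, e); err != nil {
			return nil, err
		}
	}
	PT(e).SetBaseEvent(base)
	return e, nil
}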

View File

@ -5,19 +5,22 @@ import (
"encoding/json" "encoding/json"
"time" "time"
"github.com/zitadel/zitadel/internal/domain"
"github.com/zitadel/zitadel/internal/errors" "github.com/zitadel/zitadel/internal/errors"
"github.com/zitadel/zitadel/internal/eventstore" "github.com/zitadel/zitadel/internal/eventstore"
"github.com/zitadel/zitadel/internal/eventstore/repository" "github.com/zitadel/zitadel/internal/eventstore/repository"
) )
const ( const (
sessionEventPrefix = "session." sessionEventPrefix = "session."
AddedType = sessionEventPrefix + "added" AddedType = sessionEventPrefix + "added"
UserCheckedType = sessionEventPrefix + "user.checked" UserCheckedType = sessionEventPrefix + "user.checked"
PasswordCheckedType = sessionEventPrefix + "password.checked" PasswordCheckedType = sessionEventPrefix + "password.checked"
TokenSetType = sessionEventPrefix + "token.set" PasskeyChallengedType = sessionEventPrefix + "passkey.challenged"
MetadataSetType = sessionEventPrefix + "metadata.set" PasskeyCheckedType = sessionEventPrefix + "passkey.checked"
TerminateType = sessionEventPrefix + "terminated" TokenSetType = sessionEventPrefix + "token.set"
MetadataSetType = sessionEventPrefix + "metadata.set"
TerminateType = sessionEventPrefix + "terminated"
) )
type AddedEvent struct { type AddedEvent struct {
@ -141,6 +144,78 @@ func PasswordCheckedEventMapper(event *repository.Event) (eventstore.Event, erro
return added, nil return added, nil
} }
type PasskeyChallengedEvent struct {
eventstore.BaseEvent `json:"-"`
Challenge string `json:"challenge,omitempty"`
AllowedCredentialIDs [][]byte `json:"allowedCredentialIDs,omitempty"`
UserVerification domain.UserVerificationRequirement `json:"userVerification,omitempty"`
}
func (e *PasskeyChallengedEvent) Data() interface{} {
return e
}
func (e *PasskeyChallengedEvent) UniqueConstraints() []*eventstore.EventUniqueConstraint {
return nil
}
func (e *PasskeyChallengedEvent) SetBaseEvent(base *eventstore.BaseEvent) {
e.BaseEvent = *base
}
func NewPasskeyChallengedEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
challenge string,
allowedCredentialIDs [][]byte,
userVerification domain.UserVerificationRequirement,
) *PasskeyChallengedEvent {
return &PasskeyChallengedEvent{
BaseEvent: *eventstore.NewBaseEventForPush(
ctx,
aggregate,
PasskeyChallengedType,
),
Challenge: challenge,
AllowedCredentialIDs: allowedCredentialIDs,
UserVerification: userVerification,
}
}
type PasskeyCheckedEvent struct {
eventstore.BaseEvent `json:"-"`
CheckedAt time.Time `json:"checkedAt"`
}
func (e *PasskeyCheckedEvent) Data() interface{} {
return e
}
func (e *PasskeyCheckedEvent) UniqueConstraints() []*eventstore.EventUniqueConstraint {
return nil
}
func (e *PasskeyCheckedEvent) SetBaseEvent(base *eventstore.BaseEvent) {
e.BaseEvent = *base
}
func NewPasskeyCheckedEvent(
ctx context.Context,
aggregate *eventstore.Aggregate,
checkedAt time.Time,
) *PasskeyCheckedEvent {
return &PasskeyCheckedEvent{
BaseEvent: *eventstore.NewBaseEventForPush(
ctx,
aggregate,
PasskeyCheckedType,
),
CheckedAt: checkedAt,
}
}
type TokenSetEvent struct { type TokenSetEvent struct {
eventstore.BaseEvent `json:"-"` eventstore.BaseEvent `json:"-"`
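
A hedged sketch of how the two new session events might be used together in a passkey flow, written as if it sat next to the event definitions above. The function name, the surrounding command logic and the way the challenge is produced are assumptions; only the constructors and their parameters come from this diff.

// Sketch only (not part of this diff): build the challenge event when a
// WebAuthn assertion is requested, and the checked event once the assertion
// has been verified. Generating the challenge (crypto/rand, base64url) and
// pushing the events via the command side are out of scope here.
func passkeyEventsSketch(
	ctx context.Context,
	sessionAgg *eventstore.Aggregate,
	challenge string,
	allowedCredentialIDs [][]byte,
	userVerification domain.UserVerificationRequirement,
) (*PasskeyChallengedEvent, *PasskeyCheckedEvent) {
	challenged := NewPasskeyChallengedEvent(ctx, sessionAgg, challenge, allowedCredentialIDs, userVerification)
	checked := NewPasskeyCheckedEvent(ctx, sessionAgg, time.Now())
	return challenged, checked
}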

View File

@ -476,6 +476,8 @@ Errors:
Terminated: Session bereits beendet Terminated: Session bereits beendet
Token: Token:
Invalid: Session Token ist ungültig Invalid: Session Token ist ungültig
Passkey:
NoChallenge: Sitzung ohne Passkey-Herausforderung
Intent: Intent:
IDPMissing: IDP ID fehlt im Request IDPMissing: IDP ID fehlt im Request
SuccessURLMissing: Success URL fehlt im Request SuccessURLMissing: Success URL fehlt im Request

View File

@ -476,6 +476,8 @@ Errors:
Terminated: Session already terminated Terminated: Session already terminated
Token: Token:
Invalid: Session Token is invalid Invalid: Session Token is invalid
Passkey:
NoChallenge: Session without passkey challenge
Intent: Intent:
IDPMissing: IDP ID is missing in the request IDPMissing: IDP ID is missing in the request
SuccessURLMissing: Success URL is missing in the request SuccessURLMissing: Success URL is missing in the request

View File

@ -476,6 +476,8 @@ Errors:
Terminated: Sesión ya terminada Terminated: Sesión ya terminada
Token: Token:
Invalid: El identificador de sesión no es válido Invalid: El identificador de sesión no es válido
Passkey:
NoChallenge: Sesión sin desafío de llave de acceso
Intent: Intent:
IDPMissing: Falta IDP en la solicitud IDPMissing: Falta IDP en la solicitud
SuccessURLMissing: Falta la URL de éxito en la solicitud SuccessURLMissing: Falta la URL de éxito en la solicitud

View File

@ -476,6 +476,8 @@ Errors:
Terminated: La session est déjà terminée Terminated: La session est déjà terminée
Token: Token:
Invalid: Le jeton de session n'est pas valide Invalid: Le jeton de session n'est pas valide
Passkey:
NoChallenge: Session sans défi de clé d'accès
Intent: Intent:
IDPMissing: IDP manquant dans la requête IDPMissing: IDP manquant dans la requête
SuccessURLMissing: Success URL absent de la requête SuccessURLMissing: Success URL absent de la requête

View File

@ -476,6 +476,8 @@ Errors:
Terminated: Sessione già terminata Terminated: Sessione già terminata
Token: Token:
Invalid: Il token della sessione non è valido Invalid: Il token della sessione non è valido
Passkey:
NoChallenge: Sessione senza sfida passkey
Intent: Intent:
IDPMissing: IDP mancante nella richiesta IDPMissing: IDP mancante nella richiesta
SuccessURLMissing: URL di successo mancante nella richiesta SuccessURLMissing: URL di successo mancante nella richiesta

Some files were not shown because too many files have changed in this diff.