This post is the next installment in my series titled Connect a WinForms Data Grid to an Arbitrary ASP.NET Core WebAPI Service Powered by EF Core. The content relates to my previously published post Modern Desktop Apps And Their Complex Architectures and aims to illustrate an application system architecture that includes a data access service in addition to a WinForms application with a DevExpress Data Grid.
Table of Contents
- Intro — Modern Desktop Apps And Their Complex Architectures | Choosing a Framework/App Architecture for Desktop & Mobile Cross-Platform Apps / GitHub sample
- Part 1 — Connect a WinForms Data Grid to an Arbitrary ASP.NET Core WebAPI Service Powered by EF Core — Architecture and Data Binding / GitHub sample
- Part 2 — Connect a WinForms Data Grid to an Arbitrary ASP.NET Core WebAPI Service Powered by EF Core — Add Editing Features / GitHub sample
- Part 3 — Connect a WinForms Data Grid to an Arbitrary ASP.NET Core WebAPI Service Powered by EF Core — Authenticate users and protect data / GitHub sample (this post)
- Part 4 — Connect a .NET Desktop Client to a Secure Backend Web API Service (EF Core with OData)
- Part 5 — Connect a .NET Desktop Client to a Backend Using a Middle Tier Server (EF Core without OData)
- Part 6 (TBD) — Connect a .NET Desktop Client to Azure Databases with Data API Builder
- Part 7 (TBD) — Connect a .NET Desktop Client to GraphQL APIs
We also have a related blog series that may be of interest to you: JavaScript — Consume the DevExpress Backend Web API with Svelte (7 parts from data editing to validation, localization, reporting).
The demo repository
You can find the sample code for this demo in the GitHub repository. The Readme file describes how to run the sample. Please contact us if you have any questions or comments!
If you are interested in a few technical points about the sample code, please read on for a description of the sample structure and some of the relevant code files.
Overview of security-related use-case scenarios
With runtime-configurable data access options enabled by the GridControl, and editing features added in the previous post, it is time to add a very important feature, one that is a cross-cutting concern in all software: security. To start, we will allow the user to log in to the application and limit their access based on permissions derived from a simple role assignment.
We will employ claims-based authentication, but in step 1 we will stick to a simple authentication approach which assumes that the client app can “see” the username and password when they are passed through to the authentication service. This approach is based on a flow called Resource Owner Password Credentials (ROPC). Today, this is generally not the preferred approach, since alternative flows exist where authentication can be handled without allowing the client app access to user credentials. But typical client/server applications of the past usually had access to this information, and the shift to claims-based authentication with a separate token service is already an important and somewhat complex step — so for this post it will be enough, and we’ll cover other options in later posts.
If you’re interested, Microsoft has documentation for the ROPC Flow — somewhat specific to Entra ID, but the diagram and details apply to other implementations, too.
The following text is quite long and detailed; hopefully you will find all the information you need for your own implementation. Please read on for the following four blocks:
- Run and configure Keycloak describes the setup of a Docker-based instance of the Keycloak service, which handles user management and token services
- Activate authentication and authorization on the server is about the process of activating JWT authentication and authorization for the Data Service project in the sample solution.
- Enable user logins in the WinForms app contains a description of the basic steps required to log in with Keycloak, retrieve an access token and pass it to the Data Service.
- Finally, Evaluate token claims on the client describes some steps that are important but optional, to allow the client app to adapt depending on permissions granted to a user.
Choosing the right auth provider
There are many services, cloud-based and stand-alone, which can handle users and groups and their credentials, and issue tokens for use in application systems. Microsoft Entra ID is one such option, a rather powerful and therefore complex one; others include Auth0 and Amazon Cognito. There are also many OAuth 2.0 compatible identity providers which don’t include the functionality to manage your own user accounts — you have probably used your GitHub or Google account to sign in to a third-party service before!
For this demo we chose to use an Open Source solution called Keycloak, which can be run on any machine by using a container. Keycloak is a powerful and widely used solution as well, named an “Incubation Project” by the Cloud Native Computing Foundation, so you should certainly consider it if you’re looking for a solution for your own projects.
Run and configure Keycloak
The following description assumes the use of Docker, but of course Podman or similar alternatives will also work. To run Keycloak in a container, with Docker Desktop installed on your machine, you just need a single command:
> docker run -p 8080:8080 -e KEYCLOAK_ADMIN=admin -e KEYCLOAK_ADMIN_PASSWORD=admin -v ./data:/opt/keycloak/data quay.io/keycloak/keycloak:latest start-dev
The two -e options provide environment variables to the container, which configure the initial admin access. For dev and test purposes, a simple password is good enough, but be careful in deployment!
The -v option with its parameter is optional. It mounts the data directory of the Keycloak process in the container from the host machine. The host directory data (in the current path when you run the command) will receive the data from inside the container, and you won’t lose your setup if you stop and restart the container. Without this, any data would be stored only in the container itself and deleted when the container is removed.
The -p option makes port 8080 from inside the container available on the host machine (using the same port). This means that you can now connect to http://localhost:8080/admin and communicate with the process in the running container. If you’re following along, bring up this page in a browser now to configure Keycloak for the demo.
Once you have signed in with the credentials you set in the environment variables in the previous step, you will find yourself in the master realm. Open the combo box in the top left corner and click the button “Create Realm”. Enter a realm name of your own choosing and click Create. For the demo, I will call the realm winappdemo.
Now select Clients from the navigation bar on the left. Click Create Client to create a new application-specific client registration. Enter a “Client ID”. For the demo, I’m going to use the simple name app1.
Click Next and make sure that Direct access grants is active. This is required for Keycloak to support the Resource Owner Password Credentials Flow.
No further changes are required, and you can click Save to create the client. Now select Realm Roles from the navigation bar and click Create role.
Note: it is also possible to create per-client roles, but for this demo I chose to use realm roles which could apply to multiple clients within the same realm.
Give the role the name writers and save it.
The final piece of the user management puzzle for this demo is the creation of user accounts. Before we get there though, select Realm Settings from the navigation bar and go to the Login tab. Switch off the setting Login with email. Now activate the tab User Profile. Click the email entry and switch the Required field flag off. These setting changes make it easier to create test accounts as needed, and email addresses are not required for the purpose. Note that if you wish, you can also configure the First Name and Last Name fields to be optional — this makes the quick generation of test accounts even more straightforward. However, the frontend app will include a part where this information is read from the token, so it makes sense to leave it in place for now.
Now select Users from the navigation bar and click Create a new user. Assign the Username writer, enter random values for the first and last name, and save the account.
Note that in addition to the username, the first and last name fields are required. The Keycloak UI does not reflect this correctly in current versions.
On the Credentials tab, click Set Password and enter a password. Make sure to deselect Temporary so the user can log in with the password right away.
Go to the tab Role mapping now and click Assign role. Use the filter combo box to filter by realm roles and assign the role writers.
Finally, create a second user called reader. Assign first name, last name, and a password in the same way as before, but do not assign a role. This gives you two user accounts to test with, one that has write permissions and one without.
Now that Keycloak is configured, you can use a simple command line tool like curl to simulate the process of retrieving an access token based on the user credentials. Just post the user credentials to the right endpoint and receive back the token in JSON format!
> curl -X POST http://localhost:8080/realms/winappdemo/protocol/openid-connect/token -d "grant_type=password" -d "client_id=app1" -d "username=reader" -d "password=reader"
{"access_token":"eyJhbGciOiJSUzI1Ni ....
Note: if you decide to use this setup for a real solution, make sure to run Keycloak behind an HTTPS proxy!
Activate authentication and authorization on the server
In the DataService application of the demo solution, rules are now in place to secure the service against access by unauthorized users. This setup required two sets of changes.
First, the initialization of the service has been extended to configure the JwtBearer authentication scheme using standard ASP.NET Core functionality. The central part of this is the block where the TokenValidationParameters are set up:
options.TokenValidationParameters = new TokenValidationParameters
{
ValidateIssuer = true,
ValidIssuer = $"{builder.Configuration["Jwt:KeycloakUrl"]}/realms/{builder.Configuration["Jwt:Realm"]}",
ValidateAudience = true,
ValidAudience = builder.Configuration["Jwt:Audience"],
ValidateLifetime = true,
ValidateIssuerSigningKey = true,
IssuerSigningKey = publicKey
};
When the setup is complete and the service is running, each incoming request will be checked and a contained JWT (JSON Web Token) validated according to these rules. The public key assigned to the IssuerSigningKey field is fetched from the Keycloak server — if the signature is found to be valid, we know that the token has been issued by this server. All the details are loaded from appsettings.json, which includes this block:
"Jwt": {
"Issuer": "http://localhost:8080/realms/winappdemo",
"Audience": "account",
"KeycloakUrl": "http://localhost:8080",
"Realm": "winappdemo"
}
If you adjust any of these values in your own tests with the demo, you may need to make corresponding changes. In a real project you can set up the values as you need, and you can extend or change the validation settings to match your requirements.
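For context, here is a compact sketch of what the surrounding AddJwtBearer registration can look like, including one possible way to fetch the realm’s public signing key: Keycloak’s realm endpoint (http://localhost:8080/realms/winappdemo in this setup) returns JSON that contains a base64-encoded public_key field. This is only an illustration based on the configuration above; the demo’s Program.cs may be structured differently.
// Sketch only (not the demo's verbatim code). Requires the namespaces
// System.Security.Cryptography, System.Text.Json.Nodes,
// Microsoft.AspNetCore.Authentication.JwtBearer and Microsoft.IdentityModel.Tokens.
var keycloakUrl = builder.Configuration["Jwt:KeycloakUrl"];
var realm = builder.Configuration["Jwt:Realm"];

// Fetch the realm metadata and import the base64-encoded RSA public key.
var http = new HttpClient();
var realmInfo = JsonNode.Parse(await http.GetStringAsync($"{keycloakUrl}/realms/{realm}"));
var rsa = RSA.Create();
rsa.ImportSubjectPublicKeyInfo(
    Convert.FromBase64String(realmInfo!["public_key"]!.GetValue<string>()), out _);
var publicKey = new RsaSecurityKey(rsa);

// Register JWT bearer authentication with the validation parameters shown above.
builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.TokenValidationParameters = new TokenValidationParameters
        {
            ValidateIssuer = true,
            ValidIssuer = $"{keycloakUrl}/realms/{realm}",
            ValidateAudience = true,
            ValidAudience = builder.Configuration["Jwt:Audience"],
            ValidateLifetime = true,
            ValidateIssuerSigningKey = true,
            IssuerSigningKey = publicKey
        };
    });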
To reflect the structure of the access token with the specific configuration of the Keycloak-based realm roles, the helper method RequireRealmRole has been added to the project. It is used to set up a policy which reflects the writers role you assigned to one of the demo user accounts.
builder.Services.AddAuthorization(o =>
{
o.AddPolicy("writers", p => p.RequireRealmRole("writers"));
});
...
public static class PolicyHelpers
{
public static void RequireRealmRole(this AuthorizationPolicyBuilder policy, string roleName)
{
policy.RequireAssertion(context =>
{
var realmAccess = context.User.FindFirst("realm_access")?.Value;
if (realmAccess == null) return false;
var node = JsonNode.Parse(realmAccess);
if (node == null || node["roles"] == null) return false;
var array = node["roles"]!.AsArray();
return array.Select(r => r?.GetValue<string>()).Contains(roleName);
});
}
}
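For reference, the realm_access claim that this helper parses has roughly the following shape inside a decoded Keycloak access token. The exact role list depends on your realm configuration; the additional built-in roles shown here are typical defaults and purely illustrative.
"realm_access": {
  "roles": [ "writers", "default-roles-winappdemo", "offline_access", "uma_authorization" ]
}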
With the initialization process adjusted, the individual service endpoints can now be protected as needed. In a controller-based structure, you would use the AuthorizeAttribute for this, but since the demo project uses Minimal APIs, we added RequireAuthorization calls as needed:
- The endpoint /api/populateTestData does not require authorization. This is a test- and demo-only setup, allowing you to simply activate this endpoint by entering the URL in a browser.
- The GET endpoints /data/OrderItems and /data/OrderItem/{id} call RequireAuthorization(), so that an authenticated user is required to successfully execute them, but no specific roles are needed.
- The remaining endpoints, POST to /data/OrderItem and PUT and DELETE to /data/OrderItem/{id}, call RequireAuthorization("writers"), so that the policy writers is applied and the realm role writers is required.
To illustrate this, here’s how the call is chained with the handler declaration, using the example of the POST endpoint:
app.MapPost("/data/OrderItem", async (DataServiceDbContext dbContext, OrderItem orderItem) =>
{
dbContext.OrderItems.Add(orderItem);
await dbContext.SaveChangesAsync();
return Results.Created($"/data/OrderItem/{orderItem.Id}", orderItem);
}).RequireAuthorization("writers");
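One related detail: the authentication and authorization middleware must be active in the request pipeline for these RequireAuthorization calls to take effect. Recent ASP.NET Core versions add the middleware automatically once the corresponding services are registered, but it can also be stated explicitly:
// Explicit middleware registration; current ASP.NET Core versions insert these
// automatically when AddAuthentication/AddAuthorization have been called.
app.UseAuthentication();
app.UseAuthorization();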
At this point, the old unmodified WinForms app in this solution, which executes service requests without any attempt at authorization, receives the error 401 (Unauthorized) — this serves to demonstrate, for the time being, that the server is now protected.
Enable and test user logins in the WinForms app
Logging a user in means retrieving an access token from Keycloak. The application now includes a new LoginForm, which shows username and password entry fields to the user, collects the data, and triggers the log-in process. The following code runs to contact the Keycloak service with the credentials, much like you saw before in the curl example, and to retrieve the access token.
var content = new FormUrlEncodedContent(new Dictionary<string, string>
{
{"client_id", clientId},
{"username", username},
{"password", password},
{"grant_type", "password"}
});
var url = $"{authUrl}/realms/{realm}/protocol/openid-connect/token";
var response = await bareHttpClient.PostAsync(url, content);
try
{
response.EnsureSuccessStatusCode();
var responseString = await response.Content.ReadAsStringAsync();
(accessToken, refreshToken, expiresIn) = GetTokens(responseString);
...
}
catch (Exception ex) { ... }
...
static (string? access_token, string? refresh_token, int? expires_in) GetTokens(string jsonString)
{
var node = JsonNode.Parse(jsonString);
if (node == null)
return (null, null, null);
else
return (node["access_token"]?.GetValue<string>(),
node["refresh_token"]?.GetValue<string>(),
node["expires_in"]?.GetValue<int>());
}
The code also includes the helper function GetTokens, which extracts the required info from the JSON string returned by the server. In addition to the access token, a refresh token and an expiration time span are also retrieved. By default, access tokens always have a limited lifespan, which helps ensure that clients must get back to the server after a maximum number of seconds given in the expires_in value.
It is one of the responsibilities of a client to check the validity and expiration of an access token before using it, and to use the refresh token to retrieve a new access token if the old one has expired. In the sample, you can find the relevant code in the class BearerTokenHandler. It is left out here for brevity, since it looks quite similar to the original retrieval code.
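To give you an idea, here is a hypothetical sketch of such a refresh. It reuses the field names from the login snippet above (clientId, authUrl, realm, bareHttpClient, accessToken, refreshToken) plus an assumed tokenExpiresAt timestamp recorded when the tokens were received; the sample’s BearerTokenHandler differs in the details.
// Hypothetical sketch, not the sample's exact code: refresh the access token if it
// has expired, using the same Keycloak token endpoint with grant_type=refresh_token.
async Task<bool> EnsureValidTokenAsync()
{
    // Still valid (with a small safety margin)? Nothing to do.
    if (DateTime.UtcNow < tokenExpiresAt - TimeSpan.FromSeconds(30))
        return true;

    var content = new FormUrlEncodedContent(new Dictionary<string, string>
    {
        {"client_id", clientId},
        {"grant_type", "refresh_token"},
        {"refresh_token", refreshToken!}
    });
    var response = await bareHttpClient.PostAsync(
        $"{authUrl}/realms/{realm}/protocol/openid-connect/token", content);
    if (!response.IsSuccessStatusCode)
        return false; // refresh token no longer valid, the user has to log in again

    int? expiresIn;
    (accessToken, refreshToken, expiresIn) =
        GetTokens(await response.Content.ReadAsStringAsync());
    tokenExpiresAt = DateTime.UtcNow + TimeSpan.FromSeconds(expiresIn ?? 0);
    return true;
}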
The most important line of code from this handler class, however, is the following one, where the Authorization request header is set to carry the value of the current access token to the server in a standard format.
request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
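To show where this line typically lives, here is a rough outline of a DelegatingHandler following this pattern. It is a sketch rather than the sample’s exact BearerTokenHandler, and the EnsureValidTokenAsync placeholder stands for the hypothetical refresh logic sketched above.
// Outline of a DelegatingHandler that attaches the bearer token to every outgoing
// request; the sample's BearerTokenHandler follows this general pattern.
public class BearerTokenHandlerSketch : DelegatingHandler
{
    // Hypothetical token state; in the sample this is managed by BearerTokenHandler.
    string? accessToken;

    // Placeholder for the expiry check and refresh logic sketched above.
    Task EnsureValidTokenAsync() => Task.CompletedTask;

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Make sure the access token is current before using it.
        await EnsureValidTokenAsync();

        // The line discussed above: pass the token in the standard Bearer format.
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        return await base.SendAsync(request, cancellationToken);
    }
}
Such a handler is then chained in front of the regular HttpClientHandler of the HttpClient that talks to the data service.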
With the additional functionality implemented up to this point, the client application is able to authenticate to the Keycloak server and send the access token to the data service, proving its permission to access the data endpoints. Using the code implemented previously, the server determines which roles a particular logged-in user account has and allows or denies access to the endpoints accordingly.
Evaluate token claims on the client
In addition to the server-side checking of token claims, we can (and should) also decode the token on the client side and use its content to make decisions. Of course, from a security standpoint, the server-side protection is the most important part! But a user would like the UI to reflect what they can actually do, for instance by activating and deactivating elements based on permissions — rather than seeing errors from the server after an operation fails.
Since you have already seen the way the Data Service analyzes the token and extracts pieces from it, including the roles, the code on the client side holds no surprises. The only extra bit is a short piece of code required to convert the string that is passed as the access token into a data structure that can be read for further analysis. The type JwtSecurityTokenHandler is part of the NuGet package System.IdentityModel.Tokens.Jwt, which has been added to the project.
static (string? name, string?[] realmRoles) GetUserDetails(string? accessToken)
{
if (String.IsNullOrEmpty(accessToken))
return (null, []);
var handler = new JwtSecurityTokenHandler();
var token = handler.ReadJwtToken(accessToken);
// ... extract user name and realm roles as before
return (name, realmRoles);
}
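The extraction itself is elided above. One way to implement it, mirroring the realm_access parsing used on the server side, might look like the following sketch; the claim names and the exact code in the sample may differ slightly.
// Hypothetical sketch of the elided part: read the display name and the realm roles
// from the parsed token. Keycloak typically exposes the display name in the "name"
// claim (given_name and family_name are available as well).
var name = token.Claims.FirstOrDefault(c => c.Type == "name")?.Value;

string?[] realmRoles = [];
var realmAccess = token.Claims.FirstOrDefault(c => c.Type == "realm_access")?.Value;
if (realmAccess != null)
{
    var rolesNode = JsonNode.Parse(realmAccess)?["roles"];
    if (rolesNode != null)
        realmRoles = rolesNode.AsArray()
            .Select(r => r?.GetValue<string>())
            .ToArray();
}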
On the basis of these details, the UI application can now make choices at runtime to enable or disable certain user interface elements. This happens in the method EvaluateRoles in the MainForm:
private void EvaluateRoles()
{
if (DataServiceClient.LoggedIn)
{
if (DataServiceClient.UserHasRole("writers"))
{
userIsWriter = true;
addItemButton.Enabled = true;
deleteItemButton.Enabled = true;
}
else
{
...
}
}
}
It also evaluates the userIsWriter flag in the event handlers, like here for the Add Order Item button:
private async void addItemButton_ItemClick(object sender, DevExpress.XtraBars.ItemClickEventArgs e)
{
if (!userIsWriter)
{
NotAWriterError();
return;
}
...
Of course these are just examples, but they illustrate how the UI application can adjust its feature set and the presentation of information based on information obtained from the access token.
Your Feedback Matters!
You can also download our GitHub example and play with different configurations on your own. Please send your feedback — thanks to all of you who have already done that! — along with any questions or ideas; we will do our best to consider everything!