Artificial Intelligence, the Internet-of-Things, and ...

David Kim

Full-Stack Developer & DevOps Architect

 
January 29, 2026 8 min read

TL;DR

This article covers the convergence of AI and IoT within the authentication landscape for modern enterprises. It explores how smart devices create new vulnerabilities in login forms and how machine learning can fix them. You'll learn about MFA integration for hardware, passwordless futures, and using login4website tools to secure your B2B infrastructure against automated bot attacks and credential stuffing.

Introduction to the JOSE ecosystem in B2C

Ever tried explaining to a product manager why we can't just base64 encode a user's subscription status and call it a day? It's usually followed by a blank stare until you mention that anyone with a browser console could then give themselves a free lifetime "Premium" pass.

In the world of B2C identity, we're moving away from those heavy, old-school XML standards. Honestly, nobody misses SOAP. The JSON Object Signing and Encryption (JOSE) framework is what makes modern, scalable login possible. It isn't just one thing; it's a bunch of parts working together:

  • JWS (Signature): This proves the token hasn't been messed with. If a retail app issues a discount token, JWS ensures the user didn't change "5%" to "50%".
  • JWE (Encryption): This hides the data. In healthcare, you might sign a token, but you also gotta encrypt it so prying eyes can't read a patient ID sitting in local storage.
  • JWK (Keys): These are just JSON objects representing your cryptographic keys.
  • JWA (Algorithms): The math behind the curtain, like RS256 or the newer Ed25519.

Figure 1: The relationship between JWS (signing), JWE (encryption), and JWK (key format) within the JOSE framework.

Consumer tokens are basically the passports of the internet. They travel across messy, public networks where anyone might try to grab them. These standards have been evolving for years to handle things like token theft and replay attacks.

Signing is great for integrity, but in finance or high-security apps, it isn't enough for privacy. If your token contains a user's home address, just signing it means it's still readable by any script on the page. That's where the "Encryption" part of JOSE becomes a lifesaver.

Next, we're gonna look at the mechanics of how JWS actually works to protect your data from being changed by some random user.

Fully-specified vs polymorphic algorithms

Ever had a bug report where a token worked fine on your staging server but blew up on a specific mobile client? It’s usually because of those "polymorphic" algorithm names that look simple but hide a ton of ambiguity under the hood.

In the early days of JOSE, we liked things flexible. We used identifiers like EdDSA or ECDH-ES and thought we were being clear. But as noted in the IETF draft on fully-specified algorithms, these are actually "polymorphic" because they don't tell you everything you need to know.

Take EdDSA for example. If your server says it supports it, does that mean it supports the Ed25519 curve? Or Ed448? You don't know until you actually look at the key. This makes negotiation a nightmare for things like OAuth metadata.

  • Cross-curve attacks: If you're using ECDH for encryption, an attacker might try to use a key from a different curve (like P-256) against a P-384 implementation to leak data.
  • Negotiation failure: If a WebAuthn authenticator only does Ed25519, but your API just says "I support EdDSA," the handshake might fail because the client doesn't know if you can actually handle its specific curve.

According to the IETF draft, polymorphic algorithms require info beyond the alg identifier to work. This is basically a "security smell" in modern IAM architectures.

The industry is moving toward "fully-specified" algorithms. These identifiers leave zero room for guessing. Instead of saying "I use ECDSA," you say "I use ECDSA with P-256 and SHA-256."
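To make that concrete, here is a minimal sketch (not tied to any particular library) of enforcing fully-specified names on an already-parsed JOSE header. The allow-list, the function name, and the header dicts are illustrative assumptions, not values from this article.

# Hypothetical policy check: reject polymorphic names, require fully-specified ones.
FULLY_SPECIFIED = {"Ed25519", "Ed448", "ES256"}   # curve and hash are pinned down
POLYMORPHIC = {"EdDSA", "ECDH-ES"}                # you'd need the key to know the curve

def validate_alg(header: dict) -> str:
    alg = header.get("alg")
    if alg in POLYMORPHIC:
        raise ValueError(f"{alg} is polymorphic; ask the issuer for a fully-specified name")
    if alg not in FULLY_SPECIFIED:
        raise ValueError(f"{alg} is not on the allow-list")
    return alg

validate_alg({"alg": "Ed25519", "typ": "JWT"})     # accepted
# validate_alg({"alg": "EdDSA", "typ": "JWT"})     # would raise: which curve is this?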

Figure 5: Comparison showing how polymorphic algorithms rely on key metadata versus fully-specified algorithms that define parameters in the header.

The IANA has been busy updating the JSON Object Signing and Encryption (JOSE) registry. We now have specific names like Ed25519 and Ed448 that replace the generic EdDSA tag.

Signing tokens with JWS for integrity

So, we've established that base64 encoding your user data is a recipe for disaster. Now let's talk about the actual "glue" that keeps your B2C tokens from being tampered with: the JWS (JSON Web Signature).

Think of a JWS like a wax seal on a letter. It doesn't hide the message (anyone can still read it), but if someone tries to steam it open and change the contents, the seal breaks. In the world of APIs, that "broken seal" means the request gets tossed immediately.

A JWS isn't some monolithic block of text; it's actually three distinct parts shoved together with dots. If you've ever looked at a JWT, you've seen this: header.payload.signature.
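If you want to see those three segments for yourself, here is a quick standard-library sketch. The token below is a made-up example (the third segment is just filler), not a real credential.

# Split a compact JWS and decode the two JSON parts; the third part is raw signature bytes.
import base64
import json

token = "eyJhbGciOiJFUzI1NiJ9.eyJzdWIiOiJ1c2VyMTIzIn0.fake-signature-for-illustration"
header_b64, payload_b64, signature_b64 = token.split(".")

def decode_segment(segment: str) -> dict:
    segment += "=" * (-len(segment) % 4)   # restore the stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(segment))

print(decode_segment(header_b64))    # {'alg': 'ES256'}
print(decode_segment(payload_b64))   # {'sub': 'user123'}
# signature_b64 only becomes meaningful once you verify it against the first two parts.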

Figure 2: The structure of a JWS showing the Header, Payload, and Signature components.

When you're building for millions of consumers, the choice of algorithm matters. Most teams default to RS256 because it's the "old reliable" of the industry. It uses an RSA key pair, which is widely supported.

But honestly? If you're starting fresh, look at ES256 (ECDSA using P-256). As noted in the IANA JOSE Registry, ES256 is "Recommended+" for a reason. The keys and signatures are much smaller than RSA's, which means the tokens are shorter and the math is faster on your CPU. For a mobile app where every byte and millisecond counts, that's a huge win.
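If you want to feel the difference, here is a rough sketch that signs the same claims both ways. It assumes the PyJWT and cryptography packages, which this article doesn't otherwise depend on; treat it as an illustration, not a recommendation of specific tooling.

# Compare ES256 and RS256 token sizes for identical claims.
import jwt  # PyJWT
from cryptography.hazmat.primitives.asymmetric import ec, rsa

ec_key = ec.generate_private_key(ec.SECP256R1())
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

claims = {"sub": "user123", "plan": "premium"}
es256_token = jwt.encode(claims, ec_key, algorithm="ES256")
rs256_token = jwt.encode(claims, rsa_key, algorithm="RS256")

# An ES256 signature is 64 bytes before encoding; a 2048-bit RS256 signature is 256.
print(len(es256_token), len(rs256_token))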

The biggest foot-gun in the history of JOSE is the alg: none vulnerability. Back in the day, some libraries would see a token with the algorithm set to "none" and just... skip the signature check.

According to the IANA JOSE Registry, the "none" algorithm is technically optional, but in a production b2c environment, it's basically a "hack me" sign.

You gotta make sure your API gateway or backend service explicitly rejects any token that doesn't use your approved algorithms. Don't let the token tell you how to verify it; you tell the token what's allowed.
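Here is what that looks like with PyJWT, assumed here purely for this sketch; the same idea applies to whatever your gateway uses. The verifier hardcodes the algorithm list, so a token claiming "none" is dead on arrival.

# The verifier decides the algorithms, never the token.
import jwt  # PyJWT

def verify_access_token(token: str, public_key) -> dict:
    # An explicit allow-list rejects "alg": "none" and downgrade attempts
    # before the claims are ever trusted.
    return jwt.decode(token, public_key, algorithms=["ES256"])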

Signing is great for keeping things honest, but it doesn't hide a thing. If you're handling sensitive consumer data, you need to wrap that payload in a JWE (JSON Web Encryption) layer.

Encryption with JWE for data privacy

Confusing integrity with privacy is a classic mistake. If your token contains PII (personally identifiable information), you need JWE. Compliance is the big driver here. Under GDPR and CCPA, letting PII sit unencrypted in a browser's local storage is basically asking for a massive fine.

If JWS is a three-part string, JWE is its more complicated cousin with five parts. It looks like header.encrypted_key.iv.ciphertext.tag.

Figure 3: The five-part structure of a JWE, including the encrypted key and authentication tag.

Usually, you don't just send a JWE. You want to sign the data first so you know who sent it, and then encrypt it so nobody else can read it. This is called a nested JWT.

# Sign-then-Encrypt (sender side), using a generic JOSE library as a placeholder.
import jose_library

# private_key (ours) and app_public_key (the recipient's) come from your key store.
claims = {"sub": "user123", "diagnosis": "A1-B2", "iss": "clinic-api"}

# 1. Sign, so the recipient can verify who issued the claims.
signed_token = jose_library.jws.sign(claims, private_key, alg='ES256')

# 2. Encrypt the signed token, so only the recipient can read it.
encrypted_token = jose_library.jwe.encrypt(signed_token, app_public_key, alg='RSA-OAEP-256', enc='A128GCM')

To get the data back, the recipient has to do the decrypt-then-verify flow. First, you use your private key to decrypt the JWE. This gives you back the original JWS string. Then, you take that JWS and verify the signature using the sender's public key. If you try to verify before decrypting, it won't work because the signature is hidden inside the ciphertext!
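Sticking with the same placeholder library as the snippet above, the recipient side of a nested JWT looks like this. app_private_key and sender_public_key are assumed to come from your own key store.

# Decrypt-then-Verify (recipient side)
decrypted_jws = jose_library.jwe.decrypt(encrypted_token, app_private_key)

# Only now is the signature visible, so verification happens second.
claims = jose_library.jws.verify(decrypted_jws, sender_public_key, alg='ES256')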

JWK management and rotation

Now we gotta talk about the keys themselves. A JWK (JSON Web Key) is just a standard way to represent a key as a JSON object instead of those messy PEM files. When you have a bunch of them, you put them in a JWKS (JSON Web Key Set).
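For reference, here is roughly what a public signing key looks like as a JWK, and how a JWKS wraps it. The coordinate values are shortened placeholders, not a real key.

# A public EC JWK (illustrative values) and the JWKS that carries it.
public_jwk = {
    "kty": "EC",                      # key type
    "crv": "P-256",                   # curve
    "x": "base64url-x-coordinate",    # placeholder for the real base64url value
    "y": "base64url-y-coordinate",
    "kid": "2026-01-signing-key",     # the Key ID you'll reference in token headers
    "use": "sig",
    "alg": "ES256",
}

jwks = {"keys": [public_jwk]}         # a JWKS is just a list of JWKs under "keys"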

The kid (Key ID) header is your best friend here. When you sign a token, you include the kid so the receiver knows exactly which key to use. This is what makes key rotation possible without breaking your app.

Here is how a rotation strategy usually works:

  1. Generate a new key: Add it to your JWKS endpoint, but don't use it for signing yet.
  2. Wait: Give your caches time to refresh.
  3. Switch: Start signing new tokens with the new key.
  4. Retire: Keep the old key in the JWKS for a while so old tokens can still be verified until they expire.

Don't hardcode your public keys; use a JWKS endpoint (usually at /.well-known/jwks.json) so you can swap keys whenever you need.
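A minimal lookup against that endpoint might look like the sketch below. The URL and the requests dependency are assumptions for this example; in production you'd add caching so you're not fetching the JWKS on every request.

# Resolve a token's kid against the published JWKS.
import requests

JWKS_URL = "https://auth.example.com/.well-known/jwks.json"   # example URL

def find_verification_key(kid: str) -> dict:
    jwks = requests.get(JWKS_URL, timeout=5).json()
    for key in jwks.get("keys", []):
        if key.get("kid") == kid:
            return key
    raise KeyError(f"No key with kid={kid!r}; the JWKS cache may need a refresh")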

Implementing passwordless with JOSE standards

Passwordless is about moving the security logic from the user's brain to the cryptographic layer. By combining passwordless flows with the jose framework, we can build systems that are both harder to hack and easier to use.

If you're tired of managing complex auth state, tools like MojoAuth make it pretty simple to implement passwordless magic links or OTP flows using signed tokens.

Passkeys are the big shift everyone is talking about. They use WebAuthn to create a public/private key pair on the user's device. When the user registers, the device keeps the private key and sends you the public key, which you save as a JWK on your server.
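Here is a sketch of just the storage step, assuming your WebAuthn library hands you the credential's public key as a cryptography EllipticCurvePublicKey object; that assumption, and the helper names, are mine, not part of any particular library.

# Persist a passkey's ES256 public key as a JWK record.
import base64

def b64url(raw: bytes) -> str:
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def passkey_to_jwk(credential_public_key, credential_id: str) -> dict:
    numbers = credential_public_key.public_numbers()   # x and y as integers
    return {
        "kty": "EC",
        "crv": "P-256",
        "x": b64url(numbers.x.to_bytes(32, "big")),
        "y": b64url(numbers.y.to_bytes(32, "big")),
        "kid": credential_id,          # reuse the credential ID as the key ID
        "alg": "ES256",
        "use": "sig",
    }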

Figure 4: The WebAuthn registration flow where a public key is stored as a JWK on the server.

As mentioned earlier in the section on algorithms, sticking to "Recommended+" choices like ES256 ensures that these mobile-heavy flows stay fast. Nobody wants a bulky RSA-signed token dragging down every request on a spotty 4G connection.

Threat modeling and breach prevention

So you've got your tokens signed and maybe even encrypted. Great. But if you're leaving your keys in a public S3 bucket, you're basically building a vault and leaving the door propped open with a brick.

The first line of defense is being incredibly picky about who a token is for. I've seen teams skip the aud (audience) and iss (issuer) claims because "we only have one API." Don't; the checks below matter, and there's a short sketch right after the list.

  • Strict Audience Validation: Your backend should always check that the aud claim matches its own identifier.
  • Short-lived access tokens: I usually aim for 15 to 60 minutes.
  • Secure JWKS endpoints: If someone spoofs your JWKS endpoint, they can serve their own public keys, sign their own tokens, and you'll trust them.
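Here is what those checks look like with PyJWT, used here purely as an illustration; the audience and issuer strings are example values for this sketch.

# Verify algorithm, audience, issuer, and expiry in one place.
import jwt  # PyJWT

def verify_for_this_api(token: str, public_key) -> dict:
    return jwt.decode(
        token,
        public_key,
        algorithms=["ES256"],                  # allow-list, never trust the token's alg
        audience="orders-api",                 # reject tokens minted for other services
        issuer="https://auth.example.com",     # reject tokens from unknown issuers
        options={"require": ["exp", "aud", "iss"]},   # missing claims fail closed
    )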

Most big identity breaches don't happen because someone broke the math. They happen because of "key sprawl." If your signing keys are sitting in a config file in git, you've already lost.

Figure 6: A threat model diagram showing potential attack vectors like key theft and token interception.

JOSE isn't a silver bullet; it's a toolkit. If you use it right (sign for integrity, encrypt for privacy, and pick fully-specified algorithms to avoid ambiguity), you're ahead of 90% of the pack. Keep your keys in an HSM, keep your tokens short, and for the love of everything, stop using alg: none.

David Kim

Full-Stack Developer & DevOps Architect

 

David Kim is a Full-Stack Developer and DevOps Architect with 11 years of experience building scalable web applications and authentication systems. Based in Vancouver, he currently works as a Principal Engineer at a fast-growing Canadian tech startup where he architected their zero-trust authentication platform. David is an AWS Certified Solutions Architect and has contributed to numerous open-source authentication projects. He's also a mentor at local coding bootcamps and co-organizes the Vancouver Web Developers meetup. Outside of coding, David is an avid rock climber and craft beer enthusiast who enjoys exploring British Columbia's mountain trails.
