Perficient Blogs | Expert Digital Insights (https://blogs.perficient.com/)

A Customer-Centric Shoptalk Spring 2025 (Thu, 03 Apr 2025)

Perficient’s experts recently attended Shoptalk Spring in Las Vegas, immersing themselves in three days of meetings and networking with brands and partners amidst the lively atmosphere of smoke-filled hallways, pulsating music, and dazzling lasers. Justin Racine, Principal of Commerce, shared his insights with CMSWire, and we’ve highlighted some of his key takeaways below.


The Golden Age of Retail

Retail and customer experience are about to enter a transformative era: the Golden Age of retail. Having evolved from the advent of department stores to the rise of shopping malls, the industry is now shifting its focus from products to people. Businesses are increasingly prioritizing human connections, bringing joy and excitement back into shopping. Retail will serve as a medium that inspires consumers to explore who they are, express their identity, and connect with the world around them.

Creating Customer Connections at Gap

Justin had the opportunity to hear from Gap CEO Richard Dickson, who underscored the importance of fostering meaningful connections between brands and their consumers. According to Dickson, Gap’s mission is to create products that empower customers to express their individuality. “We pride ourselves on giving customers the ability to make Gap their own—to wear it the way they want,” Dickson explained. He emphasized that while price and affordability matter, customers are willing to invest in experiences and products that elevate their sense of self.

Gap has successfully cultivated generational loyalty by creating memorable experiences for families. Parents shop at Gap for their kids, and those children grow up wearing the brand, forming a deep emotional connection. These cherished memories are often captured in photos, further embedding the brand into customers’ lives. By facilitating connections on a deeper, emotional level, Gap builds lasting generational impact and loyalty.

AI in Advertisements: A Conversation with Meta

While Shoptalk Spring emphasized the human side of consumer behavior, discussions around AI inevitably arose. Clara Shih, VP of Business AI at Meta, explored the future of branding through AI, focusing on Meta’s Advantage+ toolset. This suite enables businesses to deliver targeted media and content across various channels. Shih showcased new features, including location-based ads on Facebook that integrate maps directing customers to nearby stores. Another demo highlighted AI-powered live chat within ads, allowing consumers to engage with brands directly in their active channel. These innovative features fulfill customers’ desire for seamless interaction and enhance their ability to connect with brands on a human level.

A Conversation with Liza Lefkowski of Wayfair

Liza Lefkowski, Chief Merchant and VP of Stores at Wayfair, on stage at Shoptalk

Wayfair is also deepening its understanding of customers through the integration of data and experience. Liza Lefkowski, Chief Merchant and VP of Stores at Wayfair, discussed the brand’s expansion into physical retail and its aim to inspire and excite consumers. During her session, Lefkowski explained how store associates provide personalized guidance, bridging the gap left by an exclusively online presence. This approach fosters emotional connections between customers and the brand. “Stores are designed to stand on their own but also integrate seamlessly into the overall customer experience—it’s the immersive manifestation of Wayfair,” she said.

Retail Should Spark Emotion, Not Just Transactions

This spring marked Justin’s first time attending Shoptalk Spring, but the themes from the event echoed those from Shoptalk Fall last year: retail must delight, surprise, and connect with customers. While technology and AI are crucial, human connections remain the cornerstone of retail success. By inspiring customers to be the best versions of themselves, brands can create genuine, personal relationships that drive loyalty and satisfaction.

For more insights, visit Perficient’s retail and commerce expertise page.

To read Justin’s full article, head over to CMSWire.

 

How to Split Data with Newline Characters into Separate Rows in Excel Using Power Query (Thu, 03 Apr 2025)

When working with datasets in Excel, you might encounter situations where multiple values are stored in a single cell, separated by a newline character (added using Alt + Enter). This can make data analysis challenging.

In this blog, we’ll walk you through how to split such data into separate rows using Power Query, a powerful tool within Excel for data transformation.

Example Dataset:

Employee Name | Department | Skills (newline-separated in one cell)
Sarah         | Marketing  | SEO ↵ Content Writing
John          | IT         | Java ↵ Python
Emily         | HR         | Recruitment ↵ Onboarding
Michael       | Finance    | Budgeting
Jessica       | IT         | C++ ↵ JavaScript
Daniel        | Sales      | Negotiation

(Here ↵ marks a newline entered within a cell using Alt + Enter.)

Let’s consider the above example.

Step-by-Step Guide to Using Power Query

Step 1: Load Data into Power Query

  1. Select your dataset by pressing Ctrl+A (including headers).
  2. Go to the Data tab → click From Table/Range.
  3. In the dialog box, ensure My table has headers is checked → click OK.
  4. This opens the Power Query Editor in a new window.

Step 2: Split Column by Delimiter

    1. Select the column in which multiple values are separated by newlines.
    2. Under the Home tab, click Split Column and select By Delimiter.
    3. Under Advanced Options, choose whether to split the values into Rows or Columns. Because each value should become its own row, select Rows.
    4. Under Insert Special Characters, select Line Feed and click OK.

Once the result is loaded into a new sheet, the output looks like this:

Employee Name | Department | Skills
Sarah         | Marketing  | SEO
Sarah         | Marketing  | Content Writing
John          | IT         | Java
John          | IT         | Python
Emily         | HR         | Recruitment
Emily         | HR         | Onboarding
Michael       | Finance    | Budgeting
Jessica       | IT         | C++
Jessica       | IT         | JavaScript
Daniel        | Sales      | Negotiation

 

By following these steps, you can efficiently split data with newline characters into separate rows, making your data analysis much easier.
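The same transformation can also be scripted outside Excel. As a rough illustration only (this is not part of the Power Query steps above, and the sample data is the dataset from this post), here is a plain-Python sketch that expands newline-separated cell values into separate rows:

```python
# Rows whose "Skills" field holds newline-separated values,
# the same shape of data entered in Excel with Alt + Enter.
rows = [
    {"Employee Name": "Sarah", "Department": "Marketing", "Skills": "SEO\nContent Writing"},
    {"Employee Name": "John", "Department": "IT", "Skills": "Java\nPython"},
]

# One output row per skill, copying the other columns down.
expanded = [
    {**row, "Skills": skill}
    for row in rows
    for skill in row["Skills"].split("\n")
]

for row in expanded:
    print(row["Employee Name"], row["Department"], row["Skills"])
```

This mirrors what the Split Column → By Delimiter → Rows step does: the non-split columns are duplicated for each new row.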

Perficient Included in IDC Market Glance: Payer, 1Q25 (Wed, 02 Apr 2025)

Health insurers today are navigating intense technological and regulatory requirements, along with rising consumer demand for seamless digital experiences. Leading organizations are investing in advanced technologies and automations to modernize operations, streamline experiences, and unlock reliable insights. By leveraging scalable infrastructures, you can turn data into a powerful tool that accelerates business success.

IDC Market Glance: Payer, 1Q25

Perficient is proud to be included in the IDC Market Glance: Payer, 1Q25 (doc#US53200825, March 2025) report for the second year in a row. According to IDC, this report “provides a glance at the current makeup of the payer IT landscape, illustrates who some of the major players are, and depicts the segments and structure of the market.”

Perficient is included in the categories of IT Services and Data Platforms/Interoperability. IDC defines the IT Services segment as, “Systems integration organizations providing advisory, consulting, development, and implementation services. Some IT Services firms also have products/solutions.” The Data Platforms/Interoperability segment is defined by IDC as, “Firms that provide data, data aggregation, data translation, data as a service and/or analytics solutions; either as off-premise, cloud, or tools on premise used for every aspect of operations.”

Discover Strategic Investments for Innovation and Success

Our strategists are committed to driving innovative solutions and guiding insurers on their digital transformation journey. We feel that our inclusion in this report reinforces our expertise in leveraging digital capabilities to unlock personalized experiences and drive greater operational efficiencies with our clients’ highly regulated, complex healthcare data.

The ten largest health insurers in the United States have counted on us to help drive the outcomes that matter most to businesses and consumers. Our experts can help you pragmatically and confidently navigate the intense regulatory requirements and consumer trends influencing digital investments. Learn more and contact us to discover how we partner to boost efficiencies, elevate health outcomes, and create differentiated experiences that enhance consumer trust.

Deena Piquion from Xerox on Data, Disruption, and Digital Natives (Wed, 02 Apr 2025)

In the new episode of the “What If? So What?” podcast, Jim Hertzfeld and Deena Piquion, chief growth and disruption officer at Xerox, discuss how disruption and digital transformation can position companies to succeed in a rapidly changing technology landscape.

Deena is leading Xerox on a unique and pivotal reinvention journey as the company undergoes a significant transformation, expanding beyond its traditional print and copy services. Deena explains how the company is now focusing on enabling the modern workforce with AI-powered platforms, workflow automation, and IT solutions.

Data plays a crucial role in Xerox’s digital transformation strategy, and Deena highlights the importance of integrating data from various sources to create a unified view that enables better decision-making and more effective marketing.

Listen to the podcast to hear more about internal disruption and digital innovation!

Listen now on your favorite podcast platform or visit our website.

 

Subscribe Where You Listen

Apple | Spotify | Amazon | Overcast

Meet our Guest


Deena Piquion, Chief Growth and Disruption Officer, Xerox

Deena Piquion is chief growth and disruption officer at Xerox. She previously served as chief marketing officer, and senior vice president and general manager of Xerox Latin America operations. Prior to joining Xerox in 2019, she was with Tech Data Corporation, where she last served as vice president and general manager of Latin America & Caribbean.

She is a member of the Advisory Board of Teach for America Miami Dade County, a nonprofit organization dedicated to educational equity and excellence. Deena was awarded the Florida Diversity Council Glass Ceiling Award in 2016, was selected as a CRN Women of the Channel Honoree in 2017, and was named to Diversity First’s Top 50 Women in Tech 2021 and Top 100 CMOs in 2022.

Deena is actively engaged in her community and passionate about supporting children’s cancer research, and diversity and inclusion in technology. She is a dynamic blogger who created her own branded platform to share tips on personal and professional growth with an engaged following in the industry.

Connect with Deena

 

Meet the Host

Jim Hertzfeld

Jim Hertzfeld is Area Vice President, Strategy for Perficient.

For over two decades, he has worked with clients to convert market insights into real-world digital products and customer experiences that actually grow their business. More than just a strategist, Jim is a pragmatic rebel known for challenging the conventional and turning grand visions into actionable steps. His candid demeanor, sprinkled with a dose of cynical optimism, shapes a narrative that challenges and inspires listeners.

Connect with Jim:

LinkedIn | Perficient

Log Framework Integration in Azure Functions with Azure Cosmos DB (Wed, 02 Apr 2025)

Introduction

Logging is an essential part of application development, especially in cloud environments where monitoring and debugging are crucial. Azure Functions has no built-in way to persist application-level logs to a centralized database, which makes checking logs in the Azure portal tedious. This blog shows how to integrate NLog into Azure Functions and store all logs in a single database (Cosmos DB), providing a unified logging approach for better monitoring and debugging.

Steps to Integrate Logging Framework


 

1. Create an Azure Function Project

Begin by creating an Azure Function project using the Azure Function template in Visual Studio.

2. Install Required NuGet Packages

To enable logging using NLog, install the following NuGet packages:

Install-Package NLog
Install-Package NLog.Extensions.Logging
Install-Package Microsoft.Azure.Cosmos

 

 

3. Create and Configure Nlog.config

NLog uses an XML-based configuration file to define logging targets and rules. Create a new file named Nlog.config in the project root and configure it with the necessary settings.

Refer to the official NLog documentation for database target configuration: NLog Database Target

Important: Set Copy to Output Directory to Copy Always in the file properties to ensure deployment.
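As a starting point, a minimal NLog.config skeleton looks like the following. This is a sketch only: the file target below is a simple placeholder so the file is valid out of the box, and it should be replaced with the database target configuration from the NLog documentation linked above; all names are illustrative.

```xml
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <!-- Placeholder target; swap in the database target described
         in the NLog documentation referenced above. -->
    <target xsi:type="File" name="fileTarget"
            fileName="logs/app-${shortdate}.log"
            layout="${longdate}|${level:uppercase=true}|${logger}|${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Information" writeTo="fileTarget" />
  </rules>
</nlog>
```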


 

4. Create Log Database

Create an Azure Cosmos DB account with the SQL API.

Sample Cosmos DB Database and Container

  1. Database Name: LogDemoDb
  2. Container Name: Logs
  3. Partition Key: /Application

5. Define Necessary Variables

In the local.settings.json file, define the Cosmos DB connection string.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "CosmosDBConnectionString": "AccountEndpoint=https://your-cosmosdb.documents.azure.com:443/;AccountKey=your-account-key;"
  }
}


 

6. Configure NLog in Startup.cs

Modify Startup.cs to configure NLog and instantiate database connection strings and log variables.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using NLog.Extensions.Logging;
using Microsoft.Azure.Cosmos;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]
namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddLogging(loggingBuilder =>
            {
                loggingBuilder.ClearProviders();
                loggingBuilder.SetMinimumLevel(LogLevel.Information);
                loggingBuilder.AddNLog();
            });

            builder.Services.AddSingleton(new CosmosClient(
                Environment.GetEnvironmentVariable("CosmosDBConnectionString")));
        }
    }
}


 

7. Add Logs in Necessary Places

To ensure efficient logging, add logs according to the log level hierarchy. NLog’s standard levels, from most to least verbose, are Trace, Debug, Info, Warn, Error, and Fatal; use the verbose levels for diagnostics and reserve Error and Fatal for genuine failures.

Example Logging in Function Code:

 

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class MyFunction
{
    private readonly ILogger<MyFunction> _logger;
    private readonly CosmosClient _cosmosClient;
    private readonly Container _container;

    public MyFunction(ILogger<MyFunction> logger, CosmosClient cosmosClient)
    {
        _logger = logger;
        _cosmosClient = cosmosClient;

        // Initialize the Cosmos DB container created earlier
        _container = _cosmosClient.GetContainer("LogDemoDb", "Logs");
    }

    [FunctionName("MyFunction")]
    public async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer)
    {
        var logEntry = new
        {
            id = Guid.NewGuid().ToString(),
            Application = "MyFunctionApp", // must match the container's partition key path (/Application)
            timestamp = DateTime.UtcNow,
            logLevel = "Information",
            message = "Function executed at " + DateTime.UtcNow
        };

        // Insert the log into Cosmos DB, passing the partition key value (not the document id)
        await _container.CreateItemAsync(logEntry, new PartitionKey(logEntry.Application));

        _logger.LogInformation("Function executed at {time}", DateTime.UtcNow);
    }
}

8. Deployment

Once the function is ready, deploy it to Azure Function App using Visual Studio or Azure DevOps.

Deployment Considerations:

  • Define necessary environment variables in Azure Function Configuration Settings.
  • Ensure the Azure Function App can reach the Cosmos DB account (check firewall and network settings) to avoid connection issues.
  • Monitor logs using Application Insights for additional diagnostics.

Conclusion

By following these steps, you can successfully integrate NLog into your Azure Functions for efficient logging. This setup enables real-time monitoring, structured log storage, and improved debugging capabilities.

Tips for building top performer teams (Tue, 01 Apr 2025)

There’s no doubt that every Director or Manager wants a high-performance team that delivers the best results and allows them to focus on building new business opportunities.

Come on, let’s face it! If we compared a work team to a sports team, who wouldn’t want an FC Barcelona in soccer, the Dodgers in baseball, or the Philadelphia Eagles in American football?

It’s easy to think and say, right? But where does the secret to building high-performance teams lie?

Martin Zwilling, founder and CEO of Startup Professionals, Inc., recommends the following list of actions for both entrepreneurs and senior executives to achieve the highest performance from team members (Zwilling, 2020):

Clearly and iteratively communicate team goals and objectives: 

Don’t rely on those who understand the message quickly; at least repeat it five times in different forums to ensure it was heard and understood.

Define and document role content and standards for performance: 

Don’t assume that team members already know what the expected standards of excellence are.

Give team members the right to make decisions in their role. 

Remember that micromanagement is not an effective way to achieve top performance. Instead, you can practice process coaching and let the team make their own decisions and improve step by step.

Relay regular informal observations on progress and results. 

Take the time to provide informal feedback weekly or even daily. This will help address gaps gradually and increase the team members’ psychological safety.

Give team members the training, tools, and data to do the job. 

As a Scrum Master working in an agile framework, you are a servant leader. Team members cannot be top performers without necessary resources. Leaders should anticipate these requirements, listen carefully to feedback from team members, and provide resources on a timely basis.

Diligently provide follow-up and support on assistance requests. 

As a leader you should recognize and support your team in situations that go beyond their domain.

Reward positive results. 

Recognition is important for building team members’ confidence and the team’s health.

Related to this topic, the Center for Human Capital Innovation also provides some examples and key factors for high-performance teams:

The example of the 1992 US men’s Olympic basketball team, known as the “Dream Team,” tells us that “the essence of a high-performance team isn’t found in the individual capabilities of its members but in their ability to adapt, learn, and evolve into a synergistic unit. This transformation was marked by a shift in the team’s approach to playing together, emphasizing mutual understanding, trust, and a unified strategy” (Center for Human Capital Innovation, 2024).

Taking the paragraph above into consideration, high-performance teams rely on:

  • Shared Vision and Direction: aligning team members to a common objective.
  • Quality of Interaction: ensuring trust, open communication, and a willingness to embrace conflict.
  • Sense of Renewal: high-performing teams should feel empowered to take risks and innovate.

On the other hand, an Expert Panel of Forbes Councils Members offers these tips for optimizing the team’s performance while avoiding burnout:

  • Set boundaries and priorities between work and personal life.
  • Encourage your team to succeed by discussing goals so everyone is on the same page about priorities, timelines, and deadlines.
  • Identify tasks that can be automated so everyone has more time for learning and improving their performance.
  • Be transparent by sharing the business case, listening to the team’s feedback, and ensuring everyone understands the value their role provides to the business.

I hope these tips help you build the top-performing team you want. Be patient, but most importantly, work at it!

Bibliography:

Creating a Launch Checklist (Tue, 01 Apr 2025)

Are you a PM or BA who has been assigned a project or platform that is new to your company? If so, you may find that there’s a learning curve for everything that needs to be executed, especially when it comes to the launch. Not all platforms are the same; they can require different steps to go live. Below is a list of steps I take when creating a launch checklist.

Meet with Your Team

Start by meeting with your team and stakeholders to create a list of action items needed for the launch. Ask each individual what they need to complete, when they need to finish it, and how long it will take. Don’t just focus on activities for the day of the launch; also inquire about tasks that need to be completed in the days, weeks, and even months leading up to it. Remember, there may also be post-launch activities to consider.

List in Order

After compiling your action items, group them into time frames. I like to break them down into categories: one month before launch, two weeks before launch, the day before launch, the day of launch, and post-launch. Work with your team to identify any dependencies between tasks. Some team members may not be able to complete their tasks until others are finished, while some tasks can be done in parallel.

Creating the Checklist

Once you have your list of activities, you’re ready to create a checklist to distribute to your team. Consider including the following fields:

  • Name of the task
  • Start Date
  • End Date
  • Duration
  • Person Assigned to the Task


Distribute and Notify

After completing your checklist, share it with everyone on your team. It may be helpful to store it in a shared drive where all team members can access and update it. Depending on the activities required, you might also need to contact third parties or vendors to handle certain tasks on their end.

Update Often

As you work through the tasks, ensure that team members are updating the checklist regularly. If you’re focusing on action items to be completed before the launch, it’s a good idea to check in with the team during scrums or status meetings to confirm they are on track to complete everything on time.

Do you have any other tips or ideas on how to approach launch checklists? Feel free to leave a comment!

Understanding and Implementing OAuth2 and OpenID Connect in .NET (Tue, 01 Apr 2025)

Authentication and authorization are two crucial aspects of web development. In modern applications, it’s essential to ensure that users are who they say they are (authentication) and have permission to access specific resources (authorization). OAuth2 and OpenID Connect are two widely used protocols that help achieve both goals.

What is OAuth2?

OAuth2 (Open Authorization 2.0) is an authorization framework that enables third-party applications to access a user’s resources without requiring them to share their credentials (username and password). It allows for delegated access, meaning that users can grant specific, controlled access to their data without revealing their login information.

OAuth2 is commonly used to enable users to authenticate via their existing accounts from services like Google, Facebook, or Microsoft. This allows users to securely log in to applications without exposing their sensitive credentials to the requesting application.

Key Concepts in OAuth2

  1. Resource Owner: The user who owns the data and grants permission to the client application.
  2. Client Application: The application requesting access to the user’s resources (e.g., a mobile app or a web application).
  3. Authorization Server: The server responsible for authenticating the user and issuing access tokens.
  4. Resource Server: The server that hosts the protected resources and validates the access tokens provided by the client.
  5. Access Token: A token issued by the authorization server that grants the client access to the protected resources.

OAuth2 Flow

  1. The user is redirected to the authorization server (e.g., Google’s OAuth2 server).
  2. The user authenticates and grants permission for the client application to access specific data (e.g., their Google profile).
  3. The authorization server issues an authorization code.
  4. The client application exchanges the authorization code for an access token.
  5. The client application uses the access token to request protected resources from the resource server.
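Step 1 of this flow is simply a redirect to the authorization server’s endpoint with the client’s identifiers in the query string. As an illustration (the endpoint and parameter names follow Google’s documented OAuth2 defaults, but the client ID, redirect URI, and state value here are placeholders), the URL can be assembled like this:

```python
from urllib.parse import urlencode

# Build the authorization URL the user is redirected to in step 1.
# client_id and redirect_uri are placeholders for your own values.
params = {
    "client_id": "your-client-id.apps.googleusercontent.com",
    "redirect_uri": "https://localhost:5001/signin-google",
    "response_type": "code",          # ask for an authorization code (step 3)
    "scope": "openid email profile",  # OpenID Connect scopes
    "state": "random-anti-csrf-value",
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(auth_url)
```

In a .NET application, the authentication middleware configured later in this post builds this URL for you; the sketch just makes the moving parts of step 1 visible.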

Key Benefits of OAuth2

OAuth 2.0 is a widely adopted authorization framework that allows third-party applications to access user resources without sharing the credentials. It provides a secure and scalable way to manage authorization. Here are some key benefits of OAuth 2.0:

  1. Granular Access Control: Allows users to define fine-grained permissions for specific resources and grant third-party apps access to certain data or actions without providing blanket access to all their information.
  2. Improved Security: Credentials Protection and Scoped Access.
  3. Support for Multiple Grant Types: Supports several grant types (e.g., Authorization Code, Implicit, Client Credentials, and Resource Owner Password Credentials)
  4. Token-Based Authentication: Uses access tokens, which are temporary and can be scoped and time-limited.
  5. Token Expiry and Revocation: Tokens issued by OAuth 2.0 have an expiry time, which helps limit the duration of access.
  6. Interoperability: OAuth 2.0 is a well-defined, open standard that is widely supported by various service providers and applications, ensuring smooth integration between different systems and platforms.

 


 

What is OpenID Connect?

OpenID Connect (OIDC) is an identity layer built on top of OAuth2. It is used to verify the identity of the user and obtain their profile information. While OAuth2 is used for authorization, OpenID Connect extends OAuth2 to include authentication.

In simple terms, OAuth2 tells the client what the user is allowed to do (authorization), while OpenID Connect tells the client who the user is (authentication).

Key Concepts in OpenID Connect

  1. ID Token: This is a JWT (JSON Web Token) that contains information about the authenticated user. It is issued by the authorization server and can be used by the client application to authenticate the user.
  2. Authentication Request: The client sends a request to the authorization server to authenticate the user and receive an ID token along with an access token.
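An ID token is three base64url-encoded segments (header.payload.signature). The sketch below shows how the payload claims can be read; note that it deliberately skips signature verification, which a real application must perform against the provider’s published keys, and the toy token it builds is not a real Google token:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the claims segment of a JWT *without* verifying the signature."""
    payload_b64 = token.split(".")[1]
    # Restore the padding that base64url encoding strips.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy token so the example is self-contained.
claims = {"sub": "1234567890", "email": "user@example.com",
          "iss": "https://accounts.google.com"}
segment = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
toy_token = f"header.{segment}.signature"

print(decode_jwt_payload(toy_token)["email"])
```

In .NET, libraries such as the authentication middleware used later in this post handle both the decoding and the signature validation for you.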

OAuth2 and OpenID Connect in .NET

For better understanding, we’ll integrate with Google as the OAuth2 and OpenID Connect provider.

Step 1: Create a Google Developer Project

1.1 Open the Google Cloud Console

    • Open your browser and navigate to the Google Cloud Console.
    • Sign in to your Google account. If you’re not already signed in, you’ll be prompted to log in.

1.2 Create a New Project

    • On the top left of the page, you’ll see the Google Cloud Platform logo. To the right of the logo, there will be a dropdown that may say something like “Select a Project” or “My First Project.”
    • Click on this dropdown. A new window will appear, showing a list of your existing projects.
    • In the top right of the window, you’ll see a button that says “New Project.” Click on this button.

1.3 Fill in the Project Details

    • Project Name: Enter a name for your project.
    • After filling in the details, click Create.

Once your project is created, you’ll be redirected to the newly created project’s dashboard in the Google Cloud Console.


Step 2: Enable the Google+ API

2.1 Navigate to the APIs & Services Library

    • In the left sidebar, click on the hamburger icon (three horizontal lines) to open the navigation menu.
    • From the menu, go to APIs & Services > Library.

2.2 Search for the Google+ API

    • In the search bar at the top of the Library page, type Google+ API and press enter.
    • Click on the Google+ API result.
    • Then, click the Enable button to enable this API for your project.

Step 3: Create OAuth2 Credentials

3.1 Go to the Credentials Page

    • In the left sidebar, under APIs & Services, click on Credentials.

3.2 Create OAuth 2.0 Client ID

    • On the Credentials page, click the Create Credentials button at the top and select OAuth 2.0 Client ID.

3.3 Configure the OAuth Consent Screen

    • Before creating the OAuth credentials, you need to configure the OAuth consent screen. Click on the OAuth consent screen tab.
    • Choose External as the user type.

3.4 Fill in the Required Fields

    • App Name: Enter a name for your application.
    • User Support Email: Provide your email address.
    • Developer Contact Information: Enter your email address.
    • Click Save and Continue to proceed.

3.5 Create the OAuth 2.0 Client ID

    • After completing the OAuth consent screen setup, you’ll be asked to configure the OAuth credentials.

    • Select Web application as the application type.

    • Add the following Authorized redirect URI:

      Example Redirect URI (if your app is running on https://localhost:{{portnumber}}):

      https://localhost:{{portnumber}}/signin-google
    • Click Create.


Step 4: Obtain Client ID and Client Secret

4.1 Create OAuth Credentials

After completing the OAuth 2.0 configuration and clicking Create, a new window will appear with your Client ID and Client Secret.

4.2 Copy and Store Credentials

It is crucial to copy both the Client ID and Client Secret and store them securely. These credentials will be necessary for integrating Google authentication into your app and ensuring secure access to users’ Google accounts.

4.3 Use the Credentials in Your App

You will use these credentials in your application to authenticate users and interact with Google’s services. Keep them safe, as they allow access to sensitive user data.
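It is safer not to hard-code these values in source. One common approach — shown here as a sketch, where the "Google" section name is just a convention matching the Configuration["Google:ClientId"] lookups used later in this post — is to keep them in appsettings.json during development and move them to user secrets or a key vault for production:

```json
{
  "Google": {
    "ClientId": "your-client-id.apps.googleusercontent.com",
    "ClientSecret": "your-client-secret"
  }
}
```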

Set Up the .NET Core Project:

    1. Create a new ASP.NET Core web application using the template for Web Application (Model-View-Controller).
    2. Add the Microsoft.AspNetCore.Authentication.Google NuGet package.
    3. Configure OAuth2 and OpenID Connect in Program.cs

In the Program.cs file, configure the authentication middleware to use Google’s OAuth2 and OpenID Connect:

// Add authentication services (Google login, etc.)
builder.Services.AddAuthentication(options =>
{
    options.DefaultScheme = "Cookies";
    options.DefaultChallengeScheme = GoogleDefaults.AuthenticationScheme;
})
.AddCookie()
.AddGoogle(options =>
{
    options.ClientId = "client-id";
    options.ClientSecret = "client-secret";
    options.CallbackPath = "/signin-google";
    options.SaveTokens = true;
});

app.UseAuthentication();
app.UseAuthorization();

  • AddGoogle enables OAuth2 and OpenID Connect using Google as the identity provider.
  • SaveTokens = true ensures that both the access token and ID token are saved.

The .AddGoogle extension method comes from the Microsoft.AspNetCore.Authentication.Google NuGet package rather than the base framework, so install the package version that matches your target framework (for example, .NET 6.0, 7.0, or 9.0).

If you prefer not to take a dependency on the Google-specific package, or you need finer control over the endpoints, you can achieve the same result with the generic AddOAuth handler by pointing it at Google's OAuth2 endpoints. Here's how we can implement it:

.AddOAuth("Google", options =>
{
    options.ClientId = Configuration["Google:ClientId"];
    options.ClientSecret = Configuration["Google:ClientSecret"];
    options.CallbackPath = new PathString("/signin-google");
    options.AuthorizationEndpoint = "https://accounts.google.com/o/oauth2/auth";
    options.TokenEndpoint = "https://oauth2.googleapis.com/token";
    options.UserInformationEndpoint = "https://www.googleapis.com/oauth2/v3/userinfo";
    options.Scope.Add("openid");
    options.Scope.Add("profile");
    options.Scope.Add("email");
    options.SaveTokens = true;
});

Step 5: Create a Controller to Handle Authentication

Create a simple controller to handle login and display the authenticated user’s profile.

  • The Login action redirects users to Google’s login page.
  • The Logout action logs the user out.
  • The Profile action reads the authenticated principal and displays the ID token.

Identity Management in ASP.NET Core relies heavily on OAuth 2.0 or OpenID Connect for user authentication. The Profile action retrieves user information, typically stored in claims, such as the ID token, and uses it for user management. With the appropriate configuration and token handling, the application can securely manage user identity, retrieve profile data, and display relevant information to the user.
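The post doesn't include the controller code itself, so the following is a minimal sketch of what Step 5 describes. The controller name, routes, and the ViewData key are illustrative assumptions, not requirements of the framework:

```csharp
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Authentication.Google;
using Microsoft.AspNetCore.Mvc;

public class AccountController : Controller
{
    // Redirects the user to Google's login page.
    public IActionResult Login(string returnUrl = "/Account/Profile")
    {
        return Challenge(
            new AuthenticationProperties { RedirectUri = returnUrl },
            GoogleDefaults.AuthenticationScheme);
    }

    // Signs the user out of the local cookie session.
    public async Task<IActionResult> Logout()
    {
        await HttpContext.SignOutAsync(CookieAuthenticationDefaults.AuthenticationScheme);
        return RedirectToAction("Index", "Home");
    }

    // Reads the authenticated principal and the saved ID token.
    [HttpGet]
    public async Task<IActionResult> Profile()
    {
        if (User.Identity?.IsAuthenticated != true)
        {
            return RedirectToAction(nameof(Login));
        }

        // Available because SaveTokens = true was set when configuring authentication.
        ViewData["IdToken"] = await HttpContext.GetTokenAsync("id_token");
        return View(User);
    }
}
```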

Step 6: Add Views to Display User Information

Add a simple view to show the user’s profile in the Views/Account/Profile.cshtml:
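The view code isn't shown in the post; a minimal Razor sketch might look like the following. It assumes the Profile action passes the ClaimsPrincipal as the model and the ID token through ViewData["IdToken"] — both illustrative choices, not framework requirements:

```cshtml
@model System.Security.Claims.ClaimsPrincipal

<h2>Welcome, @Model.Identity?.Name</h2>

<ul>
    @foreach (var claim in Model.Claims)
    {
        <li>@claim.Type: @claim.Value</li>
    }
</ul>

<p>ID Token: @ViewData["IdToken"]</p>
```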

Step 7: Run the Application

Run the application using:

dotnet run

Conclusion

OAuth2 and OpenID Connect are powerful protocols for handling authentication and authorization in modern applications. By integrating these protocols into .NET applications, we can securely authenticate users, delegate access to resources, and ensure that only authorized users can access services.

By following these steps, we should now have a basic understanding of how to implement OAuth2 and OpenID Connect in a .NET application. These concepts are essential for any developer building secure, scalable, and modern web applications.

Comparing Figma-to-Compose Conversion Methods for Android Development https://blogs.perficient.com/2025/03/31/comparing-figma-to-compose-conversion-methods-for-android-development/ https://blogs.perficient.com/2025/03/31/comparing-figma-to-compose-conversion-methods-for-android-development/#respond Tue, 01 Apr 2025 01:55:39 +0000 https://blogs.perficient.com/?p=379466

The modern Android development landscape increasingly relies on two powerful tools: Figma for collaborative UI/UX design and Jetpack Compose for building native UIs declaratively. A crucial step in the development workflow is translating the polished designs from Figma into functional Compose code. But what’s the most effective way to do this?

Several approaches exist, each with its own strengths and weaknesses, impacting development speed, code quality, and flexibility. Let’s compare the primary methods available as of early 2025: Manual Conversion, Assisted Conversion (like the soon-to-be-sunset Relay), and AI Assistance.

1. Manual Conversion: The Foundation

This is the most traditional and widely practiced method. Developers meticulously examine the Figma design specifications (layouts, spacing, typography, colors, components) and manually write the corresponding Jetpack Compose code (Column, Row, Text, Image, custom composables, etc.). They implement state management, interaction logic, and navigation using standard Compose APIs and libraries like Navigation Compose.

Advantages:

  • Maximum Control & Flexibility: Developers have complete control over the generated code, ensuring it meets specific architectural patterns, performance requirements, and accessibility standards.
  • Optimal Code Quality: Manual implementation allows for clean, idiomatic, and maintainable Compose code tailored to the project’s needs.
  • Deep Understanding: This process forces developers to deeply understand both the design and Compose principles, leading to better overall skills.
  • Best for Complexity: Handles complex layouts, custom interactions, intricate state management, and dynamic UIs effectively.

Disadvantages:

  • Time-Consuming: Can be the slowest method, especially for complex screens or large applications.
  • Prone to Human Error: Manual translation can introduce inconsistencies or visual discrepancies compared to the Figma design if not done carefully.
  • Requires Strong Compose Skills: Developers need a solid understanding of Jetpack Compose and its best practices.
  • Effectiveness for Flows & Navigation: Navigation and complex user flows are implemented entirely by the developer using libraries like Navigation Compose. This ensures the navigation logic is robust and integrated correctly with the app’s architecture.
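As a concrete illustration, a simple product card specified in Figma might be translated by hand into a composable like the one below. The names, dimensions, and typography choices are illustrative, standing in for values read from a hypothetical design spec:

```kotlin
import androidx.compose.foundation.layout.*
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

@Composable
fun ProductCard(title: String, price: String, modifier: Modifier = Modifier) {
    // Spacing and text styles are read manually from the Figma inspect panel.
    Column(
        modifier = modifier
            .fillMaxWidth()
            .padding(16.dp)
    ) {
        Text(text = title, style = MaterialTheme.typography.titleMedium)
        Spacer(modifier = Modifier.height(8.dp))
        Row(verticalAlignment = Alignment.CenterVertically) {
            Text(text = price, style = MaterialTheme.typography.bodyLarge)
        }
    }
}
```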

2. Assisted Conversion: Tools Aiming for Automation (Caution: Relay Sunset)

Tools in this category aim to automate parts of the conversion process, often via plugins or specific workflows. Relay was a prominent example from Google. It allowed designers to annotate Figma components and developers to import them as Compose code packages.

Relay (Sunsetting April 30, 2025):

Original Intent: Relay aimed to streamline the designer-developer handoff for UI components, translating design system elements into reusable Compose code.

Limitations (Even Before Sunset): Often best suited for simpler, static components. Handling complex state, interactions, or highly dynamic layouts usually required significant manual code modification after import. The generated code sometimes wasn’t as clean or idiomatic as manually written code. It introduced its own workflow and potential build complexities.

Current Status (Crucial): Relay is being sunset on April 30, 2025. This means it will no longer be supported or updated, making it not a viable option for new projects and a risk for existing ones. Teams relying on it need to migrate away.

Other Potential Tools (Less Common/Mature for Compose):

While other design-to-code tools exist for different frameworks, mature, widely adopted, and robust Figma-to-Compose converters (beyond Relay) haven’t fully materialized or gained significant traction in the Android community for generating production-ready code.

Advantages (Historically/Conceptually):

  • Potentially faster initial component generation and better consistency, provided the tool’s workflow is strictly followed.

Disadvantages:

  • Relay’s sunset makes it obsolete. Other tools are often immature for Compose, generate subpar code, lack flexibility, struggle with complexity, and can lead to vendor lock-in or specific workflow dependencies.
  • Effectiveness for Flows & Navigation: These tools typically focus on individual components or screens, not application-wide navigation logic. Navigation still requires manual implementation.

3. AI Assistance: The Intelligent Helper

Developers leverage Large Language Models (LLMs) and AI coding assistants (like Gemini, GitHub Copilot powered by OpenAI’s models, ChatGPT, DeepSeek Coder, etc.) to aid in the conversion process. This can take several forms:

  • Pasting descriptions or screenshots of Figma elements and asking the AI to generate corresponding Compose code snippets.
  • Asking the AI to refactor existing code or implement specific Compose patterns.
  • Using AI code completion features within the IDE (like Studio Bot powered by Gemini in Android Studio).
  • Feeding simplified representations of layouts (e.g., textual descriptions of structure and attributes) to the AI.

Determining the single best AI is difficult and context-dependent, and it is worth bearing in mind that new assistant models which outperform existing ones are currently emerging within a matter of months:

  • Gemini (especially integrated into Android Studio as Studio Bot): Strong potential due to Google’s focus on Android. Designed to understand Android development contexts, including Compose. Good for generating boilerplate, explaining code, and answering Android-specific questions.
  • GitHub Copilot (OpenAI): Widely used, excellent code completion and suggestion capabilities across many languages, including Kotlin/Compose. Learns from the context of your project.
  • ChatGPT (OpenAI): Versatile for generating code snippets from descriptions, explaining concepts, and brainstorming approaches. Less integrated into the IDE workflow compared to Copilot or Studio Bot.
  • DeepSeek Coder: Specifically trained on vast amounts of code, potentially offering strong code generation capabilities, though perhaps less context-aware of specific Android/Compose nuances compared to Gemini/Studio Bot unless specifically prompted.

For Android development specifically, Gemini (via Studio Bot) has a strategic advantage due to its integration and Google’s focus. GitHub Copilot is a very strong general-purpose contender. The best choice often depends on individual workflow preference, specific task complexity, and current model performance (which evolves rapidly). Experimentation is key.

Strengths:

  • Speed Boost: Can significantly accelerate the generation of boilerplate code and simple components.
  • Learning Aid: Helps developers understand how to implement certain UI elements in Compose.
  • Idea Generation: Can suggest different implementation approaches.
  • Reduces Tedium: Automates the writing of repetitive code patterns.

Weaknesses:

  • Code Quality Varies: AI-generated code requires careful review and often refactoring to ensure it meets quality standards, follows best practices, and handles state correctly.
  • Limited Context/Complexity: AI often struggles with complex layouts, intricate state dependencies, accessibility nuances, and the overall application architecture without very specific guidance. It typically cannot “read” a Figma file directly and understand all its implications.
  • Not a Full Solution: AI is an assistant, not a replacement for a developer. It rarely produces production-ready screens directly from a design prompt without significant developer intervention.
  • Requires Verification: Always needs human oversight to catch errors, ensure visual fidelity, and implement correct logic.
  • Effectiveness for Flows & Navigation: AI can generate boilerplate code for Navigation Compose (e.g., setting up a NavHost), but it cannot design or implement the complex logic of your app’s navigation graph based solely on a Figma design. This still requires manual developer effort and architectural understanding.
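To make that last point concrete, the kind of Navigation Compose boilerplate an assistant can reliably produce looks like the sketch below — but which screen leads where still comes from the developer, not from the design file. The route strings and screen composables here are placeholders:

```kotlin
import androidx.compose.runtime.Composable
import androidx.navigation.compose.NavHost
import androidx.navigation.compose.composable
import androidx.navigation.compose.rememberNavController

// Placeholder screens; real content would be implemented by the developer.
@Composable fun HomeScreen(onItemClick: (String) -> Unit) { /* screen content elided */ }
@Composable fun DetailScreen(id: String?) { /* screen content elided */ }

@Composable
fun AppNavGraph() {
    val navController = rememberNavController()
    NavHost(navController = navController, startDestination = "home") {
        // Each destination maps a route string to a screen composable.
        composable("home") {
            HomeScreen(onItemClick = { id -> navController.navigate("detail/$id") })
        }
        composable("detail/{id}") { backStackEntry ->
            DetailScreen(id = backStackEntry.arguments?.getString("id"))
        }
    }
}
```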

Conversion Method Comparison

| Feature | Manual Conversion | Assisted (Relay, plugins) | AI Assistance |
|---|---|---|---|
| UI Accuracy | High (Developer Controlled) | Medium (Tool Dependent) | Medium (Requires Refinement) |
| Code Quality | High (Developer Controlled) | Low-Medium (Often Needs Refactor) | Variable (Requires Review) |
| Flexibility | Very High | Low | Medium (Assists, doesn’t dictate) |
| Speed | Slow | Medium (Initial Setup) – Fast (Plugins) | Fast (for Boilerplate/Snippets) |
| Complexity Handling | Excellent | Poor | Poor-Medium (Best for simple parts) |
| Navigation/Flows | Full Manual Implementation | Not Handled | Boilerplate Help Only |
| Learning Curve | High (Requires Compose Skills) | Medium (Tool Specific) – Low (Plugins) | Low-Medium (Prompting/Review) |
| Current Viability | High (Standard) | None (Sunset Relay) – Low (Plugins) | High (As an Assistant) |

Conclusion: The Hybrid Approach

As of March 2025, with Relay sunsetting, there is no magic bullet for instantly converting complex Figma designs into perfect, production-ready Jetpack Compose applications.

  • Manual Conversion remains the most reliable method for achieving high-quality, flexible, and maintainable UI code, especially for complex screens and ensuring correct logic and state management. It’s essential for implementing navigation flows.
  • AI Assistance (using tools like Gemini/Studio Bot, Copilot, ChatGPT) is rapidly becoming an indispensable helper. It excels at accelerating development by generating boilerplate, suggesting implementations for simpler components, and reducing repetitive tasks. However, it requires significant developer oversight, review, and refinement.

The most effective strategy today is often a hybrid one:

  1. Analyze the Figma design thoroughly.
  2. Manually structure the core screen layouts, navigation, and state management logic in Compose.
  3. Leverage AI to generate code for simpler, repetitive UI elements (buttons, text fields, basic layouts) or to get initial boilerplate.
  4. Critically review and refactor any AI-generated code, integrating it into your manual structure.
  5. Manually implement complex components, interactions, and precise styling details.

Ultimately, developer skill, a deep understanding of Jetpack Compose, and careful attention to detail are paramount, regardless of the tools used. AI can augment this process, but it doesn’t replace the need for expert human developers to bridge the final gap between design and functional, high-quality code.

We Are Perficient: Transforming the Digital Strategies with Adobe https://blogs.perficient.com/2025/03/31/we-are-perficient-transforming-the-mobile-experience-and-digital-strategies-with-adobe/ https://blogs.perficient.com/2025/03/31/we-are-perficient-transforming-the-mobile-experience-and-digital-strategies-with-adobe/#comments Mon, 31 Mar 2025 20:26:09 +0000 https://blogs.perficient.com/?p=379495

Today in our “We Are Perficient” series, we explore how businesses can take their digital experience to the next level through mobile optimization. In an exclusive conversation with Jonathan Crockett, Managing Director of Go-To-Market, Sales, and Solutions at Perficient, we dive into key strategies to ensure brands deliver seamless, high-impact experiences on mobile devices. 

In today’s digital world, user experience is everything. Companies looking to stand out must provide seamless, personalized, and optimized interactions at every touchpoint. In this video, we explore how the combination of Artificial Intelligence, advanced digital experience strategies, and collaboration with technology leaders like Adobe is redefining the way brands connect with their customers. 

Optimizing for a Mobile-First World 

 Today, most digital interactions happen on mobile devices. Without a well-optimized mobile strategy, brands risk losing conversions and engagement. From ultra-fast loading times to intuitive and accessible interfaces, mobile optimization is no longer optional—it’s essential to improving customer retention and conversion rates. 

 AI-Driven Personalization 

Artificial intelligence is transforming user experiences by enabling real-time personalization based on data. From content recommendations to adaptive interfaces that respond to user behavior, AI helps deliver unique and relevant experiences at every interaction. This not only enhances customer satisfaction but also boosts lifetime value and brand loyalty. 

Adobe and the Evolution of Digital Strategies

As an Adobe strategic partner, Perficient helps businesses unlock the full potential of Adobe’s cutting-edge solutions. From Adobe Experience Manager to Adobe Sensei, our strategies merge creativity and technology to design immersive, scalable, and highly effective digital experiences. 

Ready to Take Your Digital Strategies to the Next Level? 

The future of digital experience lies in personalization, optimization, and continuous innovation. If you’re looking to transform how your customers interact with your brand, Perficient can help. 

Contact us today and discover how we can elevate your digital strategy. 

End-to-End Lineage and External Raw Data Access in Databricks https://blogs.perficient.com/2025/03/31/eference-architecture-end-to-end-lineage-external-raw-data-access-databricks/ https://blogs.perficient.com/2025/03/31/eference-architecture-end-to-end-lineage-external-raw-data-access-databricks/#respond Mon, 31 Mar 2025 20:01:27 +0000 https://blogs.perficient.com/?p=379496

Achieving end-to-end lineage in Databricks while allowing external users to access raw data can be a challenging task. In Databricks, leveraging Unity Catalog for end-to-end lineage is a best practice. However, enabling external users to access raw data while maintaining security and lineage integrity requires a well-thought-out architecture. This blog outlines a reference architecture to achieve this balance.

Key Requirements

To meet the needs of both internal and external users, the architecture must:

  1. Maintain end-to-end lineage within Databricks using Unity Catalog.
  2. Allow external users to access raw data without compromising governance.
  3. Secure data while maintaining flexibility for different use cases.

Recommended Architecture

1. Shared Raw Data Lake (Pre-Bronze)

The architecture starts with a shared data lake as a landing zone for raw, unprocessed data from various sources. This data lake is located in external cloud storage, such as AWS S3 or Azure Data Lake, and is independent of Databricks. Access to this data is managed using IAM roles and policies, allowing both Databricks and external users to interact with the data without overlapping permissions.

Benefits:

  • External users can access raw data without direct entry into the Databricks Lakehouse.
  • Secure and isolated raw data management.
  • Maintains data availability for non-Databricks consumers.
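As an illustration of the IAM-based access described above, a cross-account S3 bucket policy granting an external consumer read-only access to the shared raw zone might look like the following. The account ID, role name, and bucket name are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ExternalReadOnlyRawAccess",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:role/external-analytics-reader" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::shared-raw-lake",
        "arn:aws:s3:::shared-raw-lake/*"
      ]
    }
  ]
}
```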

2. Bronze Layer (Managed by Databricks)

The bronze layer ingests raw data from the shared data lake into Databricks. Using Delta Live Tables (DLT), data is processed and stored as managed or external Delta tables. Unity Catalog governs these tables, enforcing fine-grained access control to maintain data security and lineage. End-to-end lineage in Databricks begins with the bronze layer and can easily be maintained through the silver and gold layers by using DLT.

Governance:

  • Permissions are enforced through Unity Catalog.
  • Data versioning and lineage tracking are maintained within Databricks.
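A minimal sketch of this bronze ingestion as a DLT table, using Auto Loader against the shared lake — the path and file format are placeholders, and this code runs only inside a Databricks DLT pipeline, where the spark session is provided by the runtime:

```python
import dlt

@dlt.table(comment="Bronze: raw events landed from the shared data lake")
def bronze_events():
    # Auto Loader incrementally picks up new files from the external raw zone.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://shared-raw-lake/events/")
    )
```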

3. Silver and Gold Layers (Processed Data)

Subsequent data processing transforms bronze data into refined (silver) and aggregated (gold) tables. These layers are exclusively managed within Databricks to ensure lineage continuity, leveraging Delta Lake’s optimization features.

Access:

  • Internal users access data through Unity Catalog with appropriate permissions.
  • External users do not have direct access to these curated layers, preserving data quality.

Access Patterns

  • External Users: Access raw data from the shared data lake through configured IAM policies. No direct access to Databricks-managed bronze tables.
  • Internal Users: Access the full data pipeline from bronze to gold within Databricks, leveraging Unity Catalog for secure and controlled access.

Why This Architecture Works

  • Security: Separates raw data from managed bronze, reducing exposure.
  • Governance: Unity Catalog maintains strict access control and lineage.
  • Performance: Internal data processing benefits from Delta Lake optimizations, while raw data remains easily accessible for external systems.

End-to-end lineage in Databricks

This reference architecture offers a balanced approach to handling raw data access while maintaining governance and lineage within Databricks. By isolating raw data in a shared lake and managing processed data within Databricks, organizations can effectively support both internal analytics and external data sharing.

Contact us to learn more about how to empower your teams with the right tools, processes, and training to unlock Databricks’ full potential across your enterprise.

Perficient Publishes 2024 Community Impact Report https://blogs.perficient.com/2025/03/31/perficient-publishes-2024-community-impact-report/ https://blogs.perficient.com/2025/03/31/perficient-publishes-2024-community-impact-report/#respond Mon, 31 Mar 2025 15:47:02 +0000 https://blogs.perficient.com/?p=379449

At Perficient, we’re passionate about forging the future by advancing STEM education and improving health and well-being. Philanthropy is at the heart of our global company. Beyond obsessing over outcomes and delivering innovative digital solutions for our clients, Perficient and our colleagues share another common goal—making a difference in the communities where we live and work.  

Cover Image

We’re proud to share the 2024 Community Impact Report, which celebrates the unique acts of kindness and generosity found across our global company.   

This engaging report captures some of the incredible stories, activities, and results of how our colleagues are giving back through their time, talent, and resources. Dive into the 2024 Community Impact Report to learn how Perficient and our colleagues are making a difference around the globe.

Perficient’s Commitment to Our Global Communities 

Perficient is guided by our two corporate giving pillars that directly align to our business—advancing STEM education and improving health and well-being. These pillars have a direct impact on the philanthropic activities and initiatives that are outlined in the 2024 Community Impact Report.  

Read the report to learn how we: 

  • Fought food insecurity during Hunger Action Month by donating more than 11,000 pounds of food and 46,000+ meals to those in need. 
  • And much more! 

Share the 2024 Community Impact Report 

This report highlights just some of the ways Perficient is making a difference. You can discover even more ways we’re giving back on the Life at Perficient blog and on Perficient’s social media channels. 

See More: Perficient Celebrates the 2024 Community Impact Report on LinkedIn 

Read our previous Community Impact Reports, including the 2023 Community Impact Report and 2022 Community Impact Report.   

Perficient believes that every act of kindness can make a difference, no matter how big or small, and that we each have the power to make the world a healthier and happier place. Together, we’re building a better world.


READY TO GROW YOUR CAREER?  

It’s no secret our success is because of our people. No matter the technology or time zone, our colleagues are committed to delivering innovative, end-to-end digital solutions for the world’s biggest brands, and we bring a collaborative spirit to every interaction. We’re always seeking the best and brightest to work with us. Join our team and experience a culture that challenges, champions, and celebrates our people.  

Visit our Careers page to see career opportunities and more!  

Go inside Life at Perficient and connect with us on LinkedIn, YouTube, Facebook, TikTok, and Instagram.
