Perficient’s experts recently attended Shoptalk Spring in Las Vegas, immersing themselves in three days of meetings and networking with brands and partners amidst the lively atmosphere of smoke-filled hallways, pulsating music, and dazzling lasers. Justin Racine, Principal of Commerce, shared his insights with CMSWire, and we’ve highlighted some of his key takeaways below.
Retail and customer experience are about to enter a transformative era—the Golden Age of retail. From the advent of department stores to the rise of shopping malls, consumers and brands are now shifting focus toward people over products. Businesses are increasingly prioritizing human connections, bringing joy and excitement back into shopping. Retail will serve as a medium for inspiring consumers to explore who they are, express their identity, and connect with the world around them.
Justin had the opportunity to hear from Gap CEO Richard Dickson, who underscored the importance of fostering meaningful connections between brands and their consumers. According to Dickson, Gap’s mission is to create products that empower customers to express their individuality. “We pride ourselves on giving customers the ability to make Gap their own—to wear it the way they want,” Dickson explained. He emphasized that while price and affordability matter, customers are willing to invest in experiences and products that elevate their sense of self.
Gap has successfully cultivated generational loyalty by creating memorable experiences for families. Parents shop at Gap for their kids, and those children grow up wearing the brand, forming a deep emotional connection. These cherished memories are often captured in photos, further embedding the brand into customers’ lives. By facilitating connections on a deeper, emotional level, Gap builds lasting generational impact and loyalty.
While Shoptalk Spring emphasized the human side of consumer behavior, discussions around AI inevitably arose. Clara Shih, VP of Business AI at Meta, explored the future of branding through AI, focusing on Meta’s Advantage+ toolset. This suite enables businesses to deliver targeted media and content across various channels. Shih showcased new features, including location-based ads on Facebook that integrate maps directing customers to nearby stores. Another demo highlighted AI-powered live chat within ads, allowing consumers to engage with brands directly in their active channel. These innovative features fulfill customers’ desire for seamless interaction and enhance their ability to connect with brands on a humanistic level.
Wayfair is also deepening its understanding of customers through the integration of data and experience. Liza Lefkowski, Chief Merchant and VP of Stores at Wayfair, discussed the brand’s expansion into physical retail and its aim to inspire and excite consumers. During her session, Lefkowski explained how store associates provide personalized guidance, bridging the gap left by an exclusively online presence. This approach fosters emotional connections between customers and the brand. “Stores are designed to stand on their own but also integrate seamlessly into the overall customer experience—it’s the immersive manifestation of Wayfair,” she said.
This spring marked Justin’s first time attending Shoptalk Spring, but the themes from the event echoed those from Shoptalk Fall last year: retail must delight, surprise, and connect with customers. While technology and AI are crucial, human connections remain the cornerstone of retail success. By inspiring customers to be the best versions of themselves, brands can create genuine, personal relationships that drive loyalty and satisfaction.
For more insights, visit Perficient’s retail and commerce expertise page.
To read Justin’s full article, head over to CMSWire.
When working with datasets in Excel, you might encounter situations where multiple values are stored in a single cell, separated by a newline character (added using Alt + Enter). This can make data analysis challenging.
In this blog, we’ll walk you through how to split such data into separate rows using Power Query, a powerful tool within Excel for data transformation.
Employee Name | Department | Skills |
---|---|---|
Sarah | Marketing | SEO<br>Content Writing |
John | IT | Java<br>Python |
Emily | HR | Recruitment<br>Onboarding |
Michael | Finance | Budgeting |
Jessica | IT | C++<br>JavaScript |
Daniel | Sales | Negotiation |
Let’s consider the above example.
Step 1: Load Data into Power Query

Select any cell in your data range, then go to the Data tab and choose From Table/Range. Excel opens the data in the Power Query Editor.
Step 2: Split Column by Delimiter

In the Power Query Editor, select the Skills column and choose Split Column > By Delimiter. Pick Custom as the delimiter, enter the line-feed character (#(lf)), and under Advanced options select Split into Rows. Finally, use Close & Load to load the result back into Excel.
Once the result is loaded into the new sheet, the output data looks like this:
Employee Name | Department | Skills |
---|---|---|
Sarah | Marketing | SEO |
Sarah | Marketing | Content Writing |
John | IT | Java |
John | IT | Python |
Emily | HR | Recruitment |
Emily | HR | Onboarding |
Michael | Finance | Budgeting |
Jessica | IT | C++ |
Jessica | IT | JavaScript |
Daniel | Sales | Negotiation |
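For readers who work outside Excel, the same split-to-rows transformation can be sketched in Python with pandas. This is an illustrative equivalent of the Power Query steps above, not part of the Excel workflow; the data mirrors the example table.

```python
import pandas as pd

# Sample data: the Skills column holds multiple values separated by a
# newline, mirroring Excel cells entered with Alt + Enter
df = pd.DataFrame({
    "Employee Name": ["Sarah", "John"],
    "Department": ["Marketing", "IT"],
    "Skills": ["SEO\nContent Writing", "Java\nPython"],
})

# Split each Skills cell on the newline, then emit one skill per row
df["Skills"] = df["Skills"].str.split("\n")
result = df.explode("Skills").reset_index(drop=True)

print(result)
```

The `explode` call is what Power Query’s “Split into Rows” option does under the hood: the other columns (Employee Name, Department) are repeated for every value produced by the split.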
By following these steps, you can efficiently split data with newline characters into separate rows, making your data analysis much easier.
Health insurers today are navigating intense technological and regulatory requirements, along with rising consumer demand for seamless digital experiences. Leading organizations are investing in advanced technologies and automations to modernize operations, streamline experiences, and unlock reliable insights. By leveraging scalable infrastructures, you can turn data into a powerful tool that accelerates business success.
Perficient is proud to be included in the IDC Market Glance: Payer, 1Q25 (doc#US53200825, March 2025) report for the second year in a row. According to IDC, this report “provides a glance at the current makeup of the payer IT landscape, illustrates who some of the major players are, and depicts the segments and structure of the market.”
Perficient is included in the categories of IT Services and Data Platforms/Interoperability. IDC defines the IT Services segment as, “Systems integration organizations providing advisory, consulting, development, and implementation services. Some IT Services firms also have products/solutions.” The Data Platforms/Interoperability segment is defined by IDC as, “Firms that provide data, data aggregation, data translation, data as a service and/or analytics solutions; either as off-premise, cloud, or tools on premise used for every aspect of operations.”
Our strategists are committed to driving innovative solutions and guiding insurers on their digital transformation journey. We feel that our inclusion in this report reinforces our expertise in leveraging digital capabilities to unlock personalized experiences and drive greater operational efficiencies with our clients’ highly regulated, complex healthcare data.
The ten largest health insurers in the United States have counted on us to help drive the outcomes that matter most to businesses and consumers. Our experts can help you pragmatically and confidently navigate the intense regulatory requirements and consumer trends influencing digital investments. Learn more and contact us to discover how we partner to boost efficiencies, elevate health outcomes, and create differentiated experiences that enhance consumer trust.
In the new episode of the “What If? So What?” podcast, Jim Hertzfeld and Deena Piquion, chief growth and disruption officer at Xerox, discuss how disruption and digital transformation can position companies to succeed in a rapidly changing technology landscape.
Deena is leading Xerox on a unique and pivotal reinvention journey as the company undergoes a significant transformation, expanding beyond its traditional print and copy services. Deena explains how the company is now focusing on enabling the modern workforce with AI-powered platforms, workflow automation, and IT solutions.
Data plays a crucial role in Xerox’s digital transformation strategy, and Deena highlights the importance of integrating data from various sources to create a unified view that enables better decision-making and more effective marketing.
Listen to the podcast to hear more about internal disruption and digital innovation!
Listen now on your favorite podcast platform or visit our website.
Apple | Spotify | Amazon | Overcast
Deena Piquion, Chief Growth and Disruption Officer, Xerox
Deena Piquion is chief growth and disruption officer at Xerox. She previously served as chief marketing officer, and senior vice president and general manager of Xerox Latin America operations. Prior to joining Xerox in 2019, she was with Tech Data Corporation, where she last served as vice president and general manager of Latin America & Caribbean.
She is a member of the Advisory Board of Teach for America Miami Dade County, a nonprofit organization dedicated to educational equity and excellence. Deena was awarded the Florida Diversity Council Glass Ceiling Award in 2016, was selected as a CRN Women of the Channel Honoree in 2017, and was named to Diversity First’s Top 50 Women in Tech 2021 and Top 100 CMOs in 2022.
Deena is actively engaged in her community and passionate about supporting children’s cancer research, and diversity and inclusion in technology. She is a dynamic blogger who created her own branded platform to share tips on personal and professional growth with an engaged following in the industry.
Connect with Deena
Jim Hertzfeld is Area Vice President, Strategy for Perficient.
For over two decades, he has worked with clients to convert market insights into real-world digital products and customer experiences that actually grow their business. More than just a strategist, Jim is a pragmatic rebel known for challenging the conventional and turning grand visions into actionable steps. His candid demeanor, sprinkled with a dose of cynical optimism, shapes a narrative that challenges and inspires listeners.
Logging is an essential part of application development, especially in cloud environments where monitoring and debugging are crucial. In Azure Functions, there is no built-in provision to log application-level details into a centralized database, making it challenging to check logs every time in the Azure portal. This blog focuses on integrating NLog into Azure Functions to store all logs in a single database (Cosmos DB), ensuring a unified logging approach for better monitoring and debugging.
Begin by creating an Azure Function project using the Azure Function template in Visual Studio.
To enable logging using NLog, install the following NuGet packages:
Install-Package NLog
Install-Package NLog.Extensions.Logging
Install-Package Microsoft.Azure.Cosmos
NLog uses an XML-based configuration file to define logging targets and rules. Create a new file named Nlog.config in the project root and configure it with the necessary settings.
Refer to the official NLog documentation for database target configuration: NLog Database Target
Important: Set Copy to Output Directory to Copy Always in the file properties to ensure deployment.
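As a starting point, a minimal Nlog.config might look like the following sketch. The target name, layout, and file path are illustrative placeholders; for this blog’s scenario you would swap the file target for a database target as described in the NLog documentation linked above.

```xml
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <!-- Illustrative file target; replace with a database target per the NLog docs -->
    <target name="fileTarget" xsi:type="File"
            fileName="logs/app-${shortdate}.log"
            layout="${longdate}|${level:uppercase=true}|${logger}|${message}|${exception:format=tostring}" />
  </targets>
  <rules>
    <!-- Route everything at Info and above to the target -->
    <logger name="*" minlevel="Info" writeTo="fileTarget" />
  </rules>
</nlog>
```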
Create an Azure Cosmos DB account with the SQL API.
Sample Cosmos DB Database and Container
In the local.settings.json file, define the Cosmos DB connection string.
```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "CosmosDBConnectionString": "AccountEndpoint=https://your-cosmosdb.documents.azure.com:443/;AccountKey=your-account-key;"
  }
}
```
Modify Startup.cs to configure NLog and instantiate database connection strings and log variables.
```csharp
using System;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using NLog.Extensions.Logging;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Route all logging through NLog
            builder.Services.AddLogging(loggingBuilder =>
            {
                loggingBuilder.ClearProviders();
                loggingBuilder.SetMinimumLevel(LogLevel.Information);
                loggingBuilder.AddNLog();
            });

            // Register a shared CosmosClient built from the connection string
            builder.Services.AddSingleton(new CosmosClient(
                Environment.GetEnvironmentVariable("CosmosDBConnectionString")));
        }
    }
}
```
To ensure efficient logging, add logs according to the standard NLog log level hierarchy, from most to least verbose: Trace, Debug, Info, Warn, Error, Fatal.
Example Logging in Function Code:
```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class MyFunction
{
    private readonly ILogger<MyFunction> _logger;
    private readonly CosmosClient _cosmosClient;
    private readonly Container _container;

    public MyFunction(ILogger<MyFunction> logger, CosmosClient cosmosClient)
    {
        _logger = logger;
        _cosmosClient = cosmosClient;

        // Initialize the Cosmos DB container used for log storage
        _container = _cosmosClient.GetContainer("YourDatabaseName", "YourContainerName");
    }

    [FunctionName("MyFunction")]
    public async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer)
    {
        var logEntry = new
        {
            id = Guid.NewGuid().ToString(),
            timestamp = DateTime.UtcNow,
            logLevel = "Information",
            message = "Function executed at " + DateTime.UtcNow
        };

        // Insert the log entry into Cosmos DB
        await _container.CreateItemAsync(logEntry, new PartitionKey(logEntry.id));

        _logger.LogInformation("Function executed at {time}", DateTime.UtcNow);
    }
}
```
Once the function is ready, deploy it to Azure Function App using Visual Studio or Azure DevOps.
Deployment Considerations:
By following these steps, you can successfully integrate NLog into your Azure Functions for efficient logging. This setup enables real-time monitoring, structured log storage, and improved debugging capabilities.
There’s no doubt that every Director or Manager wants a high-performance team that delivers the best results and allows them to focus on building new business opportunities.
Come on, let’s face it! If we were comparing a work team with a sports team, who wouldn’t want to have a Barcelona Soccer Club, the Dodgers baseball team, or the Philadelphia Eagles in American football?
It’s easy to think and say, right? But where does the secret to building high-performance teams lie?
Martin Zwilling, founder and CEO of Startup Professionals, Inc., recommends the following list of actions for both entrepreneurs and senior executives to achieve the highest performance from team members (Zwilling, 2020):
Don’t rely on those who understand the message quickly; at least repeat it five times in different forums to ensure it was heard and understood.
Don’t assume that team members already know what the expected standards of excellence are.
Remember that micromanagement is not an effective way to achieve top performance. Instead, you can practice process coaching and let the team make their own decisions and improve step by step.
Take the time to provide informal feedback weekly or even daily. This will help address gaps gradually and increase the team members’ psychological safety.
As a Scrum Master working in an agile framework, you are a servant leader. Team members cannot be top performers without necessary resources. Leaders should anticipate these requirements, listen carefully to feedback from team members, and provide resources on a timely basis.
As a leader you should recognize and support your team in situations that go beyond their domain.
Recognition is important for building team members’ confidence and the team’s health.
Related to this topic, the Center for Human Capital Innovation also provides some examples of and key factors for high-performance teams:
The 1992 US men’s Olympic basketball team, known as the “Dream Team,” shows us that “the essence of a high-performance team isn’t found in the individual capabilities of its members but in their ability to adapt, learn, and evolve into a synergistic unit. This transformation was marked by a shift in the team’s approach to playing together, emphasizing mutual understanding, trust, and a unified strategy” (Center for Human Capital Innovation, 2024).
Taking the last paragraph into consideration, high-performance teams rely on:
On the other hand, Expert Panel, a former Forbes Councils Member, provides these tips for optimizing the team’s performance while also avoiding burnout:
I hope these tips help you build the top-performing team you want. Be patient, but most importantly, work on it!
Bibliography:
Are you a PM or BA who has been assigned a project or platform that is new to your company? If so, you may find that there’s a learning curve for everything that needs to be executed, especially when it comes to the launch. Not all platforms are the same; they can require different steps to go live. Below is a list of steps I take when creating a launch checklist.
Start by meeting with your team and stakeholders to create a list of action items needed for the launch. Ask each individual what they need to complete, when they need to finish it, and how long it will take. Don’t just focus on activities for the day of the launch; also inquire about tasks that need to be completed in the days, weeks, and even months leading up to it. Remember, there may also be post-launch activities to consider.
After compiling your action items, group them into time frames. I like to break them down into categories: one month before launch, two weeks before launch, the day before launch, the day of launch, and post-launch. Work with your team to identify any dependencies between tasks. Some team members may not be able to complete their tasks until others are finished, while some tasks can be done in parallel.
Once you have your list of activities, you’re ready to create a checklist to distribute to your team. Consider including the following fields:
After completing your checklist, share it with everyone on your team. It may be helpful to store it in a shared drive where all team members can access and update it. Depending on the activities required, you might also need to contact third parties or vendors to handle certain tasks on their end.
As you work through the tasks, ensure that team members are updating the checklist regularly. If you’re focusing on action items to be completed before the launch, it’s a good idea to check in with the team during scrums or status meetings to confirm they are on track to complete everything on time.
Do you have any other tips or ideas on how to approach launch checklists? Feel free to leave a comment!
Authentication and authorization are two crucial aspects of web development. In modern applications, it’s essential to ensure that users are who they say they are (authentication) and have permission to access specific resources (authorization). OAuth2 and OpenID Connect are two widely used protocols that help achieve both goals.
OAuth2 (Open Authorization 2.0) is an authorization framework that enables third-party applications to access a user’s resources without requiring them to share their credentials (username and password). It allows for delegated access, meaning that users can grant specific, controlled access to their data without revealing their login information.
OAuth2 is commonly used to enable users to authenticate via their existing accounts from services like Google, Facebook, or Microsoft. This allows users to securely log in to applications without exposing their sensitive credentials to the requesting application.
OAuth 2.0 is a widely adopted authorization framework that allows third-party applications to access user resources without sharing the credentials. It provides a secure and scalable way to manage authorization. Here are some key benefits of OAuth 2.0:
OpenID Connect (OIDC) is an identity layer built on top of OAuth2. It is used to verify the identity of the user and obtain their profile information. While OAuth2 is used for authorization, OpenID Connect extends OAuth2 to include authentication.
In simple terms, OAuth2 tells the client what the user is allowed to do (authorization), while OpenID Connect tells the client who the user is (authentication).
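To make the distinction concrete: the authorization-code flow begins by redirecting the user to the provider’s authorization endpoint. A minimal Python sketch of building that request URL for Google follows; the client ID and redirect URI are placeholder values, not real credentials.

```python
from urllib.parse import urlencode

# Placeholder values -- replace with real credentials from the Google Cloud Console
CLIENT_ID = "your-client-id.apps.googleusercontent.com"
REDIRECT_URI = "https://localhost:5001/signin-google"

params = {
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "response_type": "code",            # authorization-code flow
    "scope": "openid profile email",    # 'openid' makes this an OIDC request
    "state": "random-anti-csrf-token",  # should be freshly generated per request
}

auth_url = "https://accounts.google.com/o/oauth2/auth?" + urlencode(params)
print(auth_url)
```

The `openid` scope is what turns a plain OAuth2 authorization request into an OpenID Connect one: alongside the access token, the provider will also return an ID token asserting who the user is. In the .NET integration below, the framework builds this URL for you.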
For better understanding, we’ll integrate with Google as the OAuth2 and OpenID Connect provider.
1.1 Open the Google Cloud Console
Once your project is created, you’ll be redirected to the newly created project’s dashboard in the Google Cloud Console.
After completing the OAuth consent screen setup, you’ll be asked to configure the OAuth credentials.
Select Web application as the application type.
Add the following Authorized redirect URI:
Example Redirect URI (if your app is running on https://localhost:{{portnumber}}):
Click Create.
After completing the OAuth 2.0 configuration and clicking Create, a new window will appear with your Client ID and Client Secret.
It is crucial to copy both the Client ID and Client Secret and store them securely. These credentials will be necessary for integrating Google authentication into your app and ensuring secure access to users’ Google accounts.
You will use these credentials in your application to authenticate users and interact with Google’s services. Keep them safe, as they allow access to sensitive user data.
Set Up the .NET Core Project:
In the Program.cs file, configure the authentication middleware to use Google’s OAuth2 and OpenID Connect:
```csharp
// Add authentication services (Google login, etc.)
builder.Services.AddAuthentication(options =>
{
    options.DefaultScheme = "Cookies";
    options.DefaultChallengeScheme = GoogleDefaults.AuthenticationScheme;
})
.AddCookie()
.AddGoogle(options =>
{
    options.ClientId = "client-id";
    options.ClientSecret = "client-secret";
    options.CallbackPath = "/signin-google";
});

app.UseAuthentication();
app.UseAuthorization();
```
The .AddGoogle method comes from the Microsoft.AspNetCore.Authentication.Google NuGet package, which plugs Google sign-in (OAuth 2.0 / OpenID Connect) into the ASP.NET Core authentication pipeline. If you prefer not to take that package dependency, or you need finer control over the endpoints and scopes, you can use the generic AddOAuth method instead and configure Google’s endpoints manually. Here’s how:
```csharp
.AddOAuth("Google", options =>
{
    options.ClientId = Configuration["Google:ClientId"];
    options.ClientSecret = Configuration["Google:ClientSecret"];
    options.CallbackPath = new PathString("/signin-google");

    // Google's OAuth2 / OIDC endpoints
    options.AuthorizationEndpoint = "https://accounts.google.com/o/oauth2/auth";
    options.TokenEndpoint = "https://oauth2.googleapis.com/token";
    options.UserInformationEndpoint = "https://www.googleapis.com/oauth2/v3/userinfo";

    options.Scope.Add("openid");
    options.Scope.Add("profile");
    options.Scope.Add("email");
    options.SaveTokens = true;
});
```
Create a simple controller to handle login and display the authenticated user’s profile.
Identity Management in ASP.NET Core relies heavily on OAuth 2.0 or OpenID Connect for user authentication. The Profile action retrieves user information, typically stored in claims, such as the ID token, and uses it for user management. With the appropriate configuration and token handling, the application can securely manage user identity, retrieve profile data, and display relevant information to the user.
Add a simple view to show the user’s profile in the Views/Account/Profile.cshtml:
Run the application using:
dotnet run
OAuth2 and OpenID Connect are powerful protocols for handling authentication and authorization in modern applications. By integrating these protocols into .NET applications, we can securely authenticate users, delegate access to resources, and ensure that only authorized users can access services.
By following the steps we should now have a basic understanding of how to implement OAuth2 and OpenID Connect in a .NET application. These concepts are essential for any developer working on building secure, scalable, and modern web applications.
The modern Android development landscape increasingly relies on two powerful tools: Figma for collaborative UI/UX design and Jetpack Compose for building native UIs declaratively. A crucial step in the development workflow is translating the polished designs from Figma into functional Compose code. But what’s the most effective way to do this?
Several approaches exist, each with its own strengths and weaknesses, impacting development speed, code quality, and flexibility. Let’s compare the primary methods available as of early 2025: Manual Conversion, Assisted Conversion (like the soon-to-be-sunset Relay), and AI Assistance.
This is the most traditional and widely practiced method. Developers meticulously examine the Figma design specifications (layouts, spacing, typography, colors, components) and manually write the corresponding Jetpack Compose code (Column, Row, Text, Image, custom composables, etc.). They implement state management, interaction logic, and navigation using standard Compose APIs and libraries like Navigation Compose.
Tools in this category aim to automate parts of the conversion process, often via plugins or specific workflows. Relay was a prominent example from Google. It allowed designers to annotate Figma components and developers to import them as Compose code packages.
Original Intent: Relay aimed to streamline the designer-developer handoff for UI components, translating design system elements into reusable Compose code.
Limitations (Even Before Sunset): Often best suited for simpler, static components. Handling complex state, interactions, or highly dynamic layouts usually required significant manual code modification after import. The generated code sometimes wasn’t as clean or idiomatic as manually written code. It introduced its own workflow and potential build complexities.
Current Status (Crucial): Relay is being sunset on April 30, 2025. This means it will no longer be supported or updated, making it not a viable option for new projects and a risk for existing ones. Teams relying on it need to migrate away.
While other design-to-code tools exist for different frameworks, mature, widely adopted, and robust Figma-to-Compose converters (beyond Relay) haven’t fully materialized or gained significant traction in the Android community for generating production-ready code.
Developers leverage Large Language Models (LLMs) and AI coding assistants (like Gemini, GitHub Copilot powered by OpenAI’s models, ChatGPT, DeepSeek Coder, etc.) to aid in the conversion process. This can take several forms:
Determining the single best AI is difficult and context-dependent, and it’s important to keep in mind that new AI assistant models keep emerging that surpass their predecessors within months.
For Android development specifically, Gemini (via Studio Bot) has a strategic advantage due to its integration and Google’s focus. GitHub Copilot is a very strong general-purpose contender. The best choice often depends on individual workflow preference, specific task complexity, and current model performance (which evolves rapidly). Experimentation is key.
Feature | Manual Conversion | Assisted (Relay, plugins) | AI Assistance |
---|---|---|---|
UI Accuracy | High (Developer Controlled) | Medium (Tool Dependent) | Medium (Requires Refinement) |
Code Quality | High (Developer Controlled) | Low-Medium (Often Needs Refactor) | Variable (Requires Review) |
Flexibility | Very High | Low | Medium (Assists, doesn’t dictate) |
Speed | Slow | Medium (Initial Setup) – Fast (Plugins) | Fast (for Boilerplate/Snippets) |
Complexity Handling | Excellent | Poor | Poor-Medium (Best for simple parts) |
Navigation/Flows | Full Manual Implementation | Not Handled | Boilerplate Help Only |
Learning Curve | High (Requires Compose Skills) | Medium (Tool Specific) – Low (Plugins) | Low-Medium (Prompting/Review) |
Current Viability | High (Standard) | None (Sunset Relay) – Low (Plugins) | High (As an Assistant) |
As of March 2025, with Relay sunsetting, there is no magic bullet for instantly converting complex Figma designs into perfect, production-ready Jetpack Compose applications.
The most effective strategy today is often a hybrid one:
Ultimately, developer skill, a deep understanding of Jetpack Compose, and careful attention to detail are paramount, regardless of the tools used. AI can augment this process, but it doesn’t replace the need for expert human developers to bridge the final gap between design and functional, high-quality code.
Today in our “We Are Perficient” series, we explore how businesses can take their digital experience to the next level through mobile optimization. In an exclusive conversation with Jonathan Crockett, Managing Director of Go-To-Market, Sales, and Solutions at Perficient, we dive into key strategies to ensure brands deliver seamless, high-impact experiences on mobile devices.
In today’s digital world, user experience is everything. Companies looking to stand out must provide seamless, personalized, and optimized interactions at every touchpoint. In this video, we explore how the combination of Artificial Intelligence, advanced digital experience strategies, and collaboration with technology leaders like Adobe is redefining the way brands connect with their customers.
Today, most digital interactions happen on mobile devices. Without a well-optimized mobile strategy, brands risk losing conversions and engagement. From ultra-fast loading times to intuitive and accessible interfaces, mobile optimization is no longer optional—it’s essential to improving customer retention and conversion rates.
Artificial intelligence is transforming user experiences by enabling real-time personalization based on data. From content recommendations to adaptive interfaces that respond to user behavior, AI helps deliver unique and relevant experiences at every interaction. This not only enhances customer satisfaction but also boosts lifetime value and brand loyalty.
As an Adobe strategic partner, Perficient helps businesses unlock the full potential of Adobe’s cutting-edge solutions. From Adobe Experience Manager to Adobe Sensei, our strategies merge creativity and technology to design immersive, scalable, and highly effective digital experiences.
The future of digital experience lies in personalization, optimization, and continuous innovation. If you’re looking to transform how your customers interact with your brand, Perficient can help.
Contact us today and discover how we can elevate your digital strategy.
Achieving end-to-end lineage in Databricks while allowing external users to access raw data can be a challenging task. Leveraging Unity Catalog for end-to-end lineage is a Databricks best practice, but enabling external users to access raw data while maintaining security and lineage integrity requires a well-thought-out architecture. This blog outlines a reference architecture to achieve this balance.
To meet the needs of both internal and external users, the architecture must:
The architecture starts with a shared data lake as a landing zone for raw, unprocessed data from various sources. This data lake is located in external cloud storage, such as AWS S3 or Azure Data Lake, and is independent of Databricks. Access to this data is managed using IAM roles and policies, allowing both Databricks and external users to interact with the data without overlapping permissions.
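For instance, an AWS IAM policy granting external consumers read-only access to the raw zone might look like the following sketch. The bucket name and prefix are placeholders, not values from this architecture.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListRawZone",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::example-shared-data-lake",
      "Condition": { "StringLike": { "s3:prefix": "raw/*" } }
    },
    {
      "Sid": "ReadRawObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-shared-data-lake/raw/*"
    }
  ]
}
```

Scoping the policy to the raw/ prefix keeps external users out of any processed data that may later share the bucket, while Databricks itself can be granted broader access through its own role.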
Benefits:
The bronze layer ingests raw data from the shared data lake into Databricks. Using Delta Live Tables (DLT), data is processed and stored as managed or external Delta tables. Unity Catalog governs these tables, enforcing fine-grained access control to maintain data security and lineage. End-to-end lineage in Databricks begins with the bronze layer and can easily be maintained through the silver and gold layers by using DLT.
Governance:
Subsequent data processing transforms bronze data into refined (silver) and aggregated (gold) tables. These layers are exclusively managed within Databricks to ensure lineage continuity, leveraging Delta Lake’s optimization features.
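As a toy illustration of this bronze → silver → gold refinement: in Databricks this would be implemented with PySpark and Delta Live Tables rather than pandas, and the data and column names here are invented, but the shape of the transformation is the same.

```python
import pandas as pd

# Bronze: raw records as ingested -- duplicates and nulls included
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3, None],
    "region": ["east", "east", "west", "east", "west"],
    "amount": [10.0, 10.0, 25.0, 5.0, 7.0],
})

# Silver: cleaned and deduplicated records
silver = bronze.dropna(subset=["order_id"]).drop_duplicates()

# Gold: business-level aggregate, e.g. revenue per region
gold = silver.groupby("region", as_index=False)["amount"].sum()

print(gold)
```

Because each layer is derived from the previous one inside the platform, lineage from gold aggregates back to bronze ingests is preserved automatically; in Databricks, DLT records that dependency graph for you.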
Access:
This reference architecture offers a balanced approach to handling raw data access while maintaining governance and lineage within Databricks. By isolating raw data in a shared lake and managing processed data within Databricks, organizations can effectively support both internal analytics and external data sharing.
Contact us to learn more about how to empower your teams with the right tools, processes, and training to unlock Databricks’ full potential across your enterprise.
At Perficient, we’re passionate about forging the future by advancing STEM education and improving health and well-being. Philanthropy is at the heart of our global company. Beyond obsessing over outcomes and delivering innovative digital solutions for our clients, Perficient and our colleagues share another common goal—making a difference in the communities where we live and work.
We’re proud to share the 2024 Community Impact Report, which celebrates the unique acts of kindness and generosity found across our global company.
This engaging report captures some of the incredible stories, activities, and results of how our colleagues are giving back through their time, talent, and resources. Dive into the 2024 Community Impact Report to learn how Perficient and our colleagues are making a difference around the globe.
Perficient is guided by our two corporate giving pillars that directly align to our business—advancing STEM education and improving health and well-being. These pillars have a direct impact on the philanthropic activities and initiatives that are outlined in the 2024 Community Impact Report.
Read the report to learn how we:
This report highlights just some of the ways Perficient is making a difference. You can discover even more ways we’re giving back on the Life at Perficient blog and on Perficient’s social media channels.
See More: Perficient Celebrates the 2024 Community Impact Report on LinkedIn
Read our previous Community Impact Reports, including the 2023 Community Impact Report and 2022 Community Impact Report.
Perficient believes that every act of kindness can make a difference, no matter how big or small, and that we each have the power to make the world a healthier and happier place. Together, we’re building a better world.
It’s no secret our success is because of our people. No matter the technology or time zone, our colleagues are committed to delivering innovative, end-to-end digital solutions for the world’s biggest brands, and we bring a collaborative spirit to every interaction. We’re always seeking the best and brightest to work with us. Join our team and experience a culture that challenges, champions, and celebrates our people.
Visit our Careers page to see career opportunities and more!
Go inside Life at Perficient and connect with us on LinkedIn, YouTube, Facebook, TikTok, and Instagram.