Power BI, Excel, OneLake – Dreams Do Come True!

I can’t believe it’s finally here! A way to have Excel live in OneDrive and access it from Power BI nearly live! We can officially shortcut files into our OneLake from both SharePoint and OneDrive! I am super excited about this feature, and I hope you are too. This feature plus User Data Functions allows us to not only have data from Excel in our reports but keep it as fresh as needed. Imagine having budget allocations that you want to adjust right before or during a meeting. Now you can! You can edit a file in Excel and hit one button to see the new numbers in your report. In the past, we relied on third-party services or Power Apps licensing to accomplish this sort of experience. Now we can just use Excel, an old data friend.

Please note, THIS IS IN PREVIEW AND VERY NEW. I’ve included a ton of screenshots, but please be advised these may not be entirely reflective of the GA reality once this feature is released. My example uses a OneDrive folder, but you can easily do this with SharePoint as well! One caveat, you will need a Fabric capacity. This does work with a trial capacity.

  1. Creating the Shortcut in OneLake to a OneDrive Folder
  2. Connecting to the File in Power BI
  3. Creating a Refresh Schedule in Power BI Service
  4. Optional – creating a manual refresh button using Translytical Task Flows
  5. Additional Resources

Creating the Shortcut in OneLake to a OneDrive Folder

1 – Navigate to OneDrive online: https://onedrive.live.com/login

2 – Select the settings gear in the top right corner and select “OneDrive settings”.

3 – On the left-hand panel, select “More Settings”, then scroll all the way down to the Diagnostic Information section. From there, copy the OneDrive web URL. This is what we will use in Fabric to make the shortcut.
NOTE – you will need to delete everything in that URL after the “_com” segment (in the screenshot below, that’s “_layouts/15/onedrive.aspx”).
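To make the trimming concrete, here’s a quick sketch in Python. The URL below is a made-up example; yours will contain your own tenant and user segment, but the idea is the same: keep everything up to your personal folder and drop the “/_layouts/…” tail.

```python
# Hypothetical OneDrive web URL copied from the Diagnostic Information panel
raw_url = "https://contoso-my.sharepoint.com/personal/kristyna_contoso_com/_layouts/15/onedrive.aspx"

# Drop the "/_layouts/..." tail so Fabric accepts the result as a Site URL
site_url = raw_url.split("/_layouts/")[0]
print(site_url)  # https://contoso-my.sharepoint.com/personal/kristyna_contoso_com
```

In practice you’ll just do this by hand in the connection dialog; the snippet is only there to show exactly where the cut happens.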

4 – Navigate to a lakehouse in Fabric where you would like to access the content from.
5 – Hit the Get data drop-down, then select “New shortcut”.

6 – Choose the OneDrive option. As of this post, it is currently in Preview and says “OneDrive (Preview)” on the button.

7 – Create a new connection. The Site URL will be what you copied from the OneDrive settings. I recommend renaming the connection to something like “[Name]’s OneDrive” so it’s clear where the data comes from.

8 – Now that we have a connection, you can point the shortcut to any folder in your OneDrive! Please note, it will grab all child folders as well as files. Be mindful of what you actually want to share within OneLake. On the bright side, this means you only need one shortcut per folder hierarchy, which makes this much easier if you have files in multiple subfolders you want to share.

9 – Once you have a folder selected, it may give you an option to transform data (may just be me with CSVs and JSON files in my folders lol). You can skip this by hitting “Previous” then “Skip”, or “Revert Changes” at the top. If you want to transform your CSVs to delta tables, simply hit next. I haven’t played around with these auto transformations yet, so if you have notes, let me know! Anything with “Auto” and “Preview” scares me, so I stay away until at least the “Preview” is gone haha. Also, the transformation option currently does not work; your shortcut will simply disappear into the void. Really excited to see where this ends up in the future though!

10 – Hit “Create” and boom! You can now access files from your OneDrive inside OneLake! It may take a couple of refreshes on your browser and a few seconds, but then you can go to the Files section of your lakehouse and see the files/folders in your shortcut. NOTE – at first, it will show the folder title in your “Tables” section. That means it’s working. Try refreshing your browser (and give it a couple of minutes) and it will pop up in your Files section. It does flash a share warning at you; don’t worry about that, just give it a bit to load in the right section. The time it takes is directly proportional to the amount of information/size of files you’re dropping. If you look at my screenshot, you can see I have one file that’s over 4 GB. Probably not the best folder to pull in (I should have gone one level deeper to avoid that file, since all I want is in the CSVs folder), but I wanted to see if it could handle it.

If needed, hit the three dots next to your folder and manually move it to the Files section.

You can see the date modified matches what’s in your OneDrive! Now to test the syncing, let’s make a change to the dim_customer.csv.
Original view:

Change made! It took about 40 seconds to sync from my laptop to OneDrive, then it was available in Fabric almost instantly (by the time I moved to that tab and refreshed my page, it was there!).

Holy cow! This is a game changer! No longer do folks need to upload files manually using the OneLake explorer (very buggy in my experience). Now you can just shortcut it in and allow your ETL process to always grab the latest version that’s been shortcut to OneLake!!

So that brings us to the next phase, how does this work with Power BI? Can we finally have a “live” experience with Excel file data in Power BI?

Connecting to the File in Power BI

1 – Open up your Power BI file. This should work in both the Power BI Desktop and the web Power BI experience, but my demo will use the Desktop.

2 – Connect to the lakehouse with our files. To connect to the files within a lakehouse, we’ll have to use a custom query, since the main lakehouse connector only allows you to pull tables/views. Thankfully, a connector does already exist; it’s just not in our standard Get Data options. Create a blank query, then use the code below with your workspace and lakehouse IDs. To find the IDs, grab them from the URL (https://app.fabric.microsoft.com/groups/WORKSPACEID/lakehouses/LAKEHOUSEID?experience=fabric-developer).

let
    // Replace with your own IDs from the Fabric URL
    workspaceID = "YOUR WORKSPACE ID",
    lakehouseID = "YOUR LAKEHOUSE ID",
    // Navigate: all lakehouses -> this lakehouse -> its Files section
    Source = Lakehouse.Contents(null){[workspaceId = workspaceID]}[Data]{[lakehouseId = lakehouseID]}[Data],
    Files_Folder = Source{[Id = "Files", ItemKind = "Folder"]}[Data]
in
    Files_Folder
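If you want to sanity-check the IDs you pulled out of the address bar, here’s a small Python sketch that extracts both GUIDs from a Fabric URL. The URL and GUIDs below are placeholders for illustration:

```python
import re

# Hypothetical Fabric URL; the GUIDs are made up
url = ("https://app.fabric.microsoft.com/groups/11111111-2222-3333-4444-555555555555"
       "/lakehouses/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee?experience=fabric-developer")

# The workspace ID follows /groups/, the lakehouse ID follows /lakehouses/
match = re.search(r"/groups/([0-9a-fA-F-]+)/lakehouses/([0-9a-fA-F-]+)", url)
workspace_id, lakehouse_id = match.groups()
print(workspace_id)   # 11111111-2222-3333-4444-555555555555
print(lakehouse_id)   # aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee
```

Paste those two values into the workspaceID and lakehouseID steps of the query above.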

3 – Navigate to your file by clicking on the link under “Data” (likely called “Folder”) > then the link under “Content” (also probably called “Folder”) > then the link under “Content” that ties to the file you want to open (mine was called “Binary” for the file I wanted). Now interact with it like any other file import in Power Query.

CSVs are relatively simple, so I pulled in an Excel file for the screenshots below to show what it looks like if you have formatted tables in your file. It shows BOTH formatted tables and sheets! Pretty awesome!

Creating a Refresh Schedule in Power BI Service

Now all that is already pretty amazing. We no longer need to mess with crazy links from Excel and can access all our data from the same place – OneLake! But how do we refresh it?

1 – Navigate to the semantic model settings.

2 – Under “Data Source Credentials” there will be a new source called “Lakehouse”. Open the “Edit Credentials” link and authenticate using the credentials you want used for the refresh (this can be a service account if needed, but it MUST be an OAuth2 source). Right now, there’s no way to connect other than OAuth2. Not great if you want to use service principals, but service accounts are still an option.

If you don’t need “live” updates, you can stop here. However, if you’re as impatient as I am, then you’ll want a way to trigger a refresh of this data on the fly from within your report. Enter translytical task flows.

Optional – creating a manual refresh button using Translytical Task Flows

1 – Create a User Data Function item in Fabric. It will be mostly blank because all we really want is a way to trigger the report refresh. This function will accept a name parameter and return a little message alerting the user that the report refresh has been kicked off. Don’t forget to hit that “Publish” button in the top right corner to actually make this live! The publishing process can take a bit; be sure it finishes publishing before looking for it in Power BI.

Here’s the code I’ll be using:

import datetime
import fabric.functions as fn
import logging

udf = fn.UserDataFunctions()

@udf.function()
def refresh_report(name: str) -> str:
    logging.info('Python UDF trigger function processed a request.')

    return f"Welcome {name}! Your report refresh will kick off at {datetime.datetime.now()}!"

2 – Navigate to Power BI Desktop and add a blank button. You can add any button, but a blank one looks cleanest and makes it easiest to clarify what we want people to do.

3 – Add a Data function action and ensure the “Refresh Report” toggle is on. This is the key functionality we are looking for.

4 – Create a measure to populate the current user’s name automatically.

User = USERNAME()

5 – Add that measure to the button by selecting the little fx option next to name.

6 – Configure your button with some text to let people know what it does.

7 – Publish and test! Enjoy your new button! Keep in mind, this will refresh the WHOLE model. My model is fairly small and quick, so it’s not a huge deal. There’s currently no way to have it refresh only one table using the UI, but if you want to make a more complex UDF notebook, you can have it refresh ONLY the table that’s been impacted. Talk about powerful.

I will make a follow up blog soon that will cover how to adapt this method to only refresh the table you need, so stay tuned and hit the subscribe button for a ping when new blogs are published!

UPDATE – I looked into ways to only refresh one table, but it requires using client secret credentials and cannot use semantic link, mssparkutils, or a large number of other libraries available in other notebooks in Fabric. I’m hoping this will change long-term, but for now please refer to this blog on how to use the REST API in a standard python notebook to refresh Power BI: https://medium.com/@arvind.g90/refresh-smarter-not-harder-power-bi-automation-with-rest-api-python-63923b37c9a6 .
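For reference, here’s a minimal sketch of what that single-table refresh call can look like using only the Python standard library. This assumes you already have an access token (e.g. acquired with a service principal client secret, as mentioned above); the workspace ID, dataset ID, and table name are placeholders. It uses the Power BI enhanced refresh endpoint, whose `objects` list lets you target one table:

```python
import json
import urllib.request

def build_table_refresh_request(workspace_id: str, dataset_id: str, table: str, token: str):
    """Build an enhanced-refresh POST request that targets a single table."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/refreshes")
    body = {"type": "full", "objects": [{"table": table}]}  # refresh just this table
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    return url, json.dumps(body).encode("utf-8"), headers

def refresh_table(workspace_id: str, dataset_id: str, table: str, token: str) -> int:
    """Fire the refresh; a 202 status means the refresh was queued."""
    url, data, headers = build_table_refresh_request(workspace_id, dataset_id, table, token)
    req = urllib.request.Request(url, data=data, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Treat this as a starting point, not a finished implementation: error handling, token acquisition, and polling the refresh status are all left out.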

Additional Resources

Becoming a creator, my website experience

Over the past few months, I have debated starting a new blog to discuss our ALS journey. I realized we were writing about ALS on the Data on Wheels blog, CaringBridge, Facebook, and LinkedIn. I also created videos on YouTube. Adding another blog would only clutter the message. So, I decided not to move forward.

What changed? At the Walk to Defeat ALS, my wife, Sheila, created some business cards with all those links. It was cluttered. About the same time, I completed the handoff of the Data on Wheels blog to my daughter, Kristyna. She made some changes to the home page which help users find what they are looking for. This got me thinking. Surprising, I know. Could I create a single location for us to share with people that would make it easier to find our content?

I started a thing…

I researched various options for creating websites using GoDaddy where we already have our domains and email. The primary feature I was looking for was a set of tools that are easy to use. It was essential that any tool I selected be fully compatible with my eye gaze technology.  

I decided to use a hosted WordPress site on GoDaddy. I have years of experience with WordPress. The Kubio plug-in added features which helped me create the site easily. (Steve Hughes – Data on Wheels ~ ALS)

I discovered Kubio while looking through WordPress themes. I chose the Rufous theme, which includes the Kubio features. This set of tools uses a lot of click-to-create options instead of code or drag and drop. I can use these tools very effectively with my eye gaze technology.

So many plug-ins

I created the website with WordPress on GoDaddy. I use the Kubio Pro plug-in to make the design process easier. Right away, I realized the website was missing two key components I used on our WordPress-hosted blog: statistics and email. Adding these two features would send me down some rabbit holes.

WordPress.com spoiled me with its included Jetpack stats. Jetpack did not offer a free option, so I chose MonsterInsights. I have set up a Google Analytics account because MonsterInsights uses that data for its reports.

MonsterInsights is a part of Awesome Motive. I bring this up because getting MonsterInsights led me to many other products of theirs that I added to my website. Here is the list of plug-ins I am using beyond MonsterInsights:

  • All in One SEO (AIOSEO) Free. I use this to add focus keywords and optimize my search engine page previews. It also creates sitemaps for search engines.
  • UserFeedback Lite. This lets me add feedback surveys.
  • Duplicator. I use this to create backups of my site.
  • WPForms Lite. I created my contact form with this plug-in.
  • WP Mail SMTP Pro. I needed to upgrade to Pro to use Microsoft 365 email services.
  • WPConsent. I customized and added a cookie consent form. Cookies are used to improve my analytics.
  • Broken Link Checker. It currently only validates internal links. I am evaluating an upgrade to Pro to check external links.

Except for WP Mail SMTP Pro, I am using the free versions of these plug-ins. To support AIOSEO, I had to set up accounts with Google Search Console and Bing Webmaster Tools.

I also use Super Page Cache and Optimole to improve performance on my site. Optimole is used to improve image performance. I am using the free versions of these plug-ins.

Subscription support

One big piece of functionality that I wanted was subscription support. This was more difficult than I thought it would be. You need a CRM to manage your subscribers or contacts. Then you need a marketing platform to send emails out to your subscribers.

Thankfully, there are many options out there that bundle these together and work for small operations like mine. I chose Brevo based on research, reviews, and the right price: $0. So far, it is working well for me. I did have to upgrade my email accounts with GoDaddy to make them compliant with the various email services; otherwise, my emails went directly to junk. I am looking forward to learning more about this platform. I created my subscription form with Brevo and will use it for my newsletters.

As you can see, I have learned much so far. It will take time and patience to get better at this. My intention is not to scare anyone who might be considering this. Remember that I first created the Data on Wheels blog in 2010 with minimal effort. I wanted to use this platform to be more creative. This option makes that possible and has been code-free, which has made it easier for me to build using eye gaze. That’s right, this site was created “hands free.”

T-SQL Tuesday #176: That Piece of Technical Advice for the Past Me

DIVERSIFY!

We devote a lot of time to mastering the technology that we are passionate about or use for our jobs. For example, I specialized in Microsoft business intelligence, which has a wide range of technologies indeed. But I was still relatively narrow in focus.

I would suggest that you broaden your knowledge and skills in related fields like security and networking. You do not need to master them, but you should have a solid understanding of the concepts. Learning related technologies enables you to communicate effectively and confidently with your colleagues about how to provide high-quality solutions to your customers or businesses.

Learn New Things

There are many ways to learn new skills. Microsoft Learn is a great place to start but not the only option. There are online training programs (Udemy, Pragmatic Works), YouTube channels (Tales from the Field, Data on Wheels), blogs (Simple Talk, Eric on Identity), events (SQL Saturday, Microsoft Ignite), and even books (Hands on Analysis Services, M365 Security). These are some ways you can learn.

Get Certified

Certifications require you to show that you have a solid grasp of the topic. You have to be able to reason through concepts and respond to questions about the technology and its use. While certifications do not make you an expert, they do indicate a good knowledge of the technology. Finishing the certification process also gives you confidence to engage in relevant discussions with other technologists about the subject. Another advantage is that you can include this in your resume.

Getting certified in your area of expertise should be a formality; getting certified in something different is a stretch!

Other Benefits

Here are some other reasons to diversify:

  • You will improve your troubleshooting skills.
  • You may discover a new favorite technology.
  • You could increase your value in the job market.
  • You have something new to share with the community.

Learn something new, share what you learn.

Working with ALS – Insights from the Ability Summit

The 14th annual Ability Summit is a global event that I attended a few weeks ago. It is hosted by Microsoft, and it presents the latest technology innovations and best practices for accessibility and inclusion. The event has three main session tracks: Imagine, Build, and Include. Each track examines different aspects of how technology can enable people with disabilities and make the world more inclusive. The event is free, and anyone can register online to attend. All sessions are recorded and can be watched at any time on demand.

Ability Summit 2024 Highlights

As we think about our enduring commitment and goal at Microsoft, which is to build that culture of accessibility and embed it into everything we do, grounded always by the insights of people with disabilities. – Jenny Lay-Flurrie

In the first keynote, Microsoft CEO Satya Nadella and Chief Accessibility Officer Jenny Lay-Flurrie talked about how AI can remove obstacles and create more accessible experiences, while also addressing the challenges and concerns of responsible AI. The keynote showed several examples of how AI can help people with disabilities, such as voice banking for people with ALS, descriptive audio for people with low vision, and Copilot for people with diverse speech patterns. It was very impressive to see Team Gleason featured as a partner with Microsoft to work on AI to help the ALS community preserve their voice.

Team Gleason and Microsoft Team Up to Give an ALS Person His Voice Back

As a platform company, we have to absolutely lean into that and make sure that everything we’re doing, whether it’s Copilot and Copilot Extensibility or the Copilot stack in Azure is all ultimately helping our ISVs, our customers, our partners, all achieve their own goals around innovation, around accessibility. – Satya Nadella

Build Session: Bridging the Disability Divide with AI

The conference had many sessions and keynotes, but this one about the disability divide and AI was very interesting to me. These are three main points I learned from this session: 1) how people with disabilities are benefiting from AI in their personal and professional lives; 2) advice on how to begin and advance the AI journey with accessibility as a priority; 3) the significance of accessibility as a basic value for developing technologies that enable everyone.

This session also provided some resources and opportunities for us to learn more about AI and accessibility, such as the Accessibility Bot, which is a chatbot that can answer questions about Microsoft’s products and services regarding accessibility topics; the AI Studio, which is a platform that allows users to explore and build AI applications using various cognitive services and SDKs; and the AI Platform Team, which is a group of developers and researchers who work on making AI more accessible and inclusive.

In Real Life

I belong to the ALS community (I have ALS), and I rely on a lot of accessible technology, both hardware and software, to accomplish work. I used a combination of Voice Access in Windows 11, a Stream Deck foot pedal, a foot pedal joystick on my wheelchair, and Microsoft 365 Copilot to write this blog post. Voice Access helps me with dictation and specific commands like selecting paragraphs or capitalization. A Stream Deck allows me to do backspaces and deletes. A foot pedal joystick acts as a mouse. Copilot assists me with summarizing and rewriting content. As you can tell, we need a whole set of tools to suit our needs, and there is no single tool or method that works for us. I’m excited to see how AI will enhance accessibility for all of us. My goal is to keep sharing the tools and techniques I use to live and work with ALS through my blog and YouTube channel.

2023 in Review – Steve’s Perspective

2023 was another transition year as I continue to navigate my ALS with work. It has become harder to write, or at least more work is required. I use Voice Access day to day, which allows me to operate my system and dictate everything from messaging in Teams to documents and blog posts such as this. While the technologies allow me to continue to contribute in a variety of ways, they are not without their flaws. I find myself doing a lot of correction or relying on friends and family to correct and modify documents and PowerPoints in particular. Despite these limitations, I continue to find ways to contribute and be productive at work and in the community when I am able.

Where I Am Today

At the beginning of the year, I was using a smaller wheelchair but was able to get around well and could still go to user group meetings and travel the country. My wife and I made a trip to Disney in May of last year. This was a great opportunity to experience what Disney had to offer for those of us with accessibility issues. I will tell you, they do a great job! As the year progressed, I started to lose more functionality in my legs. It was particularly bad after surgery in June. At that point, it became increasingly difficult for me to participate in events farther away in the region.

Steve and Sheila at Disney World

I now have a great new power chair which allows me to get around easily and we can use a mobility access van as well. There are still limitations on my ability to travel but we were able to make it out to Kristyna’s wedding in October using these tools.

Kristyna And Steve at her wedding

The other big change for me has been the effect on my breathing. This has limited my ability to talk in long sentences or for long periods of time without requiring a breathing break. Because of this new limitation, I have begun limiting my external speaking and webinars because I would not be able to maintain a long session over 15 minutes or so. As you can imagine, this is very disappointing for me, as I spent much of my career in the public speaking arena.

I mention all this not so that you can feel sorry for me, but so that you can understand why you may see me in different circumstances or contributing in ways I’ve never done before. I have always loved working in the community and wish to do so where I am able. And more about that next.

Contributing to YouTube

This year we launched the new YouTube channel for Data on Wheels. We launched this channel to give me a voice to support some of the working-with-ALS content I have been creating. My first content was related to using Voice Access. This is a series of videos that introduced users to how to use Voice Access in practical ways. It has a particular focus on enabling users like me with the capabilities of Voice Access. I use Voice Access for most of my navigation and dictation needs in my day-to-day work. I do, however, lean on Microsoft 365 dictation for longer content creation such as this blog post. Someday I hope to do all of it in one tool.

Fabric 5 video series logo

One of the big contributions I wanted to make was a series of discussions on Microsoft Fabric architectural decision points. That was how the Fabric 5 was born! I was able to maintain about 5 minutes of good conversation regarding Fabric and the various architectural decisions customers should make. This is allowing me to continue to contribute to the community despite the disabilities in front of me. I look forward to contributing even more this upcoming year as Fabric continues to change the landscape of data analytics as we know it.

Spirit of 3Cloud

This year was capped off with my company awarding me the Spirit of 3Cloud award. This award reflects the contributions I have made to the company while battling my disability but at the same time providing support for other team members and growing our organization. I believe this award represents my ability to continue to encourage others to give their best in their lives and in work. Thank you to all of those who continue to support me in this journey, and I hope to continue to represent 3Cloud well throughout it all.

Steve with the Spirit of 3Cloud Award

What’s Next for Data on Wheels

This year I am handing the primary reins of the Data on Wheels blog and YouTube channel over to my daughter, Kristyna, as well as our Data on Rails program for new bloggers. I look forward to seeing great things from her as she continues to grow in her experience and community involvement. You should continue to look for great content from her as she takes the primary role and voice for Data on Wheels. I have truly enjoyed contributing to the blog for over 12 years and will continue to contribute as I’m able to both the blog and YouTube channel.

I am not totally getting out of all of this but want to express my sincere thanks for all the support you all have given us through the years. As I continue to work through my ever-changing disease, your support continues to be appreciated and I will keep you all up to date.

Thank you and Happy New Year!