Power BI, Excel, OneLake – Dreams Do Come True!

I can’t believe it’s finally here! A way to have Excel live in OneDrive and access it from Power BI nearly live! We can officially shortcut files into our OneLake from both SharePoint and OneDrive! I am super excited about this feature, and I hope you are too. This feature plus User Data Functions allows us to not only have data from Excel in our reports but also keep it as fresh as needed. Imagine having budget allocations that you want to adjust right before or during a meeting. Now you can! You can edit a file in Excel and hit one button to see the new numbers in your report. In the past, we relied on third-party services or Power Apps licensing to accomplish this sort of experience. Now we can just use Excel, an old data friend.

Please note, THIS IS IN PREVIEW AND VERY NEW. I’ve included a ton of screenshots, but please be advised these may not be entirely reflective of the GA reality once this feature is released. My example uses a OneDrive folder, but you can easily do this with SharePoint as well! One caveat: you will need a Fabric capacity. This does work with a trial capacity.

  1. Creating the Shortcut in OneLake to a OneDrive Folder
  2. Connecting to the File in Power BI
  3. Creating a Refresh Schedule in Power BI Service
  4. Optional – creating a manual refresh button using Translytical Task Flows
  5. Additional Resources

Creating the Shortcut in OneLake to a OneDrive Folder

1 – Navigate to OneDrive online: https://onedrive.live.com/login

2 – Select the settings gear in the top right corner and select “OneDrive settings”.

3 – On the left-hand panel, select “More Settings”, then scroll all the way down to the Diagnostic Information. From there, copy the OneDrive web URL. This is what we will use in Fabric to make the shortcut.
NOTE – you will need to delete everything in that URL after the _com (in the screenshot below, that’s “_layouts/15/onedrive.aspx”).
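If you’d rather not eyeball the trim, it can be scripted. A minimal sketch; the tenant and user in the example URL below are made up, yours will look different:

```python
def site_root(web_url: str) -> str:
    """Keep the URL up through the personal site root by dropping the
    trailing '/_layouts/...' portion that OneDrive appends."""
    return web_url.split("/_layouts")[0]

# Hypothetical diagnostic URL for illustration only.
url = "https://contoso-my.sharepoint.com/personal/jane_contoso_com/_layouts/15/onedrive.aspx"
print(site_root(url))
# → https://contoso-my.sharepoint.com/personal/jane_contoso_com
```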

4 – Navigate to a lakehouse in Fabric where you would like to access the content from.
5 – Hit the “Get data” drop-down, then select “New shortcut”.

6 – Choose the OneDrive option. As of this post, it is currently in Preview and says “OneDrive (Preview)” on the button.

7 – Create a new connection. The Site URL will be what you copied from the OneDrive settings. I recommend renaming the connection to something like “[Name]’s OneDrive” so it’s clear where the data comes from.

8 – Now that we have a connection, you can point the shortcut to any folder in your OneDrive! Please note, it will grab all child folders inside as well as files. Be mindful of what you actually want to share within OneLake. On the bright side, this means you only need one shortcut per folder hierarchy, which makes this much easier if you have files in multiple subfolders you want to share.

9 – Once you have a folder selected, it may give you an option to transform data (may just be me with CSVs and JSON files in my folders lol). You can skip this by hitting “Previous” then “Skip”, or “Revert Changes” at the top. If you want to transform your CSVs to delta tables, simply hit “Next”. I haven’t played around with these auto transformations yet, so if you have notes, let me know! Anything with “Auto” and “Preview” scares me, so I stay away until at least the “Preview” is gone haha. Also, at the time of writing, this transformation does not work; your shortcut will simply disappear into the void. Really excited to see where this ends up in the future though!

10 – Hit “Create” and boom! You can now access files from your OneDrive inside OneLake! It may take a couple of browser refreshes and a few seconds, but then you can go to the Files section of your lakehouse and see the files/folders in your shortcut. NOTE – at first it will show the folder title in your “Tables” section. That means it’s working. Refresh your browser (and give it a couple of minutes) and it will pop up in your Files section. It may flash a share warning at you; don’t worry about that, just give it a bit to load in the right section. The time it takes is directly proportional to the amount of information and size of the files you’re dropping in. If you look at my screenshot, you can see I have one file that’s over 4 GB. Probably not the best folder to pull in (I should have gone one level deeper to avoid that file, since all I want is in the CSVs folder), but I wanted to see if it could handle it.

If needed, hit the three dots next to your folder and manually move it to the Files section.

You can see the date modified matches what’s in your OneDrive! Now to test the syncing, let’s make a change to the dim_customer.csv.
Original view:

Change made! It took 40 seconds to sync from my laptop to OneDrive, then it was available in Fabric almost instantly (by the time I moved to that tab and refreshed my page, it was there!).

Holy cow! This is a game changer! No longer do folks need to upload files manually using the OneLake explorer (very buggy in my experience). Now you can just shortcut it in and allow your ETL process to always grab the latest version that’s been shortcut to OneLake!

So that brings us to the next phase, how does this work with Power BI? Can we finally have a “live” experience with Excel file data in Power BI?

Connecting to the File in Power BI

1 – Open up your Power BI file. This should work in both Power BI Desktop and the Power BI web experience, but my demo will use Desktop.

2 – Connect to the lakehouse with our files. To connect to the files within a lakehouse, we’ll have to use a custom query, since the main lakehouse connector only allows you to pull tables/views. Thankfully, a connector does already exist; it’s just not in our standard Get Data options. Create a blank query, then use the code below with your workspace and lakehouse IDs. To find the IDs, grab them from the URL (https://app.fabric.microsoft.com/groups/WORKSPACEID/lakehouses/LAKEHOUSEID?experience=fabric-developer).

let
    workspaceID = "YOUR WORKSPACE ID",
    lakehouseID = "YOUR LAKEHOUSE ID",
    Source = Lakehouse.Contents(null){[workspaceId = workspaceID]}[Data]{[lakehouseId = lakehouseID]}[Data],
    Files_Folder = Source{[Id = "Files", ItemKind = "Folder"]}[Data]
in
    Files_Folder

3 – Navigate to your file by clicking the link under “Data” (likely called “Folder”), then the link under “Content” (also probably called “Folder”), then the link under “Content” that ties to the file you want to open (mine was called “Binary” for the file I wanted). Now interact with it like any other file import in Power Query.

CSVs are relatively simple, so I pulled in an Excel file for the screenshots below to show what it looks like if you have formatted tables in your file. It shows BOTH formatted tables and sheets! Pretty awesome!

Creating a Refresh Schedule in Power BI Service

Now all that is already pretty amazing. We no longer need to mess with crazy links from Excel and can access all our data from the same place – OneLake! But how do we refresh it?

1 – Navigate to the semantic model settings.

2 – Under “Data Source Credentials” there will be a new source called “Lakehouse”. Open the “Edit Credentials” link and authenticate using the credentials you want used for the refresh (this can be a service account if needed, but it MUST be an OAuth2 source). Right now, there’s no way to connect other than OAuth2. Not great if you want to use service principals, but service accounts are still an option.

If you don’t need “live” updates, you can stop here. However, if you’re as impatient as I am, then you’ll want a way to trigger a refresh of this data on the fly from within your report. Enter translytical task flows.

Optional – creating a manual refresh button using Translytical Task Flows

1 – Create a User Data Function item in Fabric. It will be mostly blank, because all we really want is a way to trigger the report refresh. This function will accept a name parameter and return a little message alerting the user that the report refresh has been kicked off. Don’t forget to hit that “Publish” button in the top right corner to actually make this live! The publishing process can take a bit; be sure it finishes before looking for the function in Power BI.

Here’s the code I’ll be using:

import datetime
import fabric.functions as fn
import logging

udf = fn.UserDataFunctions()

@udf.function()
def refresh_report(name: str) -> str:
    logging.info('Python UDF trigger function processed a request.')

    return f"Welcome {name}! Your report refresh will kick off at {datetime.datetime.now()}!"

2 – Navigate to Power BI Desktop and add a blank button. You can add any button type, but blank looks cleanest and makes it easiest to show people what we want them to do.

3 – Add a Data function action and ensure the “Refresh Report” toggle is on. This is the key functionality we are looking for.

4 – Create a measure to populate the current user’s name automatically.

User = USERNAME()

5 – Add that measure to the button by selecting the little fx option next to the name field.

6 – Configure your button with some text to let people know what it does.

7 – Publish and test! Enjoy your new button! Keep in mind, this will refresh the WHOLE model. My model is fairly small and quick, so that’s not a huge deal. There’s currently no way to have it refresh only one table using the UI, but if you want to make a more complex UDF, you can have it refresh ONLY the table that’s been impacted. Talk about powerful.

I will make a follow up blog soon that will cover how to adapt this method to only refresh the table you need, so stay tuned and hit the subscribe button for a ping when new blogs are published!

UPDATE – I looked into ways to refresh only one table, but it requires using client secret credentials and cannot use semantic link, mssparkutils, or a large number of other libraries available in other notebooks in Fabric. I’m hoping this will change long-term, but for now please refer to this blog on how to use the REST API in a standard Python notebook to refresh Power BI: https://medium.com/@arvind.g90/refresh-smarter-not-harder-power-bi-automation-with-rest-api-python-63923b37c9a6.
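For reference, the single-table call itself is an enhanced-refresh POST to the Power BI REST API. Here’s a minimal sketch; the IDs and token are placeholders, and acquiring the token via client secret is left out:

```python
import json
import urllib.request

def build_refresh_request(group_id: str, dataset_id: str,
                          table: str, token: str) -> urllib.request.Request:
    """Build an enhanced-refresh POST that targets a single table.
    The caller still needs a valid AAD token (e.g. from a client secret)."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    body = {"type": "full", "objects": [{"table": table}]}
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder values; to actually send it: urllib.request.urlopen(req).
req = build_refresh_request("WORKSPACE-ID", "DATASET-ID", "dim_customer", "TOKEN")
```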

Additional Resources

T-SQL Tuesday #176: That Piece of Technical Advice for the Past Me

DIVERSIFY!

We devote a lot of time to mastering the technology that we are passionate about or use for our jobs. For example, I specialized in Microsoft business intelligence, which has a wide range of technologies indeed. But I was still relatively narrow in focus.

I would suggest that you broaden your knowledge and skills in related fields like security and networking. You do not need to master them, but you should have a solid understanding of the concepts. Learning related technologies enables you to communicate effectively and confidently with your colleagues about how to provide high-quality solutions to your customers or businesses.

Learn New Things

There are many ways to learn new skills. Microsoft Learn is a great place to start but not the only option. There are online training programs (Udemy, Pragmatic Works), YouTube channels (Tales from the Field, Data on Wheels), blogs (Simple Talk, Eric on Identity), events (SQL Saturday, Microsoft Ignite), and even books (Hands on Analysis Services, M365 Security). These are some ways you can learn.

Get Certified

Certifications require you to show that you have a solid grasp of the topic. You have to be able to reason through concepts and respond to questions about the technology and its use. While certifications do not make you an expert, they do indicate a good knowledge of the technology. Finishing the certification process also gives you confidence to engage in relevant discussions with other technologists about the subject. Another advantage is that you can include this in your resume.

Getting certified in your area of expertise should be a formality; getting certified in something different is a stretch!

Other Benefits

Here are some other reasons to diversify:

  • You will improve your troubleshooting skills.
  • You may discover a new favorite technology.
  • You could increase your value in the job market.
  • You have something new to share with the community.

Learn something new, share what you learn.

October Is National Disability Employment Awareness Month

As many of you know, I was diagnosed with ALS, or Lou Gehrig’s disease, a couple of years ago. Since then, I have made it a personal mission to write about my continued journey to keep working while the disease progresses. (You can follow my personal story on our Caring Bridge site.) For example, this post will be included in my Working with ALS content, which highlights the methods I use to continue working through the various stages of the disease.

Supportive Tools

Tools and Software Presented at Kentucky ALS Resource Meeting

After becoming part of the disabled community, I wanted to learn more about the tools that support accessibility and how I could use those tools to keep working. I am truly thankful to companies like Microsoft who make accessibility a key part of their overall mission with their tools. I am an avid user of Voice Access including using it to write and edit this post. I have also used some of their adaptive tools such as buttons that allow me to do more work with my feet. I share about these tools because I want others to know that having a disability should not prevent you from working.

Supportive Employers

I really must give a shout out to 3Cloud, who have really supported me in the transition to being disabled in the workforce. I find some irony in the wording that I just used, as I would insist that I am still able to work though I have lost certain physical functionality along the way. It is their belief in my ability to continue to work while doing it differently that has made the transition easier. They continue to work with me to find great ways to work together. Sometimes this includes specific hardware or software to make my job easier. Other times this includes patience during meetings and supporting specific collaboration needs I might have. The support, from leadership to my peers, has been great.

Working Through Difficulties

It is not always easy to keep working. There are days when I am frustrated because I am unable to work as quickly as I used to. My mind works faster than my body in most cases. Because I use voice for most of my work, from controlling my mouse to typing up documents, what used to be a single click now takes multiple commands, which can often result in a weird type of context switching.

While the tools are great, they are not perfect. Some tools are downright hard to use because they are 100 percent designed for mouse usage. For example, whiteboarding tools assume you can easily drag and drop and make connections between objects. Many times, this simply cannot be done with the tools that I use. I have relied on others for help when trying to use tools that are difficult to control by voice. The two biggest examples for me are Lucid, which we use for charting and collaborating on whiteboards, and PowerPoint. PowerPoint assumes you can work within boxes and move them around, and it is not very good to use with dictation.

I often need help from family and friends to make these tools work. For me, one of the most difficult parts of this is trying to explain what I am thinking in terms easy enough for someone to translate and produce something on the screen. In some cases this is not that difficult, but in brainstorming sessions it is very difficult, as a poorly formed thought is not easy to express.

A Hope and a Future

Those of us who have physical limitations can and do continue to work and be productive in the workplace. Employers who support us are highly valued and appreciated. Often what we think in our minds outpaces what we can express with our bodies. As we look at future tools such as Microsoft Copilot, which uses AI to support a variety of tasks in our day-to-day work, there is a lot of promise to make us more productive in the workplace. Thank you to all the companies who support us through tools and employment. Your support is greatly appreciated!

Congratulations to Pragmatic Works – Microsoft Partner of the Year Finalists, Again

Pragmatic Works has done it again. Microsoft has recognized us for our work this year with two finalist awards: Power BI, and Power Apps and Power Automate. This recognizes work we have done for our customers with these products.

This follows awards in recent years for Data Analytics (2019) and Data Platform (2017). I am proud to work for a great company that strives to be an industry leader in data and analytics, in the cloud and on-premises. I am truly excited to see where we go from here!

Power BI Is Finally in the Azure Trust Center

With the most recent announcement of Power BI’s inclusion in the Azure Trust Center, it is a good time to review where we are today with Power BI security and compliance as it relates to various customer needs. I do a lot of work with financial, energy, and medical customers. These groups represent a large amount of compliance and regulation needs. I wanted to understand where we are today, and this announcement is significant.

What’s in the Announcement?

One of the primary roadblocks to adopting the Power BI service has been the lack of compliance certifications and concerns around security. Microsoft has been making a number of enterprise-level improvements to the Power BI service and desktop. Power BI now has the following compliance certifications:

(Image: Power BI compliance certifications, 2016)

This announcement shows Microsoft’s continued commitment to security and compliance in its cloud based products. While Power BI is not yet to the level of Office 365, some key compliance areas are now covered.

I think the most significant compliance certification is HIPAA/HITECH, which removes barriers for the medical industry. As hospitals, insurance companies, and providers scramble to meet reporting demands from their customers and the government, Power BI gives them a flexible reporting and visualization platform to meet those needs. It will empower self-service in organizations and departmental or enterprise collaboration with data. The HIPAA/HITECH certification will allow them to use the platform with more confidence and security.

Beyond medical, more institutions will be able to rely on Power BI in a manner that is compliant and safe. As Microsoft continues this journey with Power BI and its other Azure based offerings, customers will be able to react more quickly to the changing business and regulatory environments with confidence in the security and management of their data.

The Reality – You Are as Secure as You Choose to Be

Even with this significant move by Microsoft, you are still responsible for implementing a secure, compliant solution. Microsoft is merely providing tools that are secure and will comply with regulations if implemented correctly. The key to a secure environment will always be you. The data you use and analyze with Power BI is ultimately your responsibility.

I encourage you to review the following resources in addition to the ones above as you determine your security and compliance within the Power BI product: