
Take Positron to Work with Positron Pro
While you've downloaded Positron on your desktop and are loving it, you may still have a few questions about using it at work:

1. What happens when my analysis requires more memory than my laptop has?
2. How can I bring Positron into my company's secure environment?
3. How can I access data in Databricks or Snowflake from Positron?

Nick Rohrbaugh, Senior Product Marketing Manager at Posit, shows how Positron Pro, available exclusively within Posit Workbench, transforms Positron from a powerful desktop editor into a fully governed, enterprise-ready IDE.

In this webinar, you will learn how:

1. Data teams get immediate access to scalable, server-side compute and secure, one-click data authorization.
2. IT leaders can centralize, secure, and manage Positron alongside RStudio, Jupyter, and VS Code, all from a single platform.

Helpful links:

Positron: https://posit.co/products/ide/positron/
Posit Workbench: https://posit.co/products/enterprise/workbench/
Transcript
This transcript was generated automatically and may contain errors.
Hi, I'm Nick and I work on the product marketing team here at Posit. Today I'm excited to talk to you about taking Positron to Work with Positron Pro.
If you've joined us over the past few workflow demos, you already have a good idea of what Positron is. If you're just hearing about Positron for the first time, you should know that it's the free data science code editor developed by Posit to combine the very best parts of RStudio with the modern foundation and extensibility of VS Code into a next-generation data science IDE that's optimized specifically for the data work that you do, whether you use R, Python, or both.
You can check out our past two workflow demos to learn more about how Positron is designed specifically for data science and how we're working to integrate new AI capabilities that put you, the human expert, first. If you have any questions about Positron or today's presentation, you can submit them via the Q&A panel in Zoom, and we'll spend some time answering them in a little bit.
Positron vs. Positron Pro
One of the most common questions we're asked about Positron is, when should I pay for Positron? Just like with RStudio, we distribute two versions of Positron, a free, source-available desktop application for Mac, Windows, or Linux that's called Positron Desktop, or more often, just Positron, and Positron Pro, a commercially licensed, native server version of Positron that's accessible via browser and only available in Posit Workbench.
What are the key differences in features and capabilities between Positron and Positron Pro? Well, I've tried to list everything worth mentioning here, but the key message is that our goal is and has always been that the Positron development experience you know and love is fully available in the free desktop application. Unlike other IDEs, we don't want you to have to pay for a Pro version of our software to unlock a better development experience that's tailored for data science.
In the free desktop version of Positron, you can code natively in both R and Python, install OpenVSX-compatible extensions, customize your layout with dedicated panes for data science work, push and pull code from version control, preview data apps, and even publish to Posit Connect and use Positron's AI capabilities without needing a Positron Pro license.
With Positron Pro, you get the same data science development experience you know and love from the free version of Positron. You also get to benefit from the capabilities built into Posit Workbench, our professional data science workbench that includes Positron, RStudio, VS Code, and JupyterLab. Things like single sign-on, managed credentials for accessing enterprise data in AWS, Snowflake, or Databricks, scalable and customizable compute, monitoring and auditing tools for IT and compliance, and dedicated support from Posit engineers when something goes wrong.
To put it another way, when should you pay for Positron? Well, when you need to take Positron to work.
When you have several colleagues who also use Positron, or who prefer RStudio, VS Code, or JupyterLab for their development work. When IT has concerns about the security and management of local desktop installations of data science software, or when they want greater monitoring and auditability of your use of that software. When you and your colleagues are working with datasets that are larger than your laptop can handle, and you need quick, easy access to more power. Or when your development work is critically important, and you need quick access to support when something goes wrong.
These are the reasons why we made Positron Pro inside of Posit Workbench, and why our customers choose Posit Workbench for their development needs.
Demo: launching Positron Pro in Posit Workbench
Now let's take a look at Positron Pro inside of Posit Workbench.
Here you can see one of our internal Posit team environments, where I have access to Posit Workbench, as well as Posit Package Manager, and Posit Connect. But today we'll be spending our time in Posit Workbench.
When I click on Posit Workbench, I'm prompted to sign in with OpenID Connect. We use Okta here at Posit, and I've already signed into Okta to use several other services for work today. So all I need to do to access Workbench is click Sign in with OpenID, and my active Okta credentials will be picked up, and I'll be signed in.
Here on the Workbench homepage, I can see a list of any recent projects that I've been working with, as well as any active or suspended sessions I may have had open recently. To start a new session of Positron Pro, I'll click the New Session button here at the top of the page.
Then I'm presented with a menu of options to customize my development experience. First, I can choose which editor I'd like to use to write my code. Posit Workbench supports the classic Jupyter Notebooks interface, JupyterLab, Positron Pro, which is what we'll be using today, RStudio Pro for R users, and VS Code for Python and general software engineering.
I can give my session a more custom name if I'd like to help set it apart from other work I might be doing.
Then there's one of my favorite parts of Posit Workbench, what we call managed credentials. This means that an administrator can pre-configure connections to enterprise data sources and services from places like AWS, Azure, Databricks, and Snowflake, and all I need to do to access them is check them off when I launch my session.
For today's demo, we'll be accessing some data from our Snowflake account, and we'll also be using a service from AWS called Bedrock, which serves Anthropic Claude models from inside our AWS environment, powering our use of Positron Assistant and DataBot.
If I'm not already signed into these, I'll be prompted to do so when I check them off; then I just need to make sure I've chosen the right role and account.
If I'd like, I can choose a custom image that backs my session. Then I can choose my resource profile. As you can see, the small resource profile is already a bit too small to give me a good experience with Positron Pro, and I have a message letting me know that. Thankfully, our admin has given me a few other options: this medium resource profile with another CPU and a bit more memory, and this large profile with even more power.
Both Positron Pro and Posit Workbench can scale to sessions far larger than the numbers that you see here; this is just what we've set up in our environment today. These can also be customized on a per-individual or per-group basis, so different users and types of users can have different options.
When this looks good, I can click Launch and my session will start. When it's ready, it'll open for me.
Then I'll see the familiar Positron IDE. You'll notice in the lower right, I have some messages letting me know that my Snowflake account and AWS role have been successfully set for my session.
Connecting to Snowflake with managed credentials
Now, to connect to Snowflake to explore some data, all I need to do is run a few lines of code using the DBI and ODBC packages for R. Importantly, I don't need to tell Positron or Snowflake anything about my username or my password or any other credentials in my code or elsewhere. I can just tell it to select the current user, and my active ambient Snowflake credentials will be picked up.
You can see it's identified me as the current user, I have a connection string available in my environment, and under our connections tab, I now have an active connection to Snowflake. And if I open this up, I can see all of the different tables available to me inside of the Snowflake account that I selected when I launched my session.
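The few lines of connection code described above can be sketched roughly like this. This is a minimal sketch, not the exact code from the demo: the DSN name "snowflake" is an assumption, since in Posit Workbench the driver, account, and credential details are pre-configured by an administrator rather than written in your code.

```r
# Minimal sketch of connecting to Snowflake from R with the DBI and odbc packages.
# NOTE: the DSN name ("snowflake") is an assumption; with Workbench's managed
# credentials, no username, password, or token appears anywhere in the code.
library(DBI)
library(odbc)

con <- dbConnect(odbc::odbc(), dsn = "snowflake")

# Verify that the ambient managed credentials were picked up
dbGetQuery(con, "SELECT CURRENT_USER()")

# Browse the tables available under the account and role selected at launch
dbListTables(con)

dbDisconnect(con)
```

Because the credentials are ambient, the same script runs unchanged for any colleague whose session was launched with the Snowflake credential checked off, which is part of what makes these connections easy to share and govern.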
Connecting to AWS Bedrock for AI features
Now we'd also like to use some of the new AI capabilities today, like Positron Assistant and DataBot. Here at Posit, we can't connect to large language model providers like Anthropic directly, but we use a service called AWS Bedrock, which serves these models from inside of our AWS account, and gives our IT team the security, governance, and auditability that they need over our use of these models.
Connecting to Bedrock is just as simple as it was for Snowflake. I can open up the Positron Assistant menu here and choose to add a language model provider. AWS Bedrock, as well as OpenAI and a few other custom providers, are supported in Posit Workbench as of the latest release just a few weeks ago.
When I enable Bedrock and select it here, you'll notice I have a button to sign in. Just like with Snowflake, I don't need to remember my username, password, or any other credentials. I can just click sign in, and my active ambient credentials for AWS will be picked up, and it'll add Bedrock as a model provider.
Exploring data with DataBot
Great, now that we're connected to both AWS and Snowflake, let's use DataBot to explore some of our data. I'll open DataBot by first opening the command palette with Command-Shift-P, and then type DataBot to find the Open DataBot option. DataBot wants us to open a folder. Let's choose this air quality folder.
All right, so inside of DataBot, we'll start a new conversation here. You'll notice I was disconnected from Snowflake when I opened this folder, but all I need to do to reconnect is hit Resume Connection, and I'm still connected to AWS Bedrock.
So now that I'm connected to both, I can ask DataBot, what data do I have access to in Snowflake?
If you haven't seen DataBot before, it's an AI agent that we've built in Positron specifically for exploratory data analysis. So it'll execute R or Python code with your approval to help explore data, generate visualizations, and suggest new paths to explore. So here, we're going to always allow it to run the code so that it can move a bit faster and unblock itself. But we'll keep an eye on what it's doing as we go along.
So you can see we have a whole bunch of data sets in here, lots of them with "test" in the title, and it's identified a handful that look a bit more interesting. You'll notice it tried to use select from the tidyverse before loading it, but it's unblocked itself here.
And so you can see it's exploring the air quality table and the financial database table a bit further to tell you some more information about them. So let's check out the air quality data a bit more.
Let's open the air quality data and see what features I have in that data set.
And again, none of my queries or my data is going directly to Anthropic. Everything stays inside of our AWS account, going only to that Bedrock service, which is running Claude for us there.
So it's identified a bunch of different features, including the different kinds of pollutants that are measured.
All right. Let's do one more. How about a visualization showing geographic distribution of air quality?
Again, if you haven't seen DataBot before, one of the things it'll do is after each step, it'll propose a few additional options to explore next based on what you've discussed so far.
So here we can see it's diving into the data. It looks like it's trying to make a map; it runs into some errors generating that map, but it works through them. Now we have visualizations of ozone concentrations, PM 2.5, and finally nitrogen dioxide concentrations, plus our different ozone monitoring stations, and then the top 15 metropolitan areas for ozone, and the same for PM 2.5. That's lots of visualizations. All right. Thanks, DataBot.
Exporting results and using DataBot memory
Another neat thing about DataBot that you may have seen in a past session: once I've reached a point where I'm done exploring my data with DataBot, and I want to either share the results with someone else or continue working with them on my own outside of this little chat window, I can use the report command, which will export not my entire conversation, but just the important parts of it, to a Quarto document. First, it'll propose an outline saying, here's what we discussed and the way that I would lay it out. If it looks good, I can approve it, and it'll begin generating that Quarto document for me.
Another neat feature of DataBot is memory. Maybe you've had a conversation where you want to save some aspect of it for the future: something about the way you've prompted DataBot, the line of thinking you've given it, or some information you've provided about a data set or how to approach analyzing it that you want to use again. With the memory feature, I can save that conversation if I want to, and this will update a memory file for this project, in this folder that we're working in, with key information about our air quality data and the analysis that we did with that data inside of Snowflake.
And you can see now that it's done, it's created this DataBot.md markdown file, which documents some useful information about our data set and our analysis. Importantly, if I had given it some key information directly, for example a connection string I passed to DataBot because I wasn't connected to Snowflake through Workbench, that's the kind of information that would likely be recorded here in our DataBot.md file.
Workbench jobs and audited jobs
Another one of my favorite features inside of Posit Workbench is what we call Workbench jobs. If you have code that takes a really long time to execute, for example, you're training a machine learning model, you're running a really long ETL process, or you're just doing something else that takes a while, and you'd like to run that in the background while you're doing other work, that's what Workbench jobs are for.
So if I open the Posit Workbench menu here in my left sidebar, down under the Workbench Jobs submenu, I can click Run Job. Here I can choose a script and a working directory. Let's go inside of our AirQuality directory and run our load data script.
And this small resource profile is fine, because I don't really care how long this takes to execute. But I could also give it one of these larger profiles if I wanted to. And similarly, give it a different image if I wanted to, as well as specify a different R version than the default, and provide any other arguments I might want. Once this looks good, I can hit Start. Then I can monitor that job in the same menu that I opened up moments ago. As it's running, or when it's completed, I can click on this and see some information about that job.
And critically, if you work in a regulated industry, or you do analyses where it's really important to verify the validity of the environment in which the analysis ran, we have a new kind of job in Workbench called audited jobs. In addition to logging some basic information about the job, such as who ran it, when it was run, and some settings about the environment, you can optionally record even more detailed information: the versions of R and Python that were used, the operating system, any environment variables that were set, and even a digital signature to verify that this specific analysis was run in this specific environment.
Closing thoughts
I hope this demonstration has been helpful in showing how Positron Pro can help you bring the development experience you know and love from Positron to work, while making life easier for IT at the same time. As a reminder, we aim to make as much of the core functionality as possible available in Positron, and only gate capabilities that are designed for enterprise use cases.
To put it another way, we believe that the value in Positron Pro and Posit Workbench comes from three areas, easily accessible, scalable compute for data scientists and analysts, the enhanced security and governance that IT needs to allow you to use your favorite tools, and the centralized management that makes it easier for IT to administer the various IDEs supported by Posit Workbench from a single platform, reducing overhead.
If it's not already clear from our demo today, I hope you can tell that in designing Positron and deciding which features and capabilities will require a Pro license, Posit takes a different approach. Our commitment to providing collaborative and innovative open source software for data science, our commitment to giving data teams the choice and autonomy to use software in the way they believe makes the most sense, and our commitment to balancing our profitability with the benefits we provide to our customers and community, whether they pay us any revenue or not, are a core part of our identity as a company, and are codified in our corporate charter as a public benefit corporation. To learn more about our mission as a company, you can visit posit.co/pbc. Thank you.