SharePoint lists tend to grow over time, there is no doubt about that. Depending on your user base, you might start to feel performance issues after a few thousand items have been created. That's why it is worth learning how to auto archive items in SharePoint using Power Automate.
Why Power Automate? It is a great tool with seamless SharePoint integration that lets you reduce toil and generate value by automating tasks. If you haven't checked out Power Automate before, this is a good opportunity to start looking into the low-code world.
RabbitMQ has become one of the most widely used message brokers in the world, so it is no surprise if you end up in a situation where you need to publish messages to RabbitMQ with .NET Core. This approach gives you asynchronous communication and better scalability, depending on your architecture. But before we move ahead, it is important to explain a few things.
As per Wikipedia, “RabbitMQ is an open-source message-broker software that originally implemented the Advanced Message Queuing Protocol and has since been extended with a plug-in architecture to support Streaming Text Oriented Messaging Protocol, MQ Telemetry Transport, and other protocols”. It works mostly as middleware for message handling.
This post will explore the features published and maintained in the RabbitMQ client library. The library's source can be found on GitHub, and the client itself is installed through NuGet, either via Visual Studio or the NuGet CLI.
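As a quick taste of what the library looks like, here is a minimal publishing sketch using the RabbitMQ.Client package; the host name, queue name, and message text are placeholder assumptions, not values from the original post:

```csharp
using System.Text;
using RabbitMQ.Client;

// Connect to a broker (placeholder host name).
var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();

// Declare the target queue so the publish does not fail if it is missing.
channel.QueueDeclare(queue: "demo-queue", durable: false, exclusive: false,
                     autoDelete: false, arguments: null);

// Publish to the default ("") exchange, which routes by queue name.
var body = Encoding.UTF8.GetBytes("Hello from .NET Core!");
channel.BasicPublish("", "demo-queue", null, body);
```

Using the default exchange keeps the sketch short: it routes the message directly to the queue whose name matches the routing key, so no exchange or binding setup is needed.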
Auto archiving files in SharePoint with Power Automate is a pretty common use case. Sometimes it is important to move files from one place to another, reducing the size of a library and the number of rows returned in a search. You can always explore the retention policies available out of the box in SharePoint, of course. But sometimes that feature can present problems depending on the content type or data type you are working with.
So if you have the option to work with Power Automate, why not use it? Power Automate is great when you need to automate one specific task, and it is very intuitive to use. It is fully integrated with SharePoint, and you can take advantage of its connectors to get your tasks done.
Azure DevOps is a powerful tool for managing and tracking software development progress. Beyond that, it also does a bunch of other useful things, and features like Azure Boards are, in my opinion, among the best. Focusing on the work item design, this blog post discusses how to read work item data from Azure DevOps with C#. This might be useful if you want to consume data from Azure DevOps and use it to integrate with other systems.
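As a preview of the approach, here is a minimal sketch using the Microsoft.TeamFoundationServer.Client NuGet package; the organization URL, personal access token, and work item ID are placeholder assumptions:

```csharp
using System;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;

// Placeholder organization URL and personal access token (PAT).
var connection = new VssConnection(
    new Uri("https://dev.azure.com/your-organization"),
    new VssBasicCredential(string.Empty, "your-pat"));

// Get the client responsible for work item tracking operations.
var client = connection.GetClient<WorkItemTrackingHttpClient>();

// Read a single work item by ID and print all of its fields.
var workItem = await client.GetWorkItemAsync(42);
foreach (var field in workItem.Fields)
    Console.WriteLine($"{field.Key}: {field.Value}");
```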
Sometimes you want to look up data that is not present in your database just to perform a differential action without temporary tables. To select values not present in an Oracle table, you can use either the dual table or the sys.dbms_debug_vc2coll collection. Let's cover each option here!
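To give an idea of where this is going, here is a minimal sketch of the sys.dbms_debug_vc2coll option combined with MINUS; the connection string, table name (my_table), column, and candidate values are placeholder assumptions, and Oracle.ManagedDataAccess.Client is just one ADO.NET provider choice:

```csharp
using System;
using Oracle.ManagedDataAccess.Client;

// The collection acts as an inline "table" of candidate values; MINUS then
// removes the ones that already exist in my_table (placeholder name).
// The dual option is similar, building the candidate rows with
// SELECT 'A1' FROM dual UNION ALL SELECT 'B2' FROM dual, and so on.
const string sql = @"
    SELECT COLUMN_VALUE FROM TABLE(sys.dbms_debug_vc2coll('A1', 'B2', 'C3'))
    MINUS
    SELECT code FROM my_table";

using var connection = new OracleConnection(
    "User Id=myuser;Password=mypassword;Data Source=MyOracleDb");
connection.Open();

using var command = new OracleCommand(sql, connection);
using var reader = command.ExecuteReader();
while (reader.Read())
    Console.WriteLine(reader.GetString(0)); // each value missing from my_table
```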
This post will help you if you want to execute a non-query statement on a Sybase database in C#, such as an UPDATE, CREATE, or DELETE. The idea here is to explore the NuGet package AdoNetCore.AseClient as the connection driver for the database.
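Here is a minimal sketch of where we are heading; the connection string, table, and UPDATE statement are placeholder assumptions:

```csharp
using System;
using AdoNetCore.AseClient;

// Placeholder connection string for an ASE server.
const string connectionString =
    "Data Source=myserver;Port=5000;Database=mydb;Uid=myuser;Pwd=mypassword;";

using var connection = new AseConnection(connectionString);
connection.Open();

// ExecuteNonQuery returns the number of affected rows instead of a result set.
using var command = new AseCommand(
    "UPDATE customers SET active = 0 WHERE last_login < '2020-01-01'", connection);
var affected = command.ExecuteNonQuery();
Console.WriteLine($"{affected} row(s) updated.");
```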
For some historical background, as per Wikipedia: “Sybase, Inc. was an enterprise software and services company that produced software to manage and analyze information in relational databases, with facilities located in California and Massachusetts. Sybase was acquired by SAP in 2010; SAP ceased using the Sybase name in 2014.”
This post will help you if you want to run queries on a Sybase database with C#, such as common SELECT scripts. The idea here is to explore the NuGet package AdoNetCore.AseClient as the connection driver for the database, since it is extremely similar to other common database access libraries.
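Because the package follows the standard ADO.NET pattern, a SELECT looks exactly like it would with any other provider. A minimal sketch, with placeholder connection string, table, and columns:

```csharp
using System;
using AdoNetCore.AseClient;

const string connectionString =
    "Data Source=myserver;Port=5000;Database=mydb;Uid=myuser;Pwd=mypassword;";

using var connection = new AseConnection(connectionString);
connection.Open();

// A common SELECT, consumed through the usual ADO.NET data reader pattern.
using var command = new AseCommand("SELECT id, name FROM customers", connection);
using var reader = command.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
```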
Nowadays, with more and more applications coexisting and integrating with each other, the technology you work with is not a barrier. You can connect to different data sources with whatever language you prefer, thanks to the community!
This blog post will cover how to batch insert items in SharePoint with Power Automate. Our Flow will explore the SharePoint REST API, calling the Batch endpoint and inserting 1,000 items per request. Every Batch request can contain up to 1,000 Changesets, and each Changeset can in turn contain up to 1,000 requests. That's why our example inserts 1,000 items per request, just to keep things simple, as sketched below.
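To make the shape of such a request concrete, here is a hedged sketch of the multipart body the Flow sends to the _api/$batch endpoint through the “Send an HTTP request to SharePoint” action. The site URL, list name, boundary strings, and the SP.Data.MyListListItem type name are placeholder assumptions; a real Flow would repeat the inner block for every item:

```
POST https://contoso.sharepoint.com/sites/demo/_api/$batch HTTP/1.1
Content-Type: multipart/mixed; boundary=batch_abc123

--batch_abc123
Content-Type: multipart/mixed; boundary=changeset_def456

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

POST https://contoso.sharepoint.com/sites/demo/_api/web/lists/getbytitle('MyList')/items HTTP/1.1
Content-Type: application/json;odata=verbose

{"__metadata":{"type":"SP.Data.MyListListItem"},"Title":"Item 1"}

--changeset_def456--
--batch_abc123--
```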
In the tests I ran, this Flow was able to insert 5,000 items in 16 minutes. All of my tests were executed on a free Microsoft Flow account; on the other hand, you will probably experience better performance with a licensed account. A licensed account will also let you read more than 5,000 rows.
In addition to that, it is important to say that our test is going to read data from an Excel file. Everything will be static, like the file name and the table name inside the file, but this doesn't mean you cannot do the same with dynamic values. If you create a Flow that is triggered by a file creation event in SharePoint, you will get the data needed to make it dynamic and save time with this automation.
First steps on the Flow
In this example I chose to manually trigger the Flow, just because that makes it easy to test. Needless to say, you can change that to whatever you want.
This blog post will cover how to batch delete items in SharePoint with Power Automate. Our Flow will explore the SharePoint REST API, calling the Batch endpoint and deleting 1,000 items per request. Every Batch request can contain up to 1,000 Changesets, and each Changeset can in turn contain up to 1,000 requests. That's why our example deletes 1,000 items per batch request, just to keep things simple, as sketched below.
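For reference, here is a hedged sketch of the multipart body such a batch delete sends to the _api/$batch endpoint. The site URL, list name, item ID, and boundary strings are placeholder assumptions; a real Flow would repeat the DELETE block once per item:

```
POST https://contoso.sharepoint.com/sites/demo/_api/$batch HTTP/1.1
Content-Type: multipart/mixed; boundary=batch_abc123

--batch_abc123
Content-Type: multipart/mixed; boundary=changeset_def456

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

DELETE https://contoso.sharepoint.com/sites/demo/_api/web/lists/getbytitle('MyList')/items(1) HTTP/1.1
If-Match: *

--changeset_def456--
--batch_abc123--
```

The If-Match: * header tells SharePoint to delete the item regardless of its current version, which is what you usually want in a bulk cleanup.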
In the tests I ran, this example was able to delete 500 items in 2 minutes, 5,000 items in 12 minutes, and 15,000 items in 40 minutes. All of my tests were run on a free Microsoft Flow account, so you will probably experience better performance with a licensed account.
First steps on the Flow
In this example I chose to manually trigger the Flow, just because that makes it easy to test. Needless to say, you can change that to whatever you want.
This blog post will cover how to drop messages in IBM MQ using .NET Core. The post is based on another awesome post from the dotnet cookbook, which I recommend you read! The objective here is to present an overview of the code, how to use the tracing logs for troubleshooting connections, and how to add certificates to perform TLS connections, all of this using the ibm-mq-client library available on NuGet.
The code to drop messages in IBM MQ
The code basically relies on a few parameters: hostname, channel, queue manager, queue, and port. There are also checks based on the names of those parameters to decide whether they contain the pattern “.TLS”. If you always use TLS on your connections, you can simply remove that condition and leave the SSL_CIPHER_SUITE_PROPERTY and SSL_CIPHER_SPEC_PROPERTY always set:
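Since the original snippet is not reproduced here, below is a minimal sketch of what that code can look like with the IBM MQ .NET client. The host, port, channel, queue manager, queue name, and cipher value are placeholder assumptions, and the “.TLS” naming convention follows the check described above:

```csharp
using System;
using System.Collections;
using IBM.WMQ;

var channelName = "DEV.APP.TLS"; // placeholder; ".TLS" suffix triggers the TLS branch
var properties = new Hashtable
{
    { MQC.HOST_NAME_PROPERTY, "mq.example.com" },
    { MQC.PORT_PROPERTY, 1414 },
    { MQC.CHANNEL_PROPERTY, channelName },
    { MQC.TRANSPORT_PROPERTY, MQC.TRANSPORT_MQSERIES_MANAGED }
};

// Only set the cipher properties when the channel name follows the ".TLS" convention.
if (channelName.Contains(".TLS"))
{
    properties[MQC.SSL_CIPHER_SUITE_PROPERTY] = "TLS_RSA_WITH_AES_128_CBC_SHA256";
    properties[MQC.SSL_CIPHER_SPEC_PROPERTY] = "TLS_RSA_WITH_AES_128_CBC_SHA256";
}

// Connect, open the queue for output, and drop a single message.
var queueManager = new MQQueueManager("QM1", properties);
var queue = queueManager.AccessQueue("DEV.QUEUE.1",
    MQC.MQOO_OUTPUT + MQC.MQOO_FAIL_IF_QUIESCING);

var message = new MQMessage();
message.WriteString("hello from .NET Core");
queue.Put(message);

queue.Close();
queueManager.Disconnect();
```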
Using this code you should be able to drop messages, connect to the queues, and achieve the result you want!
But if something goes wrong, you can always turn to the tracing logs made available by the library.
Wiliam is from Porto Alegre, Brazil, and currently works as a DevOps Engineering Advisor at Dell EMC. He has been working with Microsoft technologies for almost ten years, with a one-year gap studying abroad in Japan through the Science without Borders program of the Brazilian government. He also holds the MCSE: Productivity certification and is a SharePoint lover.
Nowadays he invests most of his time in exploring GitLab and Azure DevOps features and helping the company develop a DevOps culture among his colleagues.
In addition to that, he spends some time learning Japanese, doing CrossFit, and playing real-time strategy games like Age of Empires.