<h1>Querying the Azure DevOps Work Items API directly from Power BI</h1>
<p style="clear: both"> <img src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80/assets/images/blog/2016/07/header-querying-the-azure-devops-work-items-api-directly-from-power-bi.png" /></p>
<p>As Partners for Power BI, endjin is doing more and more interesting things with data and visualisations every week.</p>
<p>Recently we were asked to develop some custom visualisations based on work item KPI data from inside Azure DevOps (ADO), which required us to get to grips with the underlying ADO REST API.</p>
<p>This post walks through that process, explaining how to connect to and query ADO from within Power BI, use Power BI functions to create re-usable sub-queries that can be composed into larger queries and will hopefully leave you with a solid base for developing your own custom VSTS-based Power BI charts and visualisations.</p>
<p>Before we go any further, I should highlight that if you're just starting out with surfacing your ALM data inside Power BI, there's already a number of pre-packaged content packs, including one for ADO.</p>
<p>This provides a number of incredibly useful visualisations over your Git commits and work items. However, although new reports can be built on top of the underlying data sets that a content pack provides, if there's data that you need that isn't exposed in the content pack, there's not a lot you can do - the content packs are by definition non-editable, so what goes on under the covers is a black box.</p>
<p>This was exactly the situation we were in: the data that we needed to support the custom KPIs was not retrieved by the content pack, so we had no choice but to build custom queries ourselves. Another reason for going down this route is that the ADO content pack is only available in the online version of Power BI - if you want to develop locally with Power BI Desktop, then you'd also be restricted to this approach. Either way, this post will show you that creating custom queries over the ADO REST API is not only possible, but straightforward, and it will give you a totally flexible solution for retrieving any data that is exposed through the ADO API.</p>
<h2>Overview</h2>
<p>The solution described here relies on calling various methods in the ADO REST API to retrieve detailed work item data; however, the same principles could be applied to any of the entities retrievable from the API. Due to the endpoints and operations that are exposed, the following steps are needed to surface the data:</p>
<p><ol>
<li>Execute an ADO stored query using the work item query API to retrieve a list of work item IDs</li>
<li>Split the list of work item IDs into groups of 200, which is the maximum batch size that the work items API supports</li>
<li>Call the work items API for each list of 200 work item IDs to get the work item field data</li>
<li>Combine into a single dataset for use in charts and visualisations</li>
</ol>
</p>
<h2>Set up the Azure DevOps query</h2>
<p>The ADO API does include an endpoint that allows you to execute "work item query language" as part of the request, meaning dynamic queries could be composed and executed directly from your client application. However, Power BI doesn't support this - it requires POSTing to an authenticated URI with body parameters, which it deems insecure, as this scenario in REST would typically be used to update or delete data.</p>
<p>However, all is not lost, as the work item query endpoint also allows a stored query (i.e. an existing saved ADO query) to be executed by specifying its ID in the request.</p>
<p>So, the first step is to create a new query, or identify an existing one, that you wish to use to retrieve the set of work item IDs that you care about. This query will be the backbone of everything else that happens in Power BI - we'll be hard-coding the query ID into the Power BI query - so all subsequent steps will work on that set of work item IDs. If you intend to create charts or visualisations across a number of iterations, or projects, or areas, then it would be advisable to make this query as wide as possible - grab as much data as you care about in this query so you can then apply the filtering dynamically inside Power BI.</p>
<p>However, bear in mind that some ADO endpoints will only accept a single ID at once, rather than a batch request - for example, if you wanted to retrieve the update history for each work item - and these would require a subsequent request for every ID returned from the original query, which will have a direct impact on performance. The key is finding the right balance of flexibility (by bringing back lots of data into Power BI for filtering), and performance (by only bringing back the data you actually need to limit the number of API calls required) and this will totally depend on your own specific circumstances and data sets.</p>
<h3>Creating dynamic Azure DevOps queries</h3>
<p style="clear: both"><img style="float: left; margin: 0 10px 5px 0;" src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80/assets/images/blog/2016/06/VSTS-query-1.png" />An alternative way to approach the ADO query could be to use the dynamic macros inside the ADO query so that the query itself always returns up to date data - e.g. filtering with Iteration Path = @CurrentIteration means that as you move into new iterations, the query will always return current work item data. Running the ADO query inside ADO will show you the data set that you'll have inside Power BI to use. Once you've been through the following steps and wired up Power BI to this query, any subsequent changes to this query will automatically affect the data set retrieved in Power BI when it is refreshed.</p>
<h3>Get the Azure DevOps query ID</h3>
<p>Whether you create a new query, or use an existing one, you'll need to obtain the GUID for this query so that it can be used from inside the Power BI query. This can be retrieved from the ADO API by executing the following request in a browser, or HTTP Client (e.g. Postman):</p>
<blockquote><p>https://<strong>[instance-name]</strong>.visualstudio.com/DefaultCollection/<strong>[project-name]</strong>/_apis/wit/queries/<strong>[path-to-query]</strong>/</p></blockquote>
<p><ul>
<li><strong>[instance-name]</strong> is the ADO instance name</li>
<li><strong>[project-name]</strong> is the ADO project name</li>
<li><strong>[path-to-query]</strong> is the path to the query, e.g. "Shared Queries/Current Sprint/Product Backlog"</li>
</ul>
</p>
<p>e.g. https://endjin.visualstudio.com/Default ... ies/Shared Queries/Current Sprint/Product Backlog/</p>
<p>The response will look something like this, and the value you're interested in is the first "id" value - in this example it is <strong>AC9C8A69-A593-4E7F-BB63-1B93930FEEAC</strong>. Make a note of this GUID as you'll need it in the next step.</p>
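<p>The raw request is easy to reproduce outside the browser too. As a hedged sketch (the account, project and query path are placeholders, and a personal access token is assumed for authentication), the equivalent PowerShell would be:</p>
<pre>
# Build a Basic auth header from a personal access token (PAT)
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String(
                  [Text.Encoding]::ASCII.GetBytes(":$pat")) }

$uri = "https://[instance-name].visualstudio.com/DefaultCollection/[project-name]" +
       "/_apis/wit/queries/Shared%20Queries/Current%20Sprint/Product%20Backlog?api-version=1.0"

# The response is a JSON document; the value we need is its "id" property
$query = Invoke-RestMethod -Uri $uri -Headers $headers
$query.id
</pre>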
<h2>Execute the Azure DevOps query from a custom Power BI query</h2>
<p>Now that you have a query inside ADO that returns a list of work item IDs, you can create a Power BI query to execute it. Open Power BI Desktop and add a new custom query through the Get Data | Blank Query option. Name the query "GetWorkItemIds" - (this is important as the subsequent code examples will refer to this query by name) and open the Advanced Editor to edit the contents of the query.</p>
<p>Copy and paste the following query code into the Advanced Editor (replacing the default skeleton query code). It creates a Power BI function - i.e. a reusable query that can be called from inside other queries - which calls the ADO work item query API, specifying the query ID and returning the list of work item IDs.</p>
<p>You'll need to update the following values in order to execute the function:</p>
<p><ul>
<li><strong>[instance-name]</strong> is the ADO instance name</li>
<li><strong>[project-name]</strong> is the ADO project name</li>
<li><strong>[query-guid]</strong> the GUID value of the ADO saved query in the previous step</li>
</ul>
</p>
<p>Close the Advanced Editor to apply the changes, and the query should show as a function inside Power BI with an "Invoke" button - a bit like this:</p>
<p style="clear: both"><img src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80/assets/images/blog/2016/06/Invoke-Function.png" /></p>
<p>Clicking the Invoke button will do just that, and, if all is well you should see a list returned of the IDs of the same work items returned by the ADO query.</p>
<p><strong>Note</strong> - "invoking" a Power BI function inside the query editor adds an extra Applied Step to the query. This causes problems when other queries depend on it, so make sure you remove the Applied Step afterwards each time by clicking the X next to the Invoked Function GetWorkItemIds step.</p>
<p style="clear: both"><img src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80/assets/images/blog/2016/06/Invoked-Function.png" /></p>
<h3>Authenticating with the ADO service</h3>
<p>When the function is invoked, you will need to authenticate against the ADO service, using Basic, Organisational Account, or OAuth authentication, depending on what is supported in your ADO service.</p>
<p>Two-factor authentication is not supported in Power BI yet for this type of data source, but Basic authentication can be used by setting up alternate authentication credentials inside the ADO Security Profile and specifying the username and password.</p>
<p>Whichever authentication method you choose to use, make sure that you set the authentication credentials to apply at the top-level domain, so that all other ADO queries against the same API will be automatically authenticated.</p>
<h2>Use the list of IDs to get the work item details</h2>
<p>Now that you're retrieving the list of work item IDs, the following steps use that list to call additional ADO APIs to retrieve specific pieces of data about the work item. In this scenario, we're retrieving all the primary work item fields available, but deeper API calls can be made to retrieve collections of child data about a specific work item - e.g. all the history records as the work item has changed over time, or all the linked work items. In those cases, you'd be calling the API once per work item ID but if you're querying the work items API, you have the ability to request the details for a list of up to 200 work item IDs at a time.</p>
<h3>Paging</h3>
<p>The next query to create is another function that handles this requirement to page the list of work item IDs into groups of 200. Use Get Data | Blank Query to create a new query and open the Advanced Editor. Copy and paste the following query code into the Advanced Editor (replacing the default skeleton query code) and name the function "GetWorkItemIdsPages" - again, the name is important as the functions are referenced by name from other queries.</p>
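<p>Again, the original M code is not reproduced here, but the logic it implements is simple to sketch. In PowerShell terms (a minimal illustration, not the article's code):</p>
<pre>
# Split a flat list of ids into pages of at most 200 - the maximum
# batch size supported by the work items API
$pageSize = 200
$pages = [System.Collections.Generic.List[object]]::new()
for ($i = 0; $i -lt $ids.Count; $i += $pageSize) {
    $end = [Math]::Min($i + $pageSize - 1, $ids.Count - 1)
    $pages.Add($ids[$i..$end])
}
</pre>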
<h3>Calling work items API</h3>
<p>Once you have the paging function in place, you can add a third function to actually call into the work items API to retrieve the details - for every work item ID in a list, batched into pages of 200. Use Get Data | Blank Query again and open the Advanced Editor. Copy and paste the following query code into the Advanced Editor (replacing the default skeleton query code) and name the function "GetWorkItems" - again, the name is important as the functions are referenced by name from other queries.</p>
<p>You'll need to update the following values in order to execute the function:</p>
<p><ul>
<li><strong>[instance-name]</strong> is the ADO instance name</li>
</ul>
</p>
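<p>With the original M code missing from this copy, here is a hedged PowerShell sketch of what GetWorkItems does: one request to the work items endpoint per page of ids, with the ids joined into a comma-separated list:</p>
<pre>
# Fetch the field data for every page of (up to 200) work item ids
$workItems = foreach ($page in $pages) {
    $idList = $page -join ","
    $uri = "https://[instance-name].visualstudio.com/DefaultCollection" +
           "/_apis/wit/workitems?ids=$idList&api-version=1.0"
    (Invoke-RestMethod -Uri $uri -Headers $headers).value
}
</pre>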
<h2>Combining the queries to return the work item data</h2>
<p>Finally, you now need to add the last query that ties everything together - using the GetWorkItemIds function to retrieve the master list of IDs, passing that into GetWorkItems, which subsequently uses GetWorkItemIdsPages to slice up the list of IDs into pages of 200 before calling into the work items API to retrieve the work item data fields.</p>
<p>Use Get Data | Blank Query again and open the Advanced Editor. Copy and paste the following query code into the Advanced Editor (replacing the default skeleton query code) and name the query "WorkItems". This query isn't created as a function - it's a regular Power BI query that will return and display data that can be used in charts and visualisations in the reports view.</p>
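<p>Conceptually, the WorkItems query is nothing more than the composition of the three functions. If the sketches above were wrapped into functions of the same names (a hypothetical rendering, not the article's M code), it would read:</p>
<pre>
$ids       = GetWorkItemIds            # stored query -> list of ids
$pages     = GetWorkItemIdsPages $ids  # list of ids  -> pages of 200
$workItems = GetWorkItems $pages       # pages        -> work item field data
</pre>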
<p>At this point, you should be able to see the work item field data in the query view, as below. This data set can now be used in the report builder view to create charts and visualisations over the work item data. The code used in the above sample includes all the "System.X" fields from the ADO API, but there will be additional fields available depending on the work item template you use and any customisations you may have made. Now that you have the data in the WorkItems query, formatting the values, renaming the columns, adding calculated fields is all possible as with any Power BI data source.</p>
<p style="clear: both"><img src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80/assets/images/blog/2016/06/Work-Items.png" /></p>
<h2>Next steps</h2>
<p>The steps above were deliberately broken down into reusable functions so that this solution can be extended further according to your own requirements. The root of it all - the GetWorkItemIds function - can be reused to retrieve the list of work item IDs, which could then be passed into any of the other ADO APIs to retrieve different data attributes of the work item, either individually or in batches using additional functions.</p>
<p>At a higher level, the same approach - querying for IDs, batching into pages and executing subsequent API calls - can be applied to retrieve other data types via the other entity API endpoints in ADO, or entirely different REST APIs that follow a similar pattern.</p>
<h3>James Broome </h3>
<h4>Director of Engineering</h4>
<p style="clear: both"><img src="https://res.cloudinary.com/endjin/image/upload/f_auto/q_80w_128/assets/images/endjineers/James.Broome.jpg" /></p>
<p>James has spent nearly 20 years delivering high quality software solutions addressing global business problems, with teams and clients across 3 continents. As Director of Engineering at endjin, he leads the team in providing technology strategy, data insights and engineering support to organisations of all sizes - from disruptive B2C start-ups, to global financial institutions. He's responsible for the success of our customer-facing project delivery, as well as the capability and growth of our delivery team.</p>
<h1>DevOps: Connecting VSTS to Azure</h1>
<p>DevOps is about increasing efficiency and eliminating barriers. There’s a lot of convenience in deploying directly from Visual Studio Team Services into the Azure cloud. With plugin based build and release pipelines, it’s very easy to quickly configure a release and see the results online. You can quickly configure a release to deploy to Azure; the wizards and settings will automatically configure the necessary permissions and credentials in the two systems. <em>Everything just works …</em> unless you’re running VSTS in a separate account or environment from the target Azure subscription. In this post, I’m going to walk you through the black art of manually connecting VSTS to an Azure subscription to enable automated release and deployment pipelines.</p>
<h2>Getting started in VSTS</h2>
<p>To demonstrate the problem and how to fix it, we’re going to create a very simple Release pipeline. In VSTS, begin by opening Build & Release. From there, you can choose Releases and create a new release definition by pressing the large “+” on the page. Choose to create a release definition.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb.png" /></p>
<p>Begin with an empty definition.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-1.png" /></p>
<p>For the purposes of this demo, you do not need to associate it with any build artifacts. You can select Choose Later.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-2.png" /></p>
<p>Next, add a task to the release. Choose to add an Azure File Copy task.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-3.png" /></p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-4.png" /></p>
<p>As you start to configure this task, the problem will become apparent. You can only select a Subscription configured to work with your VSTS instance. You will need to click Manage to begin the process of adding a new Service Endpoint to your VSTS account. A service endpoint is essentially a way to connect VSTS to an external system such as Azure. When you attempt to add a new Azure Resource Manager service endpoint, VSTS will display the details of the subscriptions it knows are available to you. If the Azure subscription you want to use is not automatically visible to VSTS, then you will have to configure the connection. This is done manually using the advanced settings. Near the bottom of the dialog, click the link in the instructions that says “use the full version of the endpoint dialog”.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-5.png" /></p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-6.png" /></p>
<p>Configuring this is not as difficult as it seems, but there are a few steps involved. Let’s walk through the fields and where you find those values in your Azure subscription. The <strong>Connection Name</strong> is a friendly name that will be displayed to you whenever you need to use the Azure service. I recommend that you consider making this an intuitive name, such as the name of the subscription. If <strong>Environment</strong> is available, this is the specific type of Azure cloud being connected. Unless you are using one of Microsoft’s Azure Government clouds, you can use the default setting (AzureCloud). For the remaining steps, you will need to log in to the Azure environment you want to connect to. Once logged in, you will be able to find and configure the remaining values.</p>
<h2>Configuring Azure for VSTS</h2>
<p>The <strong>Subscription ID</strong> and <strong>Subscription Name</strong> fields can be found by opening the Subscriptions blade. If you’re not sure where to find it, take advantage of the <em>Search Resources</em> feature found at the top of each page in the Azure portal. When you open the Subscriptions blade, the values you need will be displayed on the screen. Copy these two values to the dialog.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-7.png" /></p>
<p>Next, we need to open the <em>Azure Active Directory</em> blade. Open the Properties menu blade. Copy the Directory ID field to the <strong>Tenant ID</strong>. The Tenant ID is a reference to the Azure Active Directory (AAD) instance within the subscription.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-8.png" /></p>
<p>Next, we need to get values for the two fields related to the Service Principal. A Service Principal (SPN) is essentially an account registration which will have permissions within Azure. By assigning a principal and key, VSTS will be able to authenticate with Azure Active Directory. To do this, we need to create an application and register it within AAD.</p>
<p>While still in the Active Directory blade, open the App Registrations menu blade and select New application registration. In the Create dialog, give the application a friendly name, such as VSTS Connection. Use the default Application Type (Web app/API). The sign-on URL must be unique, but otherwise can have any value. I prefer to use a naming convention that indicates I’m using the application exclusively as an SPN from VSTS. After you have created the application, the Properties menu for the new application will show you the Application ID. Copy the value from Application ID to the <strong>Service Principal Client ID</strong> field.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-9.png" /></p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/clip_image002_thumb.png" /></p>
<p>The final value we need is the <strong>Service Principal Key</strong>. This key is an access key for your application to connect to Azure. Open the Keys menu in your newly created application. In the empty field, type a descriptive name for the key you are wanting to create. Set the expiration time for an appropriate amount of time. Press Save to commit your changes. The application’s key is displayed. Copy this value to <strong>Service Principal Key</strong>.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-10.png" /></p>
<p>Although you have the values needed to configure VSTS, there is one more step – giving the application permissions on Azure. Without this step, VSTS will be able to connect to Azure but won’t have permission to utilize any of the resources. Each resource in Azure has a menu blade for Access control (IAM). Following a principle of least privilege, you might consider assigning the permissions at the Resource Group level. Unfortunately, you would find that the particular task we selected – File Copy – fails. In order to be able to use a storage account with just a name (no credentials), this task queries all of the available storage accounts in order to find the requested account. Next, it retrieves the connection details and keys. To do this requires permissions at the subscription level.</p>
<p>For this reason, VSTS SPNs usually have their permissions applied in the <em>Subscription</em> blade. When VSTS can connect and automatically configure the SPN, it will grant permission at this level. More specifically, it will assign the Contributor role, granting it more extensive access to the account. To keep everything simple, we will do the same.</p>
<p>Open the Subscription blade and select the targeted subscription. Within that blade, open the Access control (IAM) menu. Click Add to grant our application permission. Choose the Contributor role. Under select, type the name of the application that you created. At the time of this writing, there is an odd bug with auto-complete in Azure. You will have to type the complete name of the application in order for the auto-complete to locate your application. Once it appears, click the application to mark it as selected. Then, press Save to confirm your changes and confirm the updated permissions.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-11.png" /></p>
<p>Return to VSTS and test your connection. At this point, the Service Endpoint should verify and show as able to connect.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-12.png" /></p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-13.png" /></p>
<p>You can now save the Endpoint and begin using this Azure subscription in your Build and Release tasks. Return to our File Copy task and refresh the subscription drop down. You will see your Azure subscription is now available.</p>
<p style="clear: both"><img src="https://www.wintellect.com/wp-content/uploads/2017/08/image_thumb-14.png" /></p>
<p>Now you know how to connect external Azure accounts to your VSTS environment. Along the way, you’ve hopefully also gained some understanding of setting up an application in Azure and configuring its permissions. Happy DevOp’ing!</p>
<h1>10 Day to Day activities for a DevOps Engineer and Team</h1>
<h3>Subodh Jain</h3>
<h4>Engineering Leader for Current Product Engineering (ASDS Product Line) at Veritas Technologies LLC</h4>
<p>On my earlier post, "What is DevOps?", I had a few comments asking about the day-to-day activities that DevOps folks are involved in, and what a DevOps team can achieve.</p>
<p>I am penning down some of my thoughts from my own experience, as well as some study I have done. Please review.</p>
<p><ol>
<li><strong>Make sure that the pipeline is running smoothly:</strong> This is one of the most important tasks of a DevOps engineer - making sure that the CI/CD pipeline is intact, and fixing any issue or failure with it is the #1 priority for the day. They often need to spend time on troubleshooting, analysing and providing fixes to issues.</li>
<li><strong>Interaction with other teams:</strong> Co-ordination and collaboration are key for DevOps to be successful, and hence daily interaction with the Dev and QA teams, Program Management and IT is always required.</li>
<li><strong>Work on the Automation Backlog:</strong> Automation is the soul of DevOps, so DevOps engineers need to plan it out, and I can see a DevOps engineer spending lots of time behind the keyboard automating things on a daily basis.</li>
<li><strong>Infrastructure Management:</strong> DevOps engineers are also responsible for maintaining and managing the infrastructure required for the CI/CD pipeline, and making sure that it is up and running and being used optimally is also part of their daily schedule, e.g. working on backup, high availability, new platform setup etc.</li>
<li><strong>Dealing with legacy stuff:</strong> Not everyone is lucky enough to work on the latest and newest things, and DevOps engineers are no exception, so they also need to spend time on legacy systems, supporting them or migrating to the latest.</li>
<li><strong>Exploration:</strong> DevOps leverages a lot from the various tools which are available, many of them open source, so the team needs to check on these regularly to drive adoption as required. This also requires some effort, not daily but on a regular basis, e.g. what open source options are available to keep costs at a minimum?</li>
<li><strong>Removing bottlenecks:</strong> DevOps' primary purpose is to identify bottlenecks and manual handshakes, and to work with everyone involved (Dev, QA and all other stakeholders) to remove them, so the team spends a good amount of time finding such things and building the Automation Backlog from them, e.g. how can we get builds faster?</li>
<li><strong>Documentation:</strong> Though Agile / DevOps stresses documentation less, it is still important work which a DevOps engineer does on a daily basis. Be it server information, daily and weekly charts, Scrum / Kanban boards, or simple steps to configure, back up or modify the infrastructure, you need to spend a good amount of time coming up with these artifacts.</li>
<li><strong>Training and Self-Development:</strong> Self-learning and training are very useful in getting a better understanding, and many organisations encourage their employees to take the time out for them. The same holds true for DevOps folks as well, so learn something new every day.</li>
<li><strong>Continuous Improvement as Practice:</strong> Last but not least, it's up to the DevOps folks to build awareness of the potential of CI/CD and DevOps practices, and to build a culture of leveraging them for doing things better, reducing re-work, increasing productivity and optimising the use of existing resources. Go and talk to people to build the DevOps and Continuous Improvement culture.</li>
</ol>
</p>
<p>Refer to my earlier post on What is DevOps to know more about it. Please feel free to share / comment.</p>
<h3>Posted by</h3>
<h3>Subodh Jain</h3>
<h4>Engineering Leader for Current Product Engineering (ASDS Product Line) at Veritas Technologies LLC</h4>
<p>DevOps, CI\CD, Agile, Continuous Improvement</p>
<h1>How to organize VSTS build output into separate folders?</h1>
<p>I have seen teams not care about the structure of the build artifact, and instead add several steps in their release pipeline to unclutter the binaries generated from the build output. It is fairly trivial to control the structure in which the build output gets published as an artifact to your TFS or VSTS build. In the spirit of pushing left (the DevOps way of working), in this blog post I'll show you how to easily structure the build output into separate folders by using the copy task in the build pipeline.</p>
<h2>Problem Statement</h2>
<p>I have a solution file which, when processed through the build pipeline, generates a build output and adds every folder that has any DLLs into the build output. Let's double-click to see where the real problem is:</p>
<p>Folders that I have no interest in have been added as a build artifact</p>
<p style="clear: both"><img src="https://www.visualstudiogeeks.com/images/screenshots/tarun/Mar18/002_defaultoutputhasfoldersthatidontneed.jpg" /></p>
<p>The folders are nested into sub folders, and further sub folders, for the stuff that I am interested in</p>
<p style="clear: both"><img src="https://www.visualstudiogeeks.com/images/screenshots/tarun/Mar18/001_defaultbuildoutput.jpg" /></p>
<h2>Solution</h2>
<p>A vanilla build pipeline adds one copy step and one publish step in the build pipeline:</p>
<p style="clear: both"><img src="https://www.visualstudiogeeks.com/images/screenshots/tarun/Mar18/003_defaultpipelineonecopyandonepublishstep.jpg" /></p>
<p>The build generates the binaries in the $(build.defaultworkingdirectory) folder; the copy step simply copies all the binaries from the default working directory $(build.defaultworkingdirectory) using the pattern **\bin\$(BuildConfiguration)\** into the artifact staging directory $(build.artifactstagingdirectory). The publish step, as you would expect, publishes everything that it finds in the $(build.artifactstagingdirectory).</p>
<blockquote><p>The build copy step is your friend if you use it correctly.</p></blockquote>
<p>Instead of overloading just one copy step to copy everything, I have added multiple copy steps. I've fully qualified the folders in each step to get rid of the multi-level hierarchies in the artifact folder - see the example below:</p>
<p style="clear: both"><img src="https://d33wubrfki0l68.cloudfront.net/b91a7409dca3741f4e6f07f5d623e0dde7239aa5/eab0e/images/screenshots/tarun/mar18/004_multicopystepfullyqualifyandaddsubfolder.jpg" /></p>
<p>With the new copy tasks in, the build output now looks like this:</p>
<p style="clear: both"><img src="https://www.visualstudiogeeks.com/images/screenshots/tarun/Mar18/005_buildoutputafterproposedmultistepcopy.jpg" /></p>
<p>I hope you find this useful! #DevOpsOn</p>
<p>Know a better way of doing this? Leave a comment!</p>
<h1>How to create and configure Azure DevOps Pipelines Agent</h1>
<h2>This article aims to help you choose and configure an agent for your CI Pipeline.</h2>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*ykMbX6S1sexCUvsAldnXJw.jpeg" /></p>
<h1>Introduction</h1>
<p>When getting to know Azure Devops, the most difficult question is probably what steps you need to follow to make your CI Pipeline work correctly.</p>
<p>The crucial point in running the Azure CI Pipeline is the work of the Azure Pipelines Agent, whose primary task is to establish a connection with the machine on which you want to run your DevOps CI Pipeline.</p>
<p>In this article, we will consider the following issues to help you choose an agent for your CI Pipeline:</p>
<p><ul>
<li>MS-agent vs Self-hosted agent</li>
<li>Advanced information about MS-agents</li>
<li>When to use MS-agents and when to use self-hosted agents</li>
<li>Configuring MS-agent</li>
<li>Configuring self-hosted agent.</li>
</ul>
</p>
<p>We will also take a look at the differences between the two agents to help you decide which one best suits your needs.</p>
<p>The figure below displays the Azure CI Pipeline workflow with a Microsoft-hosted agent and a self-hosted agent.</p>
<p style="clear: both"><img src="https://miro.medium.com/max/56/1*sSPmoLt9yxrWtoOa6cC7TQ.png" /></p>
<p>As you can see from the figure above, your pipeline has to be run in some kind of operating environment on some machine. A Microsoft-hosted agent and a self-hosted agent are bound to manage that pipeline execution.</p>
<p>In the case of using an MS-agent, execution takes place in one of the ready-made operating environments in Azure - Windows, macOS, or Ubuntu - which is deployed to Azure on the fly (provided as SaaS).</p>
<p>In the case of using a self-hosted agent, execution takes place on your physical or virtual machine (an on-premise environment).</p>
<h1>MS-agent vs Self-hosted agent</h1>
<p>Thus, the primary difference between an MS-agent and a self-hosted agent lies in the fact that with an MS-agent, you get a ready-prepared virtual machine in the cloud with a pre-installed operating system (Windows, macOS, or Ubuntu). You get a fresh virtual machine each time you run a pipeline.</p>
<p>If you use a self-hosted agent, the pipeline will be executed on your physical machine.</p>
<p>The table below shows the main comparative characteristics of the two agents.</p>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*ThIGx2TmrYl31fHq2-kLnw.png" /></p>
<h1>Advanced information about MS-agents</h1>
<p>You can get information about the pre-installed software or hardware for a specific MS-agent using PowerShell. For example, to receive information about software and hardware for MS-Agent and a Windows-2019 image, execute the following PowerShell script.</p>
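<p>The script itself is not reproduced in this copy; a minimal sketch of the kind of PowerShell you could drop into a pipeline task for this purpose is:</p>
<pre>
# Report the OS, CPU, memory and disk of the agent VM
Get-CimInstance Win32_OperatingSystem |
    Select-Object Caption, Version, OSArchitecture

Get-CimInstance Win32_Processor |
    Select-Object Name, NumberOfCores, NumberOfLogicalProcessors

Get-CimInstance Win32_ComputerSystem |
    Select-Object @{ n = 'TotalMemoryGB'; e = { [math]::Round($_.TotalPhysicalMemory / 1GB, 1) } }

Get-PSDrive -PSProvider FileSystem |
    Select-Object Name, @{ n = 'FreeGB'; e = { [math]::Round($_.Free / 1GB, 1) } }
</pre>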
<p>By executing the script, we get the following information about the virtual machine:</p>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*Gsyahi8l-6h7nURuK_M3qQ.png" /></p>
<p>As mentioned in the table above, some virtual machine images for MS-agent have a broad range of useful pre-installed programs. To find out what programs are pre-installed for a Windows-2019 image, we can create a PowerShell task with the following script:</p>
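<p>Again, the original script is missing from this copy; one common way to enumerate installed software on a Windows image (a sketch, not necessarily the author's script) is to read the uninstall registry keys:</p>
<pre>
# List software registered in the 64-bit and 32-bit uninstall keys
$keys = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*",
        "HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*"

Get-ItemProperty $keys |
    Where-Object DisplayName |
    Sort-Object DisplayName |
    Select-Object DisplayName, DisplayVersion
</pre>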
<p>After executing this script, we will see a list of pre-installed useful programs (this list is very long and the figure below displays only a part of it):</p>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*eZzSn-zD4-n2FRBzLNpcmQ.png" /></p>
<h1>When to use MS-agents and when to use self-hosted agents</h1>
<p>If you have a small project whose source code does not take much time to download, and which does not require a lot of time to compile or unit test, then a Microsoft-hosted agent will be the best choice. As indicated in the comparison table above, the virtual machine image and all pre-installed programs (Visual Studio, Python, Chrome, etc.) for an MS-agent have the latest updates. After executing a pipeline, this virtual machine image is discarded and does not affect the execution of subsequent pipelines.</p>
<p>However, if you have problems related to performance, accessibility, security, or other problems of that kind, it is better to settle upon a self-hosted agent. One of the important drawbacks of self-hosted agents is that you will have to keep track of operating system updates, as well as clear the disk of artifacts between pipeline jobs.</p>
<h1>Configuring MS-agent</h1>
<p>The setup of Microsoft-hosted agent is quite simple and straightforward:</p>
<p><ol>
<li>In the <strong>Agent pool</strong> combo box, select <em>Azure Pipelines</em>.</li>
</ol>
</p>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*FxDrNUK4m92V35ji_FWJiA.png" /></p>
<p>2. In the <strong>Agent Specification</strong> drop-down list box, select the required virtual machine image.</p>
<h1>Become a DevOps Engineer for Microsoft Azure</h1>

<h6>Estimated Time</h6>
<h5>3 months</h5>
<p>At 5-10 hours/week</p>
<h6>Prerequisites</h6>
<h5>Intermediate Python, familiarity with Linux shell scripting and cloud concepts</h5>
<h4>What You Will Learn</h4>
<h6>SYLLABUS</h6>
<h2>DevOps Engineer for Microsoft Azure</h2>
<p>Microsoft Azure is one of the most popular cloud services platforms used by enterprises, making it a crucial tool for cloud computing professionals to add to their skillset. The DevOps Engineer for Microsoft Azure Nanodegree program teaches students how to deploy, test, and monitor cloud applications on Azure, thereby preparing learners for success on Microsoft's AZ-400 DevOps Engineer Expert certification exam.</p>
<p>3 months to complete</p>
<h6>Prerequisite Knowledge</h6>
<p>Intermediate Python, familiarity with Linux shell scripting and cloud concepts. See detailed requirements.</p>
<h4>Azure Infrastructure Operations</h4>
<p>In modern deployments, automated deployment and management of cloud infrastructure is crucial for ensuring the high uptimes that customers expect. Understand the DevOps lifecycle and the basics of infrastructure management in Microsoft Azure. Learn about cloud security best practices to keep infrastructure secure. Leverage modern technologies to create robust and repeatable deployments in Microsoft Azure.</p>
<p>Deploying a Web Server in Azure</p>
<h4>Agile Development with Azure</h4>
<p>Automated deployment of high-quality software using DevOps principles is a critical skill in the cloud era. Master the theory and practice of Agile project management with hands-on examples. Execute a Python-centric Continuous Integration strategy that uses testing best practices, including open source code quality tools such as pylint and pytest. Couple Infrastructure-as-Code (IaC) with Continuous Delivery using Azure Pipelines to streamline the deployment of applications to Azure.</p>
<p>Building a CI/CD Pipeline</p>
<h4>Ensuring Quality Releases (Quality Assurance)</h4>
<p>Applications that have been built and released into the cloud need to be evaluated to ensure proper performance. Test cloud-based application performance and functionality within the pipeline itself, as well as after it has been deployed by using different types of test suites such as Selenium and Postman. Exercise those test suites against a variety of endpoints, including a sample eCommerce UI, and REST APIs. Build a systemic application monitoring process based on alert triggers in Azure Monitor and custom log files in Azure Log Analytics.</p>
<h1>Run Azure DevOps Private Agents in Kubernetes Clusters</h1>
<p>In the last post, we discussed how to run Azure Pipelines agents as Docker containers and configure them accordingly; the next step is to run them on the Kubernetes platform. This Kubernetes cluster can be on-premise and/or in the cloud, and could be self-managed or managed by the cloud service provider itself.</p>
<p>One of the reasons you may want to run them on Kubernetes is better utilization of your Kubernetes cluster. Another reason might be to leverage your existing knowledge of the Kubernetes platform and work with it. Yet another reason would be to avoid Microsoft-hosted agents, as by default you get only 1800 minutes of agent time to utilize on free accounts. <br /></p>
<p>A not-much-talked-about option is the need to run testing (functional or integration testing) against private Kubernetes services which are otherwise not publicly available, and so can be accessed only from other services within the same cluster. By running an Azure Pipelines agent in the cluster, we make it possible to test any service, regardless of type. This is going to be the focus of this blog post.</p>
<h3>Provision an Azure Kubernetes Cluster</h3>
<p>For this we'll be using the Azure CLI. Make sure the Azure CLI is installed and configured on your machine.</p>
<h4>Create Azure Resource Group</h4>
<p>An Azure resource group is a logical group in which Azure resources are deployed and managed. It is mandatory for any resource to have an associated resource group. So the very first step is to create an Azure Resource Group.</p>
<p>To create one, we can use the below command:</p>
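<p>The original command is not reproduced here; a minimal sketch, with a hypothetical group name and region, would be:</p>
<pre>
az group create --name aksDemoRg --location westeurope
</pre>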
<h4>Create AKS Cluster</h4>
<p>We can use the az aks create command to create an AKS cluster. The following example creates a cluster named aksDemo with one node.</p>
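<p>The exact command from the post is not preserved; a sketch matching the description (the resource group name is an assumption carried over from the previous step) would be:</p>
<pre>
az aks create --resource-group aksDemoRg --name aksDemo `
    --node-count 1 --enable-addons monitoring --generate-ssh-keys
</pre>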
<p>The command may take a few minutes to complete as it needs to provision lots of resources. Note that monitoring for resources is also enabled using the --enable-addons switch. If everything runs fine, the output will be JSON-formatted information about the cluster.</p>
<h3>Connect to AKS Cluster</h3>
<p>To manage a Kubernetes cluster, we need kubectl, which is used to connect to and manage Kubernetes clusters. To install kubectl locally, we can use the below command:</p>
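<p>The command was not preserved in this copy; with the Azure CLI already available, the usual way is:</p>
<pre>
az aks install-cli
</pre>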
<p>If it's installed already, there is no need to run the above command. Once it's installed, we need to connect to the AKS cluster created above. For this, we can run the below command:</p>
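<p>Presumably this is az aks get-credentials, using the names assumed earlier:</p>
<pre>
az aks get-credentials --resource-group aksDemoRg --name aksDemo
</pre>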
<p>This will save the cluster connection information in the local .kube directory. To verify that we are able to connect to the cluster, run the below command:</p>
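<p>The original command isn't shown, but the standard check is:</p>
<pre>
kubectl get nodes
</pre>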
<p>Make sure the status of the node is Ready before we proceed further.</p>
<h3>Deploy the Agent as Docker Container</h3>
<h4>Create an Agent Pool</h4>
<p style="clear: both">We?ll need to create an organization agent pool, a project level agent pool and a queue. This all can be created in one step by going to project settings -> agent pools: <br /><img style="float: left; margin: 0 10px 5px 0;" src="https://metavrse.files.wordpress.com/2018/12/create-a-project-agent-pool.jpg" /></p>
<h4>Create a Personal Access Token</h4>
<p>A personal access token or PAT is required so that a machine can join the pool created above, with the Agent Pools (read, manage) scope. We can create the token from our profile:</p>
<p style="clear: both"><img src="https://metavrse.files.wordpress.com/2018/12/create-a-personal-access-token-02.jpg" /></p>
<h4>Save the Token Configuration inside Kubernetes</h4>
<p>Take the token and create a secret in Kubernetes containing the token and the account name:</p>
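<p>The command itself is missing from this copy, but the secret name and keys are dictated by the deployment manifest below, so it would look like this:</p>
<pre>
kubectl create secret generic vsts `
    --from-literal=VSTS_ACCOUNT="<account-name>" `
    --from-literal=VSTS_TOKEN="<pat-token>"
</pre>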
<p>Replace the values in the above command as appropriate for your account.</p>
<h4>Deploy the VSTS Docker Agent on AKS</h4>
<p>For this, we can use the below Kubernetes configuration:</p>
<pre>
apiVersion: v1
kind: ReplicationController
metadata:
  name: vsts-agent
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: vsts-agent
        version: "0.1"
    spec:
      containers:
      - name: vsts-agent
        image: microsoft/vsts-agent:ubuntu-16.04-docker-18.06.1-ce-standard
        env:
        - name: VSTS_ACCOUNT
          valueFrom:
            secretKeyRef:
              name: vsts
              key: VSTS_ACCOUNT
        - name: VSTS_TOKEN
          valueFrom:
            secretKeyRef:
              name: vsts
              key: VSTS_TOKEN
        - name: VSTS_POOL
          value: dockerized-vsts-agents
        volumeMounts:
        - mountPath: /var/run/docker.sock
          name: docker-volume
      volumes:
      - name: docker-volume
        hostPath:
          path: /var/run/docker.sock
</pre>
<h4>Verify the Agent is connected fine</h4>
<p>We can run the below command to check if the Azure Pipelines (VSTS) agent is running fine:</p>
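<p>The command itself is not reproduced in this copy; based on the screenshot below, it is the following (assuming the controller name from the manifest above):</p>
<pre>
kubectl describe rc vsts-agent
</pre>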
<p style="clear: both">You should be greeted with an output like below: <br /><img style="float: left; margin: 0 10px 5px 0;" src="https://metavrse.files.wordpress.com/2019/01/kubectl-describe-rc.jpg" /></p>
<p style="clear: both">We should be able to see the agent in the pool as well: <br /><img style="float: left; margin: 0 10px 5px 0;" src="https://metavrse.files.wordpress.com/2019/01/vsts-agent-running-as-k8-container.jpg" /></p>
<h3>Deploy Sample .NET Core Application for testing</h3>
<p>For this, we can use source code from this repo. The respective Kubernetes configuration is covered inside the file named docs/k8config.yml. It consists of a single Service and a ReplicationController:</p>
<pre>
apiVersion: v1
kind: Service
metadata:
  name: dotnetcore
  labels:
    app: dotnetcore
spec:
  type: ClusterIP
  ports:
  - port: 80
    targetPort: 80
    protocol: TCP
    name: http
  selector:
    app: dotnetcore
---
apiVersion: v1
kind: ReplicationController
metadata:
  name: dotnetcore
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: dotnetcore
    spec:
      containers:
      - name: dotnetcore
        image: mohitgoyal.azurecr.io/dotnetcore:14
        ports:
        - containerPort: 80
      imagePullSecrets:
      - name: regcred
</pre>
<p>Do note that this pulls registry information from the Kubernetes secret named regcred. It can be created using the below command:</p>
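<p>The command is not preserved in this copy; the standard form, with placeholder registry credentials, is:</p>
<pre>
kubectl create secret docker-registry regcred `
    --docker-server="<registry>.azurecr.io" `
    --docker-username="<service-principal-id>" `
    --docker-password="<service-principal-password>"
</pre>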
<p>Replace the values with the appropriate values for your environment.</p>
<p>We can also verify that the application is deployed successfully by using kubectl describe rc, which should return an output like below:</p>
<p style="clear: both"><img src="https://metavrse.files.wordpress.com/2019/01/kubectl-get-pods-03.jpg" /></p>
<h3>Write a Sample test</h3>
<p>We can use the below PowerShell code to invoke an HTTP request against the application and analyze the response:</p>
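<p>The original test code is not reproduced here; a minimal sketch, assuming the service is reachable by its in-cluster DNS name dotnetcore, could look like this:</p>
<pre>
# Call the private service from inside the cluster and inspect the response
$Response = Invoke-WebRequest -Uri "http://dotnetcore" -UseBasicParsing

if ($Response.StatusCode -eq 200) {
    Write-Host "Application is up and running"
} else {
    throw "Unexpected status code: $($Response.StatusCode)"
}
</pre>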
<p>We would be particularly interested in $Response.StatusCode - a value of 200 indicates that the application is up and running fine. For simplicity's sake, the above code has been left as such.</p>
<h3>Run the Test Cases on Azure Pipeline Agent</h3>
<p>We can run tests from either a build or a release in Azure Pipelines. Because integration and functional tests tend to run after an app is released to a certain environment, we'll call the test from a release. The test files need to be available as artifacts from a build or from a source repository. Since in our case we are running PowerShell-based tests, we can use a PowerShell task and run an inline script. The important part of the release pipeline is to use a deployment job on the pool name that we created earlier.</p>
<p>Now if we did everything properly, we should be able to create a new release and the test should run successfully in the Kubernetes cluster:</p>
<h1>DataBricks Automation with Azure DevOps</h1>
<p>Databricks on Azure is essential in data, AI and IoT solutions, but the environment automation can be challenging. Azure DevOps is a great tool for automation: using Pipelines and product CLI integrations can minimise or even remove these challenges. My team is currently working on a cutting-edge IoT platform where data flows from edge devices to Azure. We are dealing with data which is sensitive and under GDPR, so no one should have direct access to the data platform in the production environments.</p>
<p>In the project, data is generated by sensors and sent to the cloud by the edge devices. Ingestion, processing and analysis of the data are too complicated for traditional relational databases; for this reason, there are other tools to refine the data. We use Databricks in our Lambda Architecture to batch process the data at rest and for predictive analytics and machine learning. This blog post is about the Databricks cluster and environment management; I'll not go deeper into the architecture or the IoT solution.</p>
<h2>The Automation Problems</h2>
<p>As in any reliable project, we have three environments: development, user acceptance testing (UAT) and production. In my two previous posts, Azure Infrastructure using Terraform and Pipelines and Implement Azure Infrastructure using Terraform and Pipelines, I gave an in-depth review and explanation of why and how Terraform solves environment generation and management problems. Let's study the code Terraform provides for Databricks.</p>
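<p>At its simplest, the workspace needs only a few lines of HCL; the names below are placeholders, and an azurerm_resource_group is assumed to exist:</p>
<pre>
resource "azurerm_databricks_workspace" "this" {
  name                = "dbw-myproject-dev"
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location
  sku                 = "standard"
}
</pre>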
<p><strong>Wait a minute, but that is only the empty environment!<br />What about the Clusters, Pools, Libraries, Secrets and WorkSpaces?</strong></p>
<h2>The Solution, DataBricks Automation with Azure DevOps</h2>
<p>Fortunately, Databricks has a CLI which we can use for Databricks environment automation with Azure DevOps Pipelines. The Pipelines enable us to run PowerShell or Bash scripts as a job step. By using the CLI in our script, we can create, manage and maintain our Databricks environments. This approach removes the need to do any manual work on the production Databricks workspace. Let's review the script.</p>
<p>First, we create a Pool for the cluster, waiting for the completion status and the Id. Then we create a cluster using that Pool and wait for completion. Once the cluster is ready, we can use the cluster id to add Libraries and Workspaces, as in the sketch below. There are two supporting JSON files which include the environment properties.</p>
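<p>A condensed PowerShell sketch of that flow, assuming the legacy Databricks CLI is installed and authenticated, that pool.json and cluster.json are the supporting property files, and that cluster.json references the pool id; the library coordinates are an arbitrary example:</p>
<pre>
# Create an instance pool and capture its id.
$pool   = databricks instance-pools create --json-file pool.json | ConvertFrom-Json
$poolId = $pool.instance_pool_id

# Create a cluster (cluster.json is assumed to reference the pool id) and wait for it.
$cluster   = databricks clusters create --json-file cluster.json | ConvertFrom-Json
$clusterId = $cluster.cluster_id
do {
    Start-Sleep -Seconds 30
    $state = (databricks clusters get --cluster-id $clusterId | ConvertFrom-Json).state
} while ($state -eq "PENDING")

# With the cluster ready, add libraries and import the workspace notebooks.
databricks libraries install --cluster-id $clusterId --maven-coordinates "com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.13"
databricks workspace import_dir ./notebooks /Shared/notebooks --overwrite
</pre>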
<p>In our Azure DevOps Pipelines definition we first have to install the Python runtime and then the Databricks CLI. With the required runtimes in place we can run the script. Here is the code snippet for the Pipelines step:</p>
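<p>A sketch of those steps in YAML pipeline form; the script path and variable name are assumptions:</p>
<pre>
steps:
- task: UsePythonVersion@0
  displayName: Install Python runtime
  inputs:
    versionSpec: '3.x'

- script: pip install databricks-cli
  displayName: Install Databricks CLI

- bash: ./scripts/create-databricks-environment.sh
  displayName: Create the Databricks environment
  env:
    DATABRICKS_TOKEN: $(DatabricksToken)   # secret pipeline variable
</pre>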
<p>To be able to run the script against the Databricks environment you need a token. The token can be generated under the workspace and user settings.</p>
<p style="clear: both"> <img src="https://samanax.com/wp-content/uploads/2020/05/DataBricksToken-1024x272.jpg" /></p>
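<p>Rather than hard-coding the token, it can be supplied through the environment variables the CLI already understands (placeholder values shown):</p>
<pre>
# The Databricks CLI reads these environment variables at run time.
$env:DATABRICKS_HOST  = "https://<region>.azuredatabricks.net"
$env:DATABRICKS_TOKEN = "<personal-access-token>"
</pre>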
<p>The environment variables and settings are in JSON files, and the complete solution for DataBricks Automation with Azure DevOps Pipelines and support tool files are available from my GitHub repository.</p>

Post by SARAKn » 19 May 2021, 18:21

Vsts jenkins - Eduard Kabrinskiy


<h1>Vsts jenkins</h1>
<h3>[Jenkins] Triggering builds from TFS check-ins</h3>
<p>Jenkins natively supports source control systems such as Git and Subversion. If we want a TFS (TFVC) check-in to drive a daily build, we can use the TFS plugin so that a TFS check-in triggers Jenkins.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1534045467_07584.png" /></p>
<p>With TFS (Team Foundation Server) there are two ways to trigger a build:</p>
<p><ul>
<li>Have Jenkins poll TFS for changes on a schedule, or</li>
<li>Have TFS <strong>notify Jenkins whenever code is checked in</strong>, which is the approach used here; this mirrors the difference between Git's <strong>Polling the repository</strong> and <strong>The post-commit Git hook</strong>.</li>
</ul>
</p>
<p>In the rest of this post we use the TFS plugin to set up the CI job; the steps are much the same as they would be for Git or Subversion.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527508510_22079.png" /></p>
<h2>Install the TFS plugin in Jenkins</h2>
<p>Go to Manage Jenkins > Manage Plugins.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527508694_31073.png" /></p>
<p>On the Available tab, search for 'team', tick the Team Foundation Server plugin and click Install.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527508948_58456.png" /></p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509024_64519.png" /></p>
<h2>Set up the TFS check-in trigger</h2>
<p>In TFS, open the team project's administration page > Service Hooks, then click the '+' button to add a new subscription.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509156_53319.png" /></p>
<p>Select Jenkins as the service to notify, then click Next.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509226_63877.png" /></p>
<p>For the trigger, choose <strong>Code checked in</strong>, then click <strong>Next</strong>.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509273_00137.png" /></p>
<p>Enter the Jenkins base URL and the name of the Jenkins job to trigger, then click <strong>Finish</strong>.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509333_79583.png" /></p>
<p>* If your Jenkins server requires sign-in, you will also need to supply a Jenkins user name and API token so that TFS can trigger the build.</p>
<p style="clear: both"><img src="https://dotblogsfile.blob.core.windows.net/user/stanley14/48db4598-3a48-475b-af8d-545c65d4cf31/1527509373_90902.png" /></p>
<p>From now on, every check-in kicks off the CI job automatically; combined with build notifications (Slack and Email), this completes the Jenkins CI job setup.</p>

Post by SARAKn » 19 May 2021, 21:41

Azuredevopslabs - Kabrinskiy Eduard


<h1>Azuredevopslabs</h1>
<h1>Azure DevOps and Jenkins in perfect harmony</h1>
<p style="clear: both"><img src="https://miro.medium.com/fit/c/56/56/1*8tCLF3pBk8nadipPxbHesA.png" /></p>
<h4>Brian Benz</h4>
<h4>Nov 2, 2018 · 11 min read</h4>
<p>I recently had the opportunity to present several talks at DevOps World | Jenkins World 2018 in San Francisco. It was a great experience as always, and I want to say thanks to those of you who visited our booth and checked out the talks from myself and my fellow Microsoft Cloud Developer Advocate Jessica Deen. I've had several follow-up questions on my demos that have inspired me to document and share them here today.</p>
<h1>The Demo Application: Spring Music</h1>
<p>When I build a DevOps demo I like to use a base app that reflects the real-world experiences DevOps professionals encounter in their daily work. I really like using this version of Spring Music because it's a Java-based Spring Boot application that can run by itself or connect to a variety of Azure data services on the back-end, including NoSQL and SQL storage options. It also uses a built-in Gradle wrapper for builds, which makes it portable enough to run on all Azure compute services, including Windows and Linux VMs, Azure Kubernetes Service, and Azure App Service on Linux. This means it's a single application that can show multiple real-world use cases for running and deploying an application in the cloud. It's also a great application for demonstrating multiple real-world use cases for working with popular CI/CD/DevOps tools and services.</p>
<p>Here's a screenshot of the application itself. You can view Rolling Stone's top 500 albums of all time by title, artist, year and genre. There's also a list view, and you can edit and add albums.</p>
<p style="clear: both"><img src="https://miro.medium.com/max/60/1*667ePz0QuXW4BkM6DyZqng.png" /></p>
<h1>Combining Azure DevOps and Jenkins to deploy the app.</h1>
<p>So now we know enough about the app; let's move on to how we get it into the cloud using Azure DevOps and Jenkins. For a complete picture of how Jenkins and Azure DevOps complement each other, have a look at my post on the Microsoft Open Source Blog. In this post we'll cover a couple of key scenarios in depth.</p>
<p>I work a lot with developers who need to manage and deliver software projects that contain multiple languages and platforms. A large number of them use Azure DevOps, the next evolutionary step up from Visual Studio Team Services, which had its origins in Team Foundation Server. They like features like free private Git repos, test plans, Kanban boards and custom team dashboards.</p>
<p>A large number of these developers also have CI/CD pipelines running in Jenkins. These developers have leveraged the incredible selection of Jenkins Plugins available to make building and testing of their code as effortless as possible.</p>
<p>The good news is that you don't have to choose one or the other! Savvy development shops have leveraged the best features of both platforms, and Microsoft has made that easy by developing several plugins that combine the best of both. For example, you can provision and manage Azure Virtual Machines as Jenkins agents with the Jenkins Agent plugin for Azure. You can also deploy build artifacts to several destinations, including cloud storage, using the Azure Storage plugin for Jenkins.</p>
<h1>Prerequisites</h1>
<h1>An Azure DevOps Account</h1>
<p>If you don't have one already, you can get a free Azure DevOps account here.</p>
<h1>Jenkins</h1>
<p>If you already have Jenkins up and running, you can add our plugins from the marketplace. If you don't, we also have you covered with a sample architecture and template that can be deployed securely and easily to Azure on Ubuntu Linux, and includes the most popular Microsoft Azure plugins for Jenkins pre-installed. Either way, I'm going to show you the steps for working with Azure DevOps and Jenkins together to deliver an application to Azure.</p>
<h1>Jenkins Azure Credentials Plugin</h1>
<p>If you use the template above to create Jenkins, the plugins you need are preinstalled. If not, you'll need the Azure Credentials plugin as a minimum, which enables storage of Azure credentials in Jenkins. Check the end of this post for more Jenkins plugins to explore.</p>
<h1>Create a new Azure DevOps project</h1>
<p>Let's set up a base by creating a new project in Azure DevOps and importing the Spring Music GitHub repo into a new Azure DevOps repository. Go to https://dev.azure.com, log in or create your free account, then click on 'Create Project' on the top right and give the project a descriptive name (I called mine spring-music-devops).</p>
<p>By default, the Azure DevOps repository is private, which is helpful if your app contains confidential connections and keys that you need when building and testing your application before it is deployed. Of course, you can also make the repository public, but we'll keep it private for now.</p>
<p>Fill in a name, description and click create:</p>
<p style="clear: both"><img src="https://miro.medium.com/max/48/1*Jgf8-XLNOS0P3vCkkoReuA.png" /></p>
<p>Azure DevOps creates a project you can use to manage Kanban boards, pipelines, tests, and artifacts, but let's focus on setting up the repo for now. Click on <strong>Repos</strong>:</p>
<p style="clear: both"><img src="https://miro.medium.com/max/46/1*7bXxfnyvrVpg8M57dLIoow.png" /></p>
<p>This will create a new repo for our project based on the Spring Music GitHub Repo.</p>
<h1>Import a GitHub repo into your Azure DevOps Project</h1>
<p>Click on the <strong>Import</strong> button under <strong>Import a repository</strong>:</p>
