Indulge your taste buds in a culinary symphony at Barbeque Holic, Kondapur's premier buffet destination. Immerse yourself in a gastronomic journey that transcends ordinary dining. Our lavish spread boasts an array of succulent grilled delicacies, from perfectly marinated meats to delectable vegetarian delights. Savor the smoky perfection of our barbecue specialties, expertly crafted to tantalize your palate. With an extensive selection of appetizers, main courses, and desserts, Barbeque Holic promises a feast for every discerning food lover. Impeccable service and a warm ambience complement the culinary excellence, making it the ultimate destination for a memorable dining experience. Discover the epitome of indulgence at Barbeque Holic in Kondapur.
Posts made by raivivan
Kicknology Solutions stands out as Hyderabad's premier Digital Marketing Agency, driving businesses to success through innovative strategies. With a talented team, they provide comprehensive digital marketing solutions, optimizing online presence, boosting ROI, and achieving marketing goals. Your business's digital success starts here.
I am trying to call a system whose API is not bulkified. I have one parent record and, for example, 1000 child records. To send this information to the other system, I am currently required to make 1000 API calls. Can we use a middleware such as Dell Boomi to do this for us?
In short, I would call a single Dell Boomi API with all 1000 records, and Dell Boomi would break that into 1000 individual calls and send them to the other system.
Is this scenario even possible? Any suggestion in the right direction would be helpful.
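This is a common middleware fan-out pattern (in Boomi terms, roughly what a Data Process "Split Documents" step models). A minimal Python sketch of the idea, where `fan_out` and `send_one` are hypothetical names and `send_one` stands in for the non-bulkified target API:

```python
import json

def fan_out(parent, children, send_one):
    """Split one bulk payload into individual downstream calls.

    `send_one` stands in for the non-bulkified target API; the middleware
    receives one request carrying all children and issues one call each.
    """
    responses = []
    for child in children:
        payload = {"parent_id": parent["id"], "child": child}
        responses.append(send_one(json.dumps(payload)))
    return responses
```

The caller sees a single API while the middleware absorbs the per-record call volume (retries and throttling would live inside `send_one`).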
In Tosca, the default timeout I have set is 35 seconds. Because of that, the if condition takes the full 35 seconds to fail. Is there a way to specify a separate wait time for the condition alone?
Need to specify a separate wait time for a Tosca if condition.
When the Update button is clicked, the Toast message only shows the "maintenance needed" message, even when the conditions are met. I am not sure what the correct syntax is for getting the conditions to work.
Here is the syntax:
I am new to Salesforce CPQ, but I have done Salesforce Rest API integrations before.
The requirement is to send quote line item records from the QCP to an external API, retrieve certain data, and update the quote line fields in real time.
Will it be possible to do this without using an Apex REST API?
I previously designed it to pass data from the QCP to an Apex class, make the API callouts there, and pass the response back to the QCP to update the field. But this design may hit API limits in the future.
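Since the concern is callout volume, one common mitigation is batching many quote lines into a single request instead of one callout per line. The QCP itself runs JavaScript, but the batching idea can be sketched in Python (`batch_records` and the size of 100 are hypothetical):

```python
def batch_records(records, batch_size=100):
    """Group quote line records so that one external callout carries
    many lines, reducing the total number of API requests."""
    return [records[i:i + batch_size]
            for i in range(0, len(records), batch_size)]
```

With 250 quote lines and a batch size of 100, this yields three callouts instead of 250, assuming the external API can accept a list per request.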
I installed DocuSign and DocuSign for CPQ on my demo org, but I am not able to see any of the fields or related sections under the Quote Document object. I tried to proceed with the integration and finished the steps required on the DocuSign website, but when I send out a document I do not get a response back.
We have an on-premises Azure DevOps Server 2019 Update 1.2 (17.153.32407.5) that we want to migrate to Azure DevOps Services (cloud), but we can't find the data migration tools for 2019.
The tools posted here or
are only for DevOps 2020 Update 1 or higher.
I have some confusion about the terms SDLC and Software Process. With respect to these (and more or less similar) terms, I have a few questions.
What is the difference between SDLC and Software Process? (I understand SDLC is not just Waterfall.)
Can we map SDLC to the Unified Process?
About the activities: we do Analysis in the traditional waterfall model, but do we also do Analysis in a Unified Process (any unified process, Agile or Rational)?
Regarding the SAP Cloud Studio SDK: can anyone help us understand whether there are any open-source tools that can interact with the SAP Cloud Studio SDK, or any licensed tools, primarily to implement DevOps automation for the code quality or test automation aspect?
I want to use AWS Elasticsearch to store the logs of my application. Since there is a huge amount of data going into AWS Elasticsearch (~30 GB daily), I would only keep 3 days of data. Is there any way to schedule data removal from AWS Elasticsearch, or to do log rotation? What happens if the AWS Elasticsearch storage is full?
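Assuming logs land in daily indices, retention reduces to deleting indices older than the window. The selection logic can be sketched as follows; the `app-logs-YYYY.MM.DD` naming is an assumption, and the actual deletion would be one `DELETE /<index>` request per stale index, or a tool such as Elasticsearch Curator running on a schedule:

```python
from datetime import datetime, timedelta

def indices_to_delete(index_names, today, keep_days=3, prefix="app-logs-"):
    """Pick daily log indices (named like app-logs-YYYY.MM.DD, an assumed
    naming scheme) that fall outside the retention window."""
    cutoff = today - timedelta(days=keep_days)
    stale = []
    for name in index_names:
        if not name.startswith(prefix):
            continue  # ignore unrelated indices such as .kibana
        try:
            day = datetime.strptime(name[len(prefix):], "%Y.%m.%d").date()
        except ValueError:
            continue  # skip names that don't parse as a date
        if day < cutoff:
            stale.append(name)
    return stale
```

Note that if the cluster does fill up, Elasticsearch's disk watermarks start rejecting writes, so retention needs to run before that point.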
Thanks for the help
I am trying to set up all the projects from the Apache Hadoop stack in one cluster. What is the sequence for setting up the Apache Hadoop ecosystem frameworks, e.g. Hadoop, HBase, ...? And if you have tested a specific set of steps, can you tell me what kinds of problems can be faced during deployment? Main frameworks for deployment: Hadoop, HBase, Pig, Hive, HCatalog, Mahout, Giraph, ZooKeeper, Oozie, Avro, Sqoop, MRUnit, Crunch (please add if I missed something).
I am new to the CyberArk password vault, and I am trying to integrate Artifactory with CyberArk for some generic accounts whose passwords should be stored in the vault.
May I know the exact process to get this created? We have one safe, and I tried to create a safe account.
I'm aware that there are a lot of questions on SO with similar content, but I can assure you that I have read most of them completely, with answers and comments.
My situation is slightly different in means that our company is mostly Java-oriented and thus the standard ALM tool set is already set up:
SVN for Source Control
Jira for task/issue tracking
Jenkins for continuous integration
Now, I run a team of .NET developers and we need to set up something similar for our dev process and the discussion is whether to go separately with TFS or to reuse the existing infrastructure and plug in the .NET projects there as well.
I realize that the biggest pro-TFS argument is good integration with VS, but with our current setup I am wondering if there are some good arguments for not using TFS at all.
I am trying to display the age of each issue that is not yet closed on a JIRA dashboard, i.e. current date minus issue created date: the duration in days for every unresolved issue. Is this possible?
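If the issues are pulled via the JIRA REST API, the age computation itself is simple. A hedged Python sketch (`issue_age_days` is a hypothetical helper; JIRA's `created` timestamps may need normalizing to a form `datetime.fromisoformat` accepts, e.g. rewriting `+0000` as `+00:00`):

```python
from datetime import datetime, timezone

def issue_age_days(created_iso, now=None):
    """Whole days elapsed since an issue's `created` timestamp
    (ISO 8601 with a UTC offset)."""
    created = datetime.fromisoformat(created_iso)
    if now is None:
        now = datetime.now(timezone.utc)
    return (now - created).days
```

A dashboard gadget would then render this value per unresolved issue.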
I want to deploy a small data center for data analysis purposes. I will get the data mostly from web applications. I know I can set up a Hadoop cluster and scale it as necessary. I also know that OpenStack is a free and open-source software platform for cloud computing, mostly deployed as infrastructure-as-a-service (IaaS). However, it is apparent that some industries prefer Hadoop on top of OpenStack (Sahara). Thus, I want to know the differences, advantages, and disadvantages of Hadoop with and without OpenStack.
In brief, if I put Hadoop on top of OpenStack, what extra features do I get?
I am facing problems with Informatica HTTP transformation.
In the HTTP transformation, the integration service connects to the HTTP Server with a request and the servers response is recorded in the target table.
I was planning to build a web application that takes a user name and prints the password on the next page, with that password recorded in the target table. I had problems establishing connectivity with the database, and I am now simply looking for already-deployed websites that can make this possible.
Can you suggest any website? A very simple transformation will do; a simple website, or any help making a simple HTTP transformation possible, would be of great help.
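If no suitable public site turns up, a minimal local endpoint is easy to stand up for testing the transformation. A Python sketch, assuming a `username` query parameter and returning a placeholder body that the HTTP transformation would record (all names are hypothetical, and `run()` is deliberately not invoked here):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def build_reply(query_string):
    """Map ?username=... to the plain-text body that the HTTP
    transformation would capture into the target table."""
    params = parse_qs(query_string)
    user = params.get("username", ["unknown"])[0]
    return f"password-for-{user}"  # placeholder value, not a real lookup

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = build_reply(urlparse(self.path).query).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port=8080):
    # run(8080) would serve http://127.0.0.1:8080/?username=alice
    HTTPServer(("127.0.0.1", port), EchoHandler).serve_forever()
```

Point the HTTP transformation's URL at this endpoint and the response body lands in the mapped output port.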
Thanks and Regards
We use AWS CodePipeline to automate all the deployment processes for most of our services and DevOps tooling.
Every day we use more Lambda functions and microservices in our company, but it is getting hard to manage their deployments.
Does anyone have experience with CI/CD tools for managing a couple of dozen microservices in production and development?
I have never used CodePipeline for Lambda, so maybe it is an option I have not considered, and I am not closed to using other tools either.
Any recommendations? Thanks in advance!
All my infrastructure has been on AWS for a long time, and we are using AWS's DevOps services like CodeCommit and CodePipeline. Now there is a new requirement to push one of the web applications to the Azure cloud platform.
Is it really possible to connect from the AWS DevOps tools to Azure to create, deploy, and maintain updates on web applications? Basically, a CI/CD process that targets Azure but is driven from AWS services.
Thanks in Advance.
We have multiple servers on AWS, and our repo is on Azure DevOps. We want to deploy code from Azure CI/CD to the AWS servers. Any suggestions or help would be appreciated.
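One common approach is to keep the repo and pipeline in Azure DevOps and have a pipeline step call the AWS CLI (or the AWS Toolkit for Azure DevOps tasks), with the AWS credentials stored as secret pipeline variables. A hypothetical azure-pipelines.yml sketch, where the application name, deployment group, bucket, and build command are all assumptions:

```yaml
# Hypothetical sketch: build in Azure DevOps, deploy to AWS via CodeDeploy.
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - script: npm ci && npm run build   # build command is an assumption
    displayName: Build

  - script: |
      aws deploy create-deployment \
        --application-name MyApp \
        --deployment-group-name Prod \
        --s3-location bucket=my-deploy-bucket,key=app.zip,bundleType=zip
    displayName: Deploy via AWS CodeDeploy
    env:
      AWS_ACCESS_KEY_ID: $(AwsAccessKeyId)
      AWS_SECRET_ACCESS_KEY: $(AwsSecretAccessKey)
      AWS_DEFAULT_REGION: us-east-1
```

The CodeDeploy agent on the AWS servers then pulls and installs the bundle; the pipeline itself never needs inbound access to the instances.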