1. Optimize the database reads and cut the read time down from 7 seconds.
2. Do lazy reading, whereby you read only the minimum amount of data needed and so reduce the number of database reads.
3. Denormalize your data so that you don't have to do multiple joins.
4. Build a cache and load the data eagerly.
5. Use an AJAX-like mechanism to load what is not critical in the background.
6. Use Materialized Views
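Point 4 can be sketched as a small read-through cache that is populated eagerly at startup. This is a minimal illustration, not from the original answer: ProductCache, LoadAll, and the (id, name) rows are all hypothetical names.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of an eagerly loaded cache (point 4). ProductCache,
// LoadAll, and the (Id, Name) rows are hypothetical, for illustration only.
class ProductCache
{
    private readonly Dictionary<int, string> _cache = new Dictionary<int, string>();

    // Eager load: pull everything from the datastore once, up front,
    // so subsequent reads never touch the database.
    public void LoadAll(IEnumerable<(int Id, string Name)> rows)
    {
        foreach (var row in rows)
            _cache[row.Id] = row.Name;
    }

    // Reads are served from memory; a miss would fall back to the database.
    public bool TryGet(int id, out string name) => _cache.TryGetValue(id, out name);
}
```

A real cache would also need an invalidation or refresh policy when the underlying data changes; that is omitted here.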
I assume this question is about increasing the number of application servers. If so, it depends on what kind of processing the app servers are doing. If they connect to some backend datastore, then it comes down to how the backend database scales as the number of app servers grows. Will it be able to handle more open database connections?
- Vs June 19, 2017

Localize the problem to the process or service causing the issue. Disable all other services and see what is causing the slowness of the server. Use perfmon to look at CPU utilization, RAM, and disk I/O, and then localize the issue.
Check whether there is any heavy I/O happening that is keeping the CPU idle.
The purpose of this question is unclear. What are they trying to judge with it?
There could be multiple reasons why the log is filling up: a verbose logging level, an exception thrown in a tight loop, or a background operation that fails continuously and logs each failure.
The problem statement says that Read reads a set of numbers, adding each odd number to the odd list and each even number to the even list. That essentially means a sorted insert for each number. The solutions above try to remove even numbers from the odd list and vice versa, which is not correct.
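The sorted-insert reading described above could look like the following sketch. List&lt;int&gt; and the OddEvenReader name are stand-ins for whatever types the original problem used.

```csharp
using System;
using System.Collections.Generic;

// Sketch of the sorted-insert reading described above: each number goes
// into the odd or even list at its sorted position. OddEvenReader and
// List<int> are illustrative stand-ins for the original problem's types.
static class OddEvenReader
{
    public static void Read(IEnumerable<int> numbers, List<int> odds, List<int> evens)
    {
        foreach (int n in numbers)
        {
            var target = (n % 2 != 0) ? odds : evens;
            // BinarySearch returns the bitwise complement of the
            // insertion point when the value is not found.
            int pos = target.BinarySearch(n);
            if (pos < 0) pos = ~pos;
            target.Insert(pos, n);
        }
    }
}
```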
- Vs February 10, 2016

There is a minor mistake in the above solution. Suppose you have a tree with root 75, left child 72, and right child 78, and the value of X is 73. The above solution would return 75 instead of 72. The comparison should be between absolute differences: Math.Abs(x - valReturn) > Math.Abs(x - rootval).
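A minimal sketch of the corrected search, with the absolute-difference comparison. Node and ClosestSearch are hypothetical names; the original solution's types may differ.

```csharp
using System;

// Sketch of the absolute-difference fix described above. Node and
// ClosestSearch are hypothetical names for illustration.
class Node
{
    public int Val;
    public Node Left, Right;
    public Node(int val) { Val = val; }
}

static class ClosestSearch
{
    // Returns the value in the BST closest to x, comparing candidates by
    // Math.Abs(x - candidate) rather than by raw node values.
    public static int Closest(Node root, int x)
    {
        int best = root.Val;
        for (Node cur = root; cur != null; cur = x < cur.Val ? cur.Left : cur.Right)
        {
            if (Math.Abs(x - cur.Val) < Math.Abs(x - best))
                best = cur.Val;
        }
        return best;
    }
}
```

On the example above (root 75, children 72 and 78, x = 73) this returns 72, as expected.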
- Vs August 14, 2015

Here is a simple C# function to do this:
// Prints every "i18n"-style abbreviation of the input: a prefix, the
// count of the characters skipped, and the last character.
private static void Printi8n(string input)
{
    if (string.IsNullOrEmpty(input))
        return;

    int length = input.Length;
    for (int i = 1; i < length; i++)
    {
        // Number of characters between the prefix and the last character.
        int substrLen = length - i - 1;
        string result = input.Substring(0, i)
            + (substrLen > 0 ? substrLen.ToString() : "")
            + input[length - 1];
        Console.WriteLine(result);
    }
}
This is a variation of the merge-two-sorted-linked-lists problem. Traverse the linked list until you reach the point where the next node's value is smaller than the current node's value; that boundary marks the start of the second sorted list. Then merge the two sorted lists.
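The approach above can be sketched as follows, using List&lt;int&gt; instead of an actual linked-list type for brevity: find the point where the values stop increasing, treat it as the split, and merge the two sorted runs.

```csharp
using System;
using System.Collections.Generic;

// Sketch of the split-then-merge approach described above. List<int>
// stands in for the linked list to keep the illustration short.
static class RotatedMerge
{
    public static List<int> SortRotated(List<int> list)
    {
        // Find the boundary where the next value is smaller than the current one.
        int split = list.Count;
        for (int i = 0; i + 1 < list.Count; i++)
        {
            if (list[i + 1] < list[i]) { split = i + 1; break; }
        }

        // Standard merge of the two sorted runs.
        var merged = new List<int>();
        int a = 0, b = split;
        while (a < split || b < list.Count)
        {
            if (b >= list.Count || (a < split && list[a] <= list[b]))
                merged.Add(list[a++]);
            else
                merged.Add(list[b++]);
        }
        return merged;
    }
}
```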
- Vs September 28, 2013

Construct a Bloom filter from the large array (use a bit vector to store it). Then look up each element of the smaller array in the Bloom filter by hashing its value. The problem with this approach, though, is that a Bloom filter can return false positives unless you choose multiple hash functions and a large enough bit vector.
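A minimal Bloom filter sketch of the idea above. The bit-vector size and the two hash functions are illustrative choices, not tuned for a particular false-positive rate.

```csharp
using System;
using System.Collections;

// Minimal Bloom filter sketch. The size and the two multiplicative
// hash functions are illustrative assumptions, not tuned values.
class BloomFilter
{
    private readonly BitArray _bits;
    private readonly int _size;

    public BloomFilter(int size)
    {
        _size = size;
        _bits = new BitArray(size);
    }

    // Two cheap, roughly independent hash functions over an int key.
    private int Hash1(int key) => (int)(((uint)key * 2654435761u) % (uint)_size);
    private int Hash2(int key) => (int)(((uint)key * 40503u + 17) % (uint)_size);

    public void Add(int key)
    {
        _bits[Hash1(key)] = true;
        _bits[Hash2(key)] = true;
    }

    // False positives are possible; false negatives are not.
    public bool MightContain(int key) => _bits[Hash1(key)] && _bits[Hash2(key)];
}
```

In the scenario above, every element of the large array would be added first, and each element of the small array would then be checked with MightContain.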
- Vs September 28, 2013

The easy way to solve this kind of problem is MapReduce. Since MapReduce is not allowed, the alternative is to sort the string (URL) ==> int (visits) mapping on each machine independently, in increasing order of visits. Each server then sends its top 10 visited URL-to-visits mappings to a single server.
This single server receives the top-10 data from all the others and merges it to decide on the overall top 10 sites (an n-way merge of URL vs. visits).
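The final merge step could look like this sketch, which assumes each URL's visits are counted on a single server (no URL appears in two per-server lists). A LINQ sort stands in for an explicit n-way merge for brevity; TopUrls and MergeTop10 are hypothetical names.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch of the final merge: combine each server's top-10 list of
// (url, visits) pairs and take the overall top 10. Assumes each URL's
// visits live on one server; names here are illustrative.
static class TopUrls
{
    public static List<(string Url, long Visits)> MergeTop10(
        IEnumerable<List<(string Url, long Visits)>> perServerTops)
    {
        return perServerTops
            .SelectMany(list => list)          // flatten all servers' candidates
            .OrderByDescending(p => p.Visits)  // stand-in for an n-way merge
            .Take(10)
            .ToList();
    }
}
```

Note that if the same URL's visits were split across servers, per-server top-10 lists would not be sufficient on their own; the counts would first have to be aggregated per URL.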
This is a good solution. One optimization: instead of scanning all the values, we can do a binary search, since the values are kept sorted by time.
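The binary-search optimization above, sketched over an array of timestamps kept in ascending order. The long[] of epoch-style timestamps and the TimeSearch name are illustrative assumptions.

```csharp
using System;

// Sketch of the binary-search optimization: find a target timestamp in
// an ascending array in O(log n). TimeSearch and long[] timestamps are
// illustrative stand-ins for the original data structure.
static class TimeSearch
{
    public static int Find(long[] sortedTimes, long target)
    {
        int lo = 0, hi = sortedTimes.Length - 1;
        while (lo <= hi)
        {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (sortedTimes[mid] == target) return mid;
            if (sortedTimes[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1; // not present
    }
}
```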
- Vs July 10, 2018