zortlord
This can be done using a simple tree traversal:
public static ArrayList<int[]> getAllTriangles(int[] arr){
Worker worker = new Worker(arr);
worker.execute();
return worker.result;
}
static class Worker{
ArrayList<int[]> result = new ArrayList<int[]>();
int[] arr;
Worker(int[] arr){
this.arr = arr;
}
void execute(){
recurse(0);
}
void recurse(int depth){
if(depth == 3){
int[] tri = Arrays.copyOfRange(arr, arr.length-depth, arr.length);
result.add(tri);
return;
}
//for each unused position in the array
int lastIndex = arr.length - 1 - depth;
int newDepth = depth + 1;
for(int i = 0; i < arr.length-depth; i++){
//select the position
int temp = arr[lastIndex];
arr[lastIndex] = arr[i];
arr[i] = temp;
//recurse for all other possibilities
recurse(newDepth);
//unselect the position in the array
arr[i] = arr[lastIndex];
arr[lastIndex] = temp;
}
}
}
Well, you're going to have to somehow check every person in the world, but you also know that ages are roughly limited to the range [0..130]. So, given a sorted array of ages, you could use repeated binary searches to find where each age group starts. This would give you answers in roughly O(130 * log n):
public static int[] getCountOfAges(int[] ages){
//first compute the indices at which each age group will start
int[] ageCounts = new int[131]; //ages run 0..130 inclusive
ageCounts[0] = 0;
for(int i = 1; i < ageCounts.length; i++){
ageCounts[i] = findFirstInstance(ageCounts[i-1], ages, i);
}
//convert the indices to actual values:
for(int i = 0; i < ageCounts.length - 1; i++){
ageCounts[i] = ageCounts[i+1] - ageCounts[i];
}
ageCounts[ageCounts.length-1] = ages.length - ageCounts[ageCounts.length-1];
return ageCounts;
}
private static int findFirstInstance(int loValueIndex , int[] arr, int value){
//find an index where arr[index] = value
int valueIndex = findIndex(loValueIndex, arr, value);
//find where arr[index -1] < arr[index]
while(valueIndex - loValueIndex > 1){
int mid = (valueIndex + loValueIndex)>>1;
if(arr[mid] == arr[valueIndex]){
valueIndex = mid;
}
else{
loValueIndex = mid;
}
}
//return index
return valueIndex;
}
private static int findIndex(int lo, int[] arr, int value){
int hi = arr.length -1;
while(lo < hi){
int mid = (lo + hi)>>1;
if(arr[mid] < value){
lo = mid + 1;
}
else if(arr[mid] > value){
hi = mid;
}
else{
return mid;
}
}
return arr[hi] == value ? hi : lo;
}
Using a map like Sathish does is probably best: O(n + m) time and O(n) memory (cleaned up a bit so it's easier to read):
public static int[] getCommonArray(int[] arr1, int[] arr2){
HashMap<Integer, Integer> instanceCountMap = new HashMap<Integer, Integer>();
for(int i : arr1){
Integer count = instanceCountMap.get(i);
if(count == null){
instanceCountMap.put(i, 1);
}
else{
instanceCountMap.put(i, count + 1);
}
}
ArrayList<Integer>results = new ArrayList<Integer>();
for(int i : arr2){
Integer count = instanceCountMap.get(i);
if(count != null){
results.add(i);
if(count > 1){
instanceCountMap.put(i, count -1);
}
else{
instanceCountMap.remove(i);
}
}
}
int[] resultArr = new int[results.size()];
for(int i = 0; i < resultArr.length; i++){
resultArr[i] = results.get(i);
}
return resultArr;
}
public int getShortestFrontModdedPalen(String str){
if(str == null){
throw new NullPointerException();
}
int definitelyAdd = 0;
int nonChanges = 0;
int endPos = str.length() -1;
int frontPos = 0;
while(endPos > frontPos){
if(str.charAt(endPos) == str.charAt(frontPos)){
nonChanges += 1;
endPos--;
frontPos++;
}
else{
frontPos = 0;
definitelyAdd += 1 + nonChanges;
nonChanges = 0;
endPos--;
}
}
return definitelyAdd + str.length();
}
O(mn) solution with O(mn) memory usage:
public static ArrayList<ArrayList<Point>> getMineClusters(boolean[][] mineField){
if(mineField == null || mineField.length == 0 || mineField[0].length == 0){
return null;
}
ArrayList<ArrayList<Point>> results = new ArrayList<ArrayList<Point>>();
boolean[][] visited = new boolean[mineField.length][mineField[0].length];
for(int i = 0; i < visited.length; i++){
for(int j = 0; j < visited[i].length; j++){
checkMineField(mineField, visited, results, i, j, null);
}
}
return results;
}
private static void checkMineField(boolean[][] mineField, boolean[][] visited, ArrayList<ArrayList<Point>> results, int i, int j, ArrayList<Point> currentCollection){
if(0 > i || i >= mineField.length || 0 > j || j >= mineField[0].length){
return;
}
if(!visited[i][j]){
visited[i][j] = true;
if(mineField[i][j]){
if(currentCollection == null){
currentCollection = new ArrayList<Point>();
results.add(currentCollection);
}
currentCollection.add(new Point(i, j));
checkMineField(mineField, visited, results, i-1, j-1, currentCollection);
checkMineField(mineField, visited, results, i-1, j, currentCollection);
checkMineField(mineField, visited, results, i-1, j+1, currentCollection);
checkMineField(mineField, visited, results, i, j-1, currentCollection);
checkMineField(mineField, visited, results, i, j+1, currentCollection);
checkMineField(mineField, visited, results, i+1, j-1, currentCollection);
checkMineField(mineField, visited, results, i+1, j, currentCollection);
checkMineField(mineField, visited, results, i+1, j+1, currentCollection);
}
}
}
Not sure why a divide and conquer approach would be requested- the problem can be solved easily enough in a linear fashion with O(n) complexity and O(1) memory:
public static int countConsecutive0s(int[] arr){
int counter = 0;
for(int i = 1; i < arr.length; i++){
if(arr[i-1] == 0 && arr[i] == 0){
counter++;
}
}
return counter;
}
However, if you had to do a divide and conquer approach specifically, as if you were doing some GPU work, then you could do it like the following (pseudocoded from the perspective of each core):
linearly count consecutive 0s in the assigned section
if(not the first section in the array){
if(the section starts with 0 and the previous section ends with 0){
add 1 to the count
}
}
save the counts to a working buffer
use logarithmic compression to sum all the counts from the different cores
Runtime complexity will be O(n / c + log c), where c is the number of cores, and memory is O(c)
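The per-core scheme above can be sketched sequentially in Java. Each loop iteration plays the role of one core, and the final summation loop stands in for the logarithmic reduction; the section count parameter is an arbitrary stand-in for the number of cores.

```java
class SectionedZeroCount {
    static int countConsecutive0s(int[] arr, int sections) {
        int[] counts = new int[sections];
        int sectionLen = (arr.length + sections - 1) / sections; // ceiling division
        for (int s = 0; s < sections; s++) {
            int start = s * sectionLen;
            int end = Math.min(start + sectionLen, arr.length);
            // count zero pairs strictly inside this section
            for (int i = start + 1; i < end; i++) {
                if (arr[i - 1] == 0 && arr[i] == 0) {
                    counts[s]++;
                }
            }
            // count the boundary pair spanning this section and the previous one
            if (s > 0 && start < arr.length && arr[start] == 0 && arr[start - 1] == 0) {
                counts[s]++;
            }
        }
        // a real GPU version would reduce these logarithmically; a loop suffices here
        int total = 0;
        for (int c : counts) {
            total += c;
        }
        return total;
    }
}
```

The per-section counts are independent, so each inner loop really could run on its own core with only the boundary check needing a neighbor's data.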
I think that a recursive approach may be simplest here (alternatively, two stacks could be used, storing the running value (not the sum) and the position in the tree, to avoid recursion). However, in Java, primitive values can't easily be changed through a reference, so use a 1-element array or some other wrapper object to store the sum. Complexity is O(n), and memory is O(log n) for a balanced tree or O(n) if the tree is unbalanced, since it's recursive.
class SumTreeNode{
private int value;
private SumTreeNode left, right;
public static int pathSums(SumTreeNode root){
//will use this to store the result
int[] result = new int[1];
if(root != null){
pathSumsRecur(result, root, 0);
}
return result[0];
}
public static void pathSumsRecur(int[] result, SumTreeNode node, int pathSum){
pathSum *= 10;
pathSum += node.value;
boolean isLeaf = true;
if(node.left != null){
isLeaf = false;
pathSumsRecur(result, node.left, pathSum);
}
if(node.right != null){
isLeaf = false;
pathSumsRecur(result, node.right, pathSum);
}
if(isLeaf){
result[0] += pathSum;
}
}
}
Make a list to store the digits already seen; scanning right to left, they stay in ascending order.
Iterate through the digits in the number, starting at the rightmost position (least significant):
if there is a digit to the left and it is smaller than the digit just scanned
swap it with the smallest stored digit that is larger than it
append the remaining stored digits in ascending order
return the new number
otherwise add the digit to the list
if the digits never decrease going leftward, return -1
Complexity is O(n) where n is the number of digits, and memory is O(n). It could be made O(1) in memory if the processing were done directly on a char array, but I thought the String/char conversions and array handling would be slower.
public static int nextLarger(int val){
if(val < 12){
return -1;
}
//digits already scanned (the suffix of the number), kept in ascending order
LinkedList<Integer> suffix = new LinkedList<Integer>();
int prev = val % 10;
suffix.add(prev);
val /= 10;
while(val > 0){
int nextVal = val % 10;
val /= 10;
//compare with the digit immediately to the right, not the head of the list
if(nextVal < prev){
//swap with the smallest suffix digit that is larger than nextVal
int idx = 0;
while(suffix.get(idx) <= nextVal){
idx++;
}
val = val * 10 + suffix.remove(idx);
//the displaced digit rejoins the suffix in its ascending position
suffix.add(idx, nextVal);
//append the remaining digits in ascending order for the smallest suffix
for(int d : suffix){
val = val * 10 + d;
}
return val;
}
//suffix digits are non-decreasing right-to-left, so a tail insert keeps order
suffix.add(nextVal);
prev = nextVal;
}
return -1;
}
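As a cross-check, the standard next-permutation technique on a digit array gives the same results. This is a sketch (class and method names are just for illustration) that ignores int overflow on the result:

```java
class NextLargerCheck {
    // returns the next larger number with the same digits, or -1 if none exists
    static int nextLarger(int val) {
        char[] d = Integer.toString(val).toCharArray();
        // find the rightmost digit that is smaller than the digit after it
        int i = d.length - 2;
        while (i >= 0 && d[i] >= d[i + 1]) i--;
        if (i < 0) return -1; // digits are non-increasing: no larger permutation
        // find the smallest digit to the right that is larger than d[i]
        int j = d.length - 1;
        while (d[j] <= d[i]) j--;
        char tmp = d[i]; d[i] = d[j]; d[j] = tmp;
        // reverse the suffix so it becomes as small as possible
        for (int lo = i + 1, hi = d.length - 1; lo < hi; lo++, hi--) {
            tmp = d[lo]; d[lo] = d[hi]; d[hi] = tmp;
        }
        return Integer.parseInt(new String(d));
    }
}
```

For example, 1342 should yield 1423 (not 2143, which a swap with the smallest suffix digit would produce).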
if the array has at least 2 elements and arr_0 <= arr_1
add arr_0 to the results
for each arr_n (where n >= 1 && n < arr length - 1)
if arr_n-1 > arr_n && arr_n <= arr_n+1
add arr_n to the results
if arr_length-1 < arr_length-2
add arr_length-1 to the results
Complexity is O(n) and memory use is O(n) (for storing the results)
public static ArrayList<Integer> findMins(int[] arr){
if(arr == null){
throw new NullPointerException("\"arr\" may not be null");
}
ArrayList<Integer> results = new ArrayList<Integer>();
if(arr.length < 2){
return results;
}
//set to true to handle arr_0
boolean lastDown = true;
for(int i =0; i < arr.length-1; i++){
//count the first position of a plateau
boolean thisUpOrEqual = arr[i] <= arr[i+1];
if(lastDown && thisUpOrEqual){
results.add(arr[i]);
}
// don't count subsequent plateau positions
lastDown = arr[i] > arr[i+1];
}
//check the last case. cannot be a plateau
if(arr[arr.length-1] < arr[arr.length-2]){
results.add(arr[arr.length-1]);
}
return results;
}
I don't think all of these answers consider the case that if you succeed, you won't continue making shots.
For example, in challenge 1, you could make both shots (p * p), fail the first shot ((1-p) * p * p), or fail the second shot (p * (1-p) * p). This sums to
2 * (1-p) * p^2 + p^2 (which simplifies to p^2 * (3-2p) )
And for the second challenge
p^4 (make all shots)
4 * (1-p) * p^4 (fail one shot)
10 * (1-p)^2 * p^4 (fail 2 shots)
which simplifies to
p^4 * ( 10 p^2 -24 p + 15 )
Solving these equations, you find that if the shot percentage is less than about 78.5%, Challenge 1 should be picked; Challenge 2 should be picked if the shot percentage is better than 78.5%.
Why hasn't anyone considered the situations where a player wouldn't continue? For example, for 2 out of 3 shots, if the percentage chance to succeed is 'p', then the chance to succeed is equal to:
(1-p) * p * p + p * (1-p) * p + p * p.
This simplifies to
2*(1-p) * p * p + p * p (which is basically the probability of only 1 miss plus the probability of no misses)
Here's a short table:
p | succeed
25% | 15.6%
50% | 50.0%
75% | 84.4%
for Challenge 2, the chances can be computed similarly:
10 * (1-p)^2 * p^4 + 4 * (1-p) * p^4 + p^4 = p^4 ( (1-p) * ( 10 * (1-p) + 4 ) + 1 )
Here's a table:
p | Win
25% | 3.8%
50% | 34.4%
75% | 83.1%
There is, however, a point at which it's better to choose challenge 2. That's when p gets to about 78.5%.
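Both sets of figures, and the crossover point, are easy to confirm numerically. A quick sketch (the 0.001 grid step is an arbitrary choice):

```java
class ChallengeCrossover {
    // challenge 1: p^2 * (3 - 2p), from the three success sequences above
    static double challenge1(double p) {
        return p * p * (3 - 2 * p);
    }
    // challenge 2: p^4 * (10p^2 - 24p + 15), from the simplification above
    static double challenge2(double p) {
        return Math.pow(p, 4) * (10 * p * p - 24 * p + 15);
    }
    // smallest grid point at which challenge 2 becomes the better pick
    static double crossover() {
        for (double p = 0.0; p <= 1.0; p += 0.001) {
            if (challenge2(p) > challenge1(p)) {
                return p;
            }
        }
        return 1.0;
    }
}
```

Scanning finds the crossover near p of roughly 0.785, matching the figure above.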
public static boolean hasRedundantBraces(String str){
// this will store the position in the str for each of the open braces
Stack<Integer> positions = new Stack<Integer>();
// for each character
for(int i = 0; i < str.length(); i++){
char c = str.charAt(i);
//if it's an open brace, push it
if(c == '{'){
positions.push(i);
}
// if it's a close brace, check for redundancies
else if(c == '}'){
int lastPos = positions.pop();
//if there were other open braces and there are more chars
if(!positions.empty() && i < str.length() -1){
//if the previous open is right before this one and the next char is a close
if(positions.peek() == lastPos -1 && str.charAt(i+1) == '}'){
return true;
}
}
}
}
return false;
}
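For concreteness: a brace pair is flagged as redundant when it immediately wraps another pair. This standalone sketch mirrors the logic above so a few cases can be exercised:

```java
import java.util.Stack;

class RedundantBraces {
    static boolean hasRedundantBraces(String str) {
        Stack<Integer> positions = new Stack<Integer>(); // open-brace positions
        for (int i = 0; i < str.length(); i++) {
            char c = str.charAt(i);
            if (c == '{') {
                positions.push(i);
            } else if (c == '}') {
                int lastPos = positions.pop();
                // redundant if the enclosing '{' directly precedes this pair's '{'
                // and the enclosing '}' directly follows this pair's '}'
                if (!positions.empty() && i < str.length() - 1
                        && positions.peek() == lastPos - 1
                        && str.charAt(i + 1) == '}') {
                    return true;
                }
            }
        }
        return false;
    }
}
```

So "{{x}}" is redundant, while "{x}{y}" and "{x{y}}" are not.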
How about a very simple implementation using the median, with O(n) complexity and O(1) memory:
public static int countMoves(String str){
//find the median or median-like position
int leftIndex = -1;
int rightIndex = str.length();
while(leftIndex < rightIndex){
leftIndex = findNext(1, str, leftIndex, 'x');
rightIndex = findNext(-1, str, rightIndex, 'x');
}
int total = 0;
//now move positions back towards the edges
//start with the left
//find the first open position
int open = findNext(-1, str, leftIndex, '.');
//find the first seated person that needs to be moved
leftIndex = findNext(-1, str, open, 'x');
//now compute the move costs
while(leftIndex > -1){
total += open - leftIndex;
open--;
leftIndex = findNext(-1, str, leftIndex, 'x');
}
//now do the right
open = findNext(1, str, rightIndex, '.');
//find the first seated person that needs to be moved
rightIndex = findNext(1, str, open, 'x');
//now compute the move costs
while(rightIndex < str.length()){
total += rightIndex - open;
open++;
rightIndex = findNext(1, str, rightIndex, 'x');
}
return total;
}
private static int findNext(int interval, String str, int currPos, char c){
int index = currPos + interval;
while(index < str.length() && index > -1){
char checkC = str.charAt(index);
if(checkC == c){
return index;
}
index+= interval;
}
return index;
}
With a little more consideration, the entire approach, which is easy to read now, can be improved to run quicker:
public static void bullsAndCows(String p1, String p2){
if(p1 == null){
throw new NullPointerException();
}
if(p2 == null){
throw new NullPointerException();
}
int bulls = 0;
int index = 0;
int[] p1Counts = new int[256];
int[] p2Counts = new int[256];
while(index < p1.length() && index < p2.length()){
char p1C = p1.charAt(index);
char p2C = p2.charAt(index);
if(p1C == p2C){
bulls++;
}
p1Counts[p1C]++;
p2Counts[p2C]++;
index++;
}
for(; index < p1.length(); index++){
p1Counts[p1.charAt(index)]++;
}
for(; index < p2.length(); index++){
p2Counts[p2.charAt(index)]++;
}
int cows = 0;
for(int i = 0; i < p1Counts.length; i++){
cows += Math.min(p1Counts[i], p2Counts[i]);
}
cows -= bulls;
System.out.println("Bulls - "+bulls+", Cows - "+cows);
}
This can be easily computed in O(n + m) (where n is the length of p1 and m is the length of p2) by counting the bulls, then computing the number of per-character matches minus the number of bulls to get the cows. This assumes the chars fit in 8 bits (hence the 256-entry count arrays).
public static void bullsAndCows(String p1, String p2){
//handle the easy cases:
if(p1 == null){
throw new NullPointerException("\"p1\" may not be null");
}
if(p2 == null){
throw new NullPointerException("\"p2\" may not be null");
}
//get the bulls
int bulls = getNumExactCharMatches(p1, p2);
//get the cows
int cows = getNumSimilarChars(p1, p2) - bulls;
//output
System.out.println("Bulls - "+bulls+", Cows - "+cows);
}
private static int getNumExactCharMatches(String p1, String p2){
int count = 0;
int index = 0;
while(index < p1.length() && index < p2.length()){
if(p1.charAt(index) == p2.charAt(index)){
count++;
}
index++;
}
return count;
}
private static int getNumSimilarChars(String p1, String p2){
int[] p1Counts = getCharCounts(p1);
int[] p2Counts = getCharCounts(p2);
int count = 0;
for(int i = 0; i < 256; i++){
count+= Math.min(p1Counts[i], p2Counts[i]);
}
return count;
}
private static int[] getCharCounts(String str){
int[] counts = new int[256];
char[] chars = str.toCharArray();
for(int i = 0; i < chars.length; i++){
counts[chars[i]]++;
}
return counts;
}
Defining a number as the following structure:
[optional '-'][any number of #s][optional '.' [any number of #s] ]
public static boolean isNumber(String str){
//handle the easy cases
if(str == null){
throw new NullPointerException("\"str\" may not be null");
}
if(str.isEmpty()){
return false;
}
char[] chars = str.toCharArray();
int charIndex = 0;
//allow optional '-'
if(chars[charIndex] == '-'){
charIndex++;
}
//handle unlimited amount of numbers and single '.'
boolean noDecimalPt = true;
while(charIndex < chars.length){
char c = chars[charIndex];
//if the char is not a digit
if(!isNumChar(c)){
//it may be a '.' one time
if(c == '.' && noDecimalPt){
noDecimalPt = false;
}
//otherwise this is a bad character
else{
return false;
}
}
charIndex++;
}
return true;
}
public static boolean isNumChar(char c){
if(c < '0' || c > '9'){
return false;
}
return true;
}
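Equivalently, the grammar above can be expressed as a single regular expression. Note that, like the loop version, this sketch accepts edge cases such as "-" or "." because the grammar allows zero digits (the class name is just for illustration):

```java
class NumberCheck {
    // optional '-', any number of digits, then an optional '.' plus any number of digits
    static boolean isNumber(String s) {
        return s != null && !s.isEmpty() && s.matches("-?[0-9]*(\\.[0-9]*)?");
    }
}
```

Tighten the quantifiers (e.g. require at least one digit) if bare "-" or "." should be rejected.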
This can be accomplished in O(n log n), where n is the number of pairs (the sort dominates):
public static ArrayList<int[]> getOverlaps(ArrayList<int[]> ranges){
//handle easy cases
if(ranges == null){
throw new NullPointerException();
}
if(ranges.size() < 2){
return ranges;
}
//sort the ranges on the first index
Collections.sort(ranges, new Comparator<int[]>(){
public int compare(int[] o1, int[] o2){
return o1[0] - o2[0];
}
});
//setup global tracking variables
ArrayList<int[]> results = new ArrayList<int[]>();
//get the start and end of the first output range
int[] trackedRange = new int[]{ ranges.get(0)[0], ranges.get(0)[1]};
//for each range
for(int i = 1; i < ranges.size(); i++){
int[] range = ranges.get(i);
//if the new range does not overlap with the old one,
//then this is a new range and the old one should be put in the results
if(range[0] > trackedRange[1]){
results.add(trackedRange);
trackedRange = new int[]{range[0], range[1]};
}
//otherwise, the endpoint of the range could be extended by the new range
else if(range[1] > trackedRange[1]){
trackedRange[1] = range[1];
}
}
//there will be an additional range that is not captured by the for loop
results.add(trackedRange);
return results;
}
A BFS is probably not the fastest for this case. Use A* to get the paths.
Also, your complexity for the TSP is very wrong. Unless I'm mistaken, TSP is a factorial computation, so finding the optimal ordering of checkpoints is going to be quite difficult to compute (O(18!), which is 6,402,373,705,728,000 different paths). You're better off using some stochastic algorithm like an evolutionary algorithm or an ACO to get the checkpoint ordering.
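For scale, the 18! figure is easy to verify. A minimal sketch using BigInteger (18! actually still fits in a long, but BigInteger avoids any overflow concern for larger checkpoint counts):

```java
import java.math.BigInteger;

class FactorialCheck {
    // n! computed iteratively
    static BigInteger factorial(int n) {
        BigInteger result = BigInteger.ONE;
        for (int i = 2; i <= n; i++) {
            result = result.multiply(BigInteger.valueOf(i));
        }
        return result;
    }
}
```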
How about a dynamic programming method? Make the observation that any total in the range d..d*s must be reachable as a sum of smaller totals. I.e.:
f(m, d) = sum{i = 1 .. s} f(m - i, d - 1) (treating out-of-range values as 0), with base case
f(m, 1) = 1 for 1 <= m <= s
Complexity would be O(d * (d * s - (d - 1)) * s), which is roughly O(d^2 * s^2), with memory cost of O(d * s - (d - 1)), which is roughly O(d * s), where d is the number of dice and s is the number of sides.
public static void computeTotalSums(int numSides, int numDice){
int[] sums = new int[numSides];
for(int i = 0; i < sums.length; i++){
sums[i] = 1;
}
int offset = 0;
if(numDice > 1){
for(int dice = 2; dice <= numDice; dice++){
offset = dice -1;
int[] nextSums = new int[dice * numSides - offset];
for(int i = 0; i < nextSums.length; i++){
int sum = 0;
for(int side = 0; side < numSides; side++){
int sumsIndex = i - side;
if(sumsIndex > -1 && sumsIndex < sums.length){
sum += sums[sumsIndex];
}
}
nextSums[i] = sum;
}
sums = nextSums;
}
}
for(int i = 0; i < sums.length; i++){
System.out.println(""+(offset+i+1)+": "+sums[i]);
}
}
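As a sanity check, here is a variant of the same recurrence that returns the counts instead of printing them (class and method names are illustrative):

```java
class DiceSums {
    // counts[i] = number of ways to roll a total of (numDice + i)
    // with numDice dice of numSides sides each
    static int[] computeTotalSums(int numSides, int numDice) {
        int[] sums = new int[numSides];
        java.util.Arrays.fill(sums, 1); // one die: each total 1..numSides has 1 way
        for (int dice = 2; dice <= numDice; dice++) {
            int[] nextSums = new int[dice * numSides - (dice - 1)];
            for (int i = 0; i < nextSums.length; i++) {
                // add the ways to reach this total from each face of the new die
                for (int side = 0; side < numSides; side++) {
                    int idx = i - side;
                    if (idx > -1 && idx < sums.length) {
                        nextSums[i] += sums[idx];
                    }
                }
            }
            sums = nextSums;
        }
        return sums;
    }
}
```

Two six-sided dice give the familiar 1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1 ways for totals 2 through 12.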
Actually, you should ignore this code. Without a clear definition of what a triangle triplet is, I assumed it was only a subset with 3 elements. The above implementation is simply a really fast back-tracking algorithm that will run in roughly O(n^3).
- zortlord November 26, 2014