roshenw

Using bit shift to find the powers
public static void bitShiftForSumOfPowers(int j){
    int x = 1;
    int y = 1; // start with 2^0, so j == 0 correctly gives a sum of 1
    for (int i = 0; i < j; i++) {
        // bit shift: x doubles each iteration, so x == 2^(i+1)
        x = x << 1;
        y += x;
        System.out.println("x = " + x);
    }
    System.out.println("The sum is " + y);
}
An implementation without bit shift:
public int bruteForceIt(int x){
    int j = 1;
    int y = 1; // start with 2^0, so x == 0 correctly returns 1
    for (int i = 1; i <= x; i++) {
        j = j * 2; // j == 2^i
        y += j;
        System.out.println(j);
    }
    System.out.println("The Sum of Powers is " + y);
    return y; // return the sum, not the last power
}
If we know the range of the numbers, we can use an array of that length.
Every time we encounter a number, we increment the count at that number's index.
After all the numbers have been added, reading the array in index order gives them back sorted. In this case we do not have to do the bookkeeping needed to maintain the heap condition,
and we only sort once.
However, if we want the top K numbers at any given point during the process, a heap is the best option.
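A minimal sketch of the counting idea above, assuming the values fall in a known range (here 0..maxValue); the names and the sample data are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class CountingTopK {
    // Returns the top k values (descending) from data, assuming all
    // values lie in [0, maxValue]. One counting pass, one ordered read-off.
    public static List<Integer> topK(int[] data, int maxValue, int k) {
        int[] counts = new int[maxValue + 1];
        for (int v : data) counts[v]++;          // increment the index for each value
        List<Integer> result = new ArrayList<>();
        for (int v = maxValue; v >= 0 && result.size() < k; v--) {
            for (int c = 0; c < counts[v] && result.size() < k; c++) {
                result.add(v);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        int[] data = {5, 17, 3, 17, 99, 42, 5};
        System.out.println(topK(data, 99, 3)); // prints [99, 42, 17]
    }
}
```

Note the trade-off: this costs O(maxValue) space, which is only reasonable when the range is small relative to the number of elements.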
From the limited information that's given, and given that you are only looking for the first URL that is unique, the best way I can think of is to first sort the file.
Once you have a sorted file, you just scan it linearly to find the first entry that has no duplicate next to it in the list.
before:
google.com
apple.com
careercup.com
bloomberg.com
google.com
apple.com
bloomberg.com
after:
apple.com
apple.com
bloomberg.com
bloomberg.com
careercup.com
google.com
google.com
You can easily pick out the unique one.
The initial cost of sorting is a major overhead for a file that size, for which you can use some of the techniques discussed in the comments above. Merge sort or quick sort should be fine.
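A minimal sketch of the scan step, assuming the sorted entries fit in memory as an array (for a file of that size you would stream the sorted runs instead):

```java
public class FirstUnique {
    // Given a sorted array, return the first entry that differs from
    // both of its neighbors, i.e. the first URL with no duplicate.
    public static String firstUnique(String[] sorted) {
        for (int i = 0; i < sorted.length; i++) {
            boolean sameAsPrev = i > 0 && sorted[i].equals(sorted[i - 1]);
            boolean sameAsNext = i < sorted.length - 1 && sorted[i].equals(sorted[i + 1]);
            if (!sameAsPrev && !sameAsNext) return sorted[i];
        }
        return null; // every entry has a duplicate
    }

    public static void main(String[] args) {
        String[] sorted = {
            "apple.com", "apple.com", "bloomberg.com", "bloomberg.com",
            "careercup.com", "google.com", "google.com"
        };
        System.out.println(firstUnique(sorted)); // prints careercup.com
    }
}
```

Keep in mind that "first" here means first in sorted order, not first in the original file order, since sorting discards the original positions.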
Note:
Radix sort or bucket sort can save a ton of computation if you can convert the URLs to IP addresses.
Can someone please point out if there are any logical errors in this algorithm? Thank you!
public static boolean DFS(Node A, Node root){
    if (root == null) return false;
    if (root.data == A.data) return true;
    root.visited = true;
    for (Node x : root.adjecent()) {
        if (!x.visited) {
            // propagate the recursive result so a hit in a subtree is not lost
            if (DFS(A, x)) return true;
        }
    }
    return false;
}
This implementation works; however, it has a time complexity of O(n^2).
public static int[] addExcept(int[] test){
    int[] array = new int[test.length];
    for (int i = 0; i < test.length; i++) {
        // sum every element except the one at index i
        int total = 0;
        for (int k = 0; k < test.length; k++) {
            if (k != i) {
                total += test[k];
            }
        }
        array[i] = total;
    }
    return array;
}
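The quadratic pass can be avoided: sum the array once, then each result is the total minus the element at that index. A sketch of that O(n) version (the class name is illustrative):

```java
public class AddExceptLinear {
    // O(n): compute the full sum once, then subtract each element from it.
    public static int[] addExcept(int[] test) {
        int total = 0;
        for (int v : test) total += v;
        int[] result = new int[test.length];
        for (int i = 0; i < test.length; i++) {
            result[i] = total - test[i];
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(addExcept(new int[]{1, 2, 3, 4})));
        // prints [9, 8, 7, 6]
    }
}
```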
public static void findSmallestAndLargest(String test){
    int high, low;
    int[] numbers = new int[test.length()];
    for (int i = 0; i < test.length(); i++) {
        numbers[i] = Character.getNumericValue(test.charAt(i));
        System.out.println(numbers[i]);
    }
    high = numbers[0];
    low = numbers[0];
    int counter = 0; // counts how many times high or low is updated
    for (int i = 1; i < numbers.length; i++) {
        if (numbers[i] > high) {
            high = numbers[i];
            counter++;
        } else if (numbers[i] < low) {
            low = numbers[i];
            counter++;
        }
    }
    System.out.println("Low " + low + " High " + high +
            " counter " + counter + " Length of input " + numbers.length);
}
For every element, when you traverse a heap to update values, doesn't that have a higher complexity than n log n?
- roshenw February 02, 2015