LabVIEW


How do I count repeated elements and replace them with the result of the count?

Hello,

 

I'm a university student and I have a problem which I can't solve myself.

My English is not that good but I will try to be clear.

 

Given a sequence of ones and zeros: 00100101000000010000100001000001. Create a program that counts the repeating characters and replaces them with the result of the count together with the character. For example:

the sequence 00 is replaced by 20;

instead of the sequence 1000, 130 is entered;

20310 is replaced by 00310.

After performing the specified actions, the sequence given at the beginning of the problem will be encoded as 2012010170.

 

Is there somebody who can help me?

 

Regards,

 

Eigintas

Message 1 of 9

What have you tried so far?

 

It seems that the problem is irreversible, i.e. you cannot restore the input from the output. Can there never be two 1s in a row? What are the possible values? How do you distinguish counts from values later? Do you just want to count stretches of zeroes? What is the maximum length of duplicates?

 

Message 2 of 9

This feels like a homework problem that you want us to do for you.

 

If there's a specific thing you want to do that you can't figure out how to do, that's something this board is for.  What this board is not for is posting your homework copy-pasted and expecting us to do it for you.

 

Please come back after you have a specific question, or at least something that shows us you put in an effort.

 

Start with looking for a way to input "00100101000000010000100001000001" into your program as an array and go from there.
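If it helps to see that first step in a text language, here is a purely illustrative Python sketch (LabVIEW code is graphical, so this is only an analogy; in LabVIEW the equivalent function is String to Byte Array):

```python
# Illustrative sketch: turn the input string into an array you can
# iterate over (roughly what String to Byte Array does in LabVIEW).
s = "00100101000000010000100001000001"

chars = list(s)           # array of characters: ['0', '0', '1', ...]
codes = list(s.encode())  # array of byte values: [48, 48, 49, ...]

print(chars[:4])  # ['0', '0', '1', '0']
print(codes[:4])  # [48, 48, 49, 48]
```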

Message 3 of 9

Could you at least tell me how to change a sequence number, let's say "01", into the recorded "21"?

Message 4 of 9

@Eigintas wrote:

Could you at least tell me how to change a sequence number, let's say "01", into the recorded "21"?


You did not tell us the rules that would convert an input string of "01" to "21". Once you define the rules completely, implementation will be easy. Also define terms such as "sequence number" and point out which part corresponds to it.

Message 5 of 9

Here's a simple way to count repeated characters. Maybe you can modify it for your purpose.

(assumes that the input only has printable characters)

 

altenbach_0-1589746039106.png
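(The snippet above is a LabVIEW VI image. For readers who cannot see it, one reading of "count repeated characters" is a histogram of how often each character occurs; a rough Python sketch of that idea, with an illustrative helper name, might be:)

```python
# Rough text-language sketch: count occurrences of each character
# (assumes printable input, as noted above; function name is illustrative).
def char_counts(s: str) -> dict:
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    return counts

print(char_counts("00100101000000010000100001000001"))
# {'0': 25, '1': 7}
```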

 

Message 6 of 9

Ok. I have this binary number 00100101000000010000100001000001, so I want to know how to convert it into decimal (20120101701401401501 is the answer). I know the logic of the conversion should be like this:

 

bintodec.PNG

Message 7 of 9

@Eigintas wrote:

Ok. I have this binary number 00100101000000010000100001000001, so I want to know how to convert it into decimal (20120101701401401501 is the answer). I know the logic of the conversion should be like this:

 

bintodec.PNG


 

So modify my code to do just that. Shouldn't be too hard. Right?

 

Other possibilities would be to treat the "1" as delimiter and measure the string length of the items.
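In a text language that delimiter idea could look like the following Python sketch. It assumes the rule implied by the worked answer (a run of two or more equal characters becomes count-then-character, a lone character stays as it is), and it only works if there are never two 1s in a row:

```python
# Python sketch of the "1 as delimiter" approach (assumes 1s never repeat).
# A run of n >= 2 zeros becomes "n0"; a single zero stays "0".
def encode(s: str) -> str:
    def run(zeros: str) -> str:
        n = len(zeros)
        return "" if n == 0 else ("0" if n == 1 else f"{n}0")
    # split on "1": the items are the stretches of zeros between the ones
    return "1".join(run(part) for part in s.split("1"))

print(encode("1000"))                              # 130
print(encode("00100101000000010000100001000001"))  # 20120101701401401501
```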

 

Earlier you said that "01" should result in "21"; now that combination is not even in the list.

 

Are these really the only possibilities? What should happen if there are two or more "1" in a row? What if there are 10 zeroes in a row? Does the input always start with a zero and end with a one?

Message 8 of 9

This sounds like the Look-And-Say Sequence, except that the description is a little bit different.

This is 1st semester CompSci stuff. You are already thinking in terms of rules that transform the input to the output and have example input-output pairs. Just implement the rules in code.
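For example, assuming the rule implied by your worked answer (runs of two or more identical characters become count plus character, single characters are kept as-is), the whole transformation is a few lines in a text language like Python:

```python
# Run-length style encoding per the rule inferred from the example answer:
# a run of n >= 2 equal characters becomes str(n) + char, a lone char stays.
from itertools import groupby

def encode_runs(s: str) -> str:
    out = []
    for ch, grp in groupby(s):  # groupby yields each run of equal characters
        n = len(list(grp))
        out.append(ch if n == 1 else f"{n}{ch}")
    return "".join(out)

print(encode_runs("00"))    # 20
print(encode_runs("1000"))  # 130
print(encode_runs("00100101000000010000100001000001"))
# 20120101701401401501
```

Unlike treating "1" as a delimiter, this also does something sensible with repeated 1s (e.g. "11" would become "21"), which is one of the open questions in this thread.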

There also seems to be an inconsistency in the rules. Are you sure you wrote them down correctly?

 

If you have trouble with where to start, investigate the far left portion of the code @altenbach posted. The first function on the input is called "String to Byte Array". How does the output of that function change based on the input?

 

As a note on terminology: you are certainly not "converting to decimal". I suspect that using this terminology might be preventing you from solving the problem. This is a problem about analyzing a sequence of symbols - it just happens that the symbols can also be interpreted as numbers.

Message 9 of 9