
I am currently working on a school project, and one of my tasks is to implement a 16-bit by 16-bit 2's complement integer divider as a digital logic circuit (in other words, one 16-bit input divided by another 16-bit input). The outputs are straightforward: the quotient Q and the remainder R. Special cases like dividing by zero are taken care of with preset conditions.

My primary issue is that the only approaches I can come up with are long division or a very long chain of repeated subtractions. Even then, I'm not sure how to implement long division without the circuit becoming a mess. I'm open to suggestions in case there is no other way.
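For reference, the long-division approach maps onto hardware as a restoring shift-and-subtract divider. Below is a minimal Python model of it (my own sketch, not from the question; the function name is illustrative). Each loop iteration corresponds to one clock cycle of a sequential divider, or one comparator/subtractor stage of an unrolled combinational array:

```python
def divide_unsigned16(dividend, divisor):
    """Restoring shift-and-subtract division of 16-bit unsigned values.

    Models the datapath of a sequential divider: a partial-remainder
    register that shifts in one dividend bit per step, a comparator,
    and a 16-bit subtractor. Produces one quotient bit per iteration.
    """
    if divisor == 0:
        # In hardware this would be the preset divide-by-zero condition.
        raise ZeroDivisionError("divide by zero")
    remainder = 0
    quotient = 0
    for i in range(15, -1, -1):                      # MSB first
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        if remainder >= divisor:                     # compare
            remainder -= divisor                     # subtract
            quotient |= 1 << i                       # set quotient bit
    return quotient, remainder
```

The circuit is not as messy as it might look: it is one 16-bit subtractor, one comparator (or just the subtractor's borrow-out), two shift registers, and a 16-cycle counter.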

Because of this, I have looked into other division algorithms such as Newton-Raphson division, but I don't see how those algorithms could be implemented as a logic circuit (and I don't fully understand them). So I was wondering whether there are any speed-friendly division algorithms suitable for this.
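Newton-Raphson division is in fact implementable in logic: it computes the reciprocal of the divisor with a few multiply passes rather than one subtract per bit. A minimal fixed-point sketch is below, under the usual assumptions: the divisor has been normalized into [0.5, 1) (16 fractional bits here), and the seed is the standard linear approximation 48/17 − 32/17·d. The function name and iteration count are my own choices:

```python
def reciprocal_newton(d, frac_bits=16, iters=5):
    """Fixed-point Newton-Raphson reciprocal.

    d is an integer encoding a value in [0.5, 1) with `frac_bits`
    fractional bits. Iterates x_{n+1} = x_n * (2 - d * x_n); the error
    roughly squares each step, so a handful of multiplier passes
    replaces the bit-per-cycle loop of long division.
    """
    one = 1 << frac_bits
    # Standard linear seed: x0 = 48/17 - 32/17 * d
    x = (48 * one) // 17 - (32 * d) // 17
    for _ in range(iters):
        x = (x * (2 * one - ((d * x) >> frac_bits))) >> frac_bits
    return x
```

In hardware this is a multiplier, a subtractor, and a small seed lookup, iterated a fixed number of times; the catch is that you get a quotient but have to reconstruct the exact remainder (and correct for rounding) afterwards, which is why shift-and-subtract is usually the saner choice for a school project.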

  • Intel is famous for its implementation of a fast division algorithm in the Pentium family of microprocessors. IIRC, the algorithm allows division to proceed two bits at a time, instead of the normal one bit at a time of simple long division. So if this is a school project, I would stick with the simple compare/subtract followed by a one-bit shift of the divisor. Commented May 31, 2020 at 8:08
  • Forget about 2's complement: do an unsigned division instead and handle the sign later (it's just negation + inc/dec). What you do is build the 16-bit division from 8- or 4-bit divisions and use a LUT for those ... see division by half-bitwidth arithmetics. Commented May 31, 2020 at 9:33
  • Also, Newton-Raphson is doable in circuitry ... Commented May 6, 2021 at 10:41
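The sign fix-up suggested in the comments can be sketched as follows (a Python model of my own; in hardware, each `abs`/negation is a conditional two's-complement negate, i.e. invert + increment, and the sign decisions are two XOR gates on the operand sign bits). This follows the truncate-toward-zero convention most CPUs use, where the remainder takes the sign of the dividend:

```python
def divide_signed16(a, b):
    """Two's-complement division built on an unsigned core.

    Quotient is negative iff the operand signs differ (XOR of sign
    bits); remainder takes the sign of the dividend. divmod on the
    magnitudes stands in for the unsigned divider circuit.
    """
    q_negative = (a < 0) != (b < 0)   # XOR of sign bits
    r_negative = a < 0                # remainder follows dividend
    q, r = divmod(abs(a), abs(b))     # unsigned 15-bit magnitude divide
    return (-q if q_negative else q, -r if r_negative else r)
```

This keeps the divider datapath itself purely unsigned, with only negation logic on the inputs and outputs.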
